Tuesday, December 20, 2011

Why did the western powers want to take over territory in the late 1800's?

In 1875 and afterward, European powers began pushing into Africa and other regions. Why did they suddenly want to do this? Also, I don't understand from my textbook what the United States did during the period of Imperialism, and what territory it gained. I know that the Europeans basically took control of India, Africa, and other areas.
