Sunday, March 6, 2011

US Imperialism

Has the US been Imperialistic?
-Yes, ever since the Revolutionary War, the US has been imperialistic, acquiring new land and expanding its control over other areas. However, in several cases, US imperialism has been a force for good.


Pro (evidence that the US has been imperialistic):
1. The Indian Removal Act
2. After the Spanish-American War, the US readily took control of Cuba, Puerto Rico, and the Philippines.
3. The takeover of Hawaii.

Con (evidence that the US has not been imperialistic):
1. The US gave Cuba and the Philippines their independence when they wanted it.
2. America's very foundation is premised upon breaking the yoke of the imperialist British Empire.
3. America did everything in its power to stay uninvolved in the affairs of Europe until it was thrust into war against Nazi Germany, which any sensible person would regard as a true imperialist dictatorship.
