American imperialism is a term referring to the economic, military, and cultural influence of the United States on other countries. The concept of an American Empire was first popularized during the presidency of James K. Polk, who led the United States into the Mexican–American War of 1846–1848 and the eventual annexation of territories such as California; the Gadsden Purchase followed in 1853.
See also:
- 51st state
- American Century
- Criticism of American foreign policy
- Inverted totalitarianism
- Loss of China
- New Imperialism