AP Spanish Literature
American imperialism refers to the policy and practice of extending the United States' influence and control over other countries and territories, particularly in the late 19th and early 20th centuries. This era marked a significant shift in U.S. foreign policy, as the nation sought to expand its economic and military power abroad, often justifying these actions through notions of superiority and a desire to civilize other nations.