American imperialism refers to the policy and practice of the United States extending its influence and control over other nations, territories, and regions, particularly during the late 19th and early 20th centuries. This expansion often involved military intervention, economic dominance, and cultural assimilation, leaving lasting effects on the affected countries and shaping international relations.