U.S. expansionism refers to the policy of territorial and economic expansion pursued by the United States, particularly from the 19th century onward. It was fueled by a belief in Manifest Destiny, the conviction that Americans were destined to expand across the continent and beyond, and it shaped the country's political, social, and economic relationships with neighboring countries, especially in Latin America.