U.S. interventionism refers to the foreign policy approach in which the United States actively engages in the affairs of other countries, often through military, economic, or political means, to influence outcomes in favor of its interests. The approach became especially prominent in the late 19th and early 20th centuries; the Spanish-American War of 1898 marked a significant turning point, bringing U.S. territorial expansion and establishing American influence in Latin America, the Caribbean, and the Pacific.