The United States' role in this context refers to the country's involvement and influence in international affairs during the late 19th and early 20th centuries, as it emerged as a world power. This period marked a shift in U.S. foreign policy from isolationism toward a more assertive presence on the global stage, characterized by military interventions, economic expansion, and diplomatic initiatives. The U.S. projected its influence through territorial acquisitions and participation in international conflicts, as in the Spanish-American War of 1898, which brought territories such as Puerto Rico, Guam, and the Philippines under American control.