AP US History
The Post-World War I era refers to the period following the end of World War I in 1918, characterized by significant social, political, and economic change across the globe. The era saw the emergence of new international tensions, shifts in power dynamics, and the redefinition of national borders. These developments shaped both foreign relations and domestic policy, fostering a climate of fear and suspicion that in the United States manifested in domestic events such as the First Red Scare.