Post-World War I Germany refers to the state of Germany after its defeat in the First World War in 1918, a period marked by profound political, social, and economic turmoil. The Treaty of Versailles of 1919 imposed harsh reparations and territorial losses on Germany, fueling widespread discontent and instability that shaped the interwar period and created a climate ripe for economic crisis and radical political movements.