Changing gender roles refers to the evolving expectations and behaviors associated with masculinity and femininity in society. During major historical events such as World War I, these roles shifted markedly: women took on responsibilities traditionally held by men, working in factories, serving as nurses, and otherwise sustaining the war effort, which in turn prompted a re-evaluation of established gender norms.