Gender roles refer to the societal expectations and norms that dictate how individuals should behave, think, and feel based on their perceived gender. These roles can influence various aspects of life, including career choices, family dynamics, and social behavior, often leading to differences in opportunities and treatment between genders. Understanding gender roles is crucial in examining how they contribute to gender inequality, which affects both personal identities and broader societal structures.