Gender roles refer to the social and behavioral norms that a society considers appropriate for individuals based on their perceived gender. These roles dictate how individuals are expected to think, behave, and present themselves, often reinforcing traditional views of masculinity and femininity, and they significantly shape identity formation, social interactions, and power dynamics.