Radical feminism is a feminist perspective that identifies patriarchy as the root cause of women's oppression and argues that gender equality requires fundamental societal change. Rather than pursuing reform within existing institutions, this branch of feminism seeks to dismantle the patriarchal structures and ideologies embedded in social, political, and economic systems. Radical feminists often focus on issues such as sexual violence, reproductive rights, and the exploitation of women in various spheres of life.