Radical feminism is a branch of feminism that emphasizes the need for fundamental societal change to eliminate patriarchy and achieve gender equality. This approach seeks to address the root causes of women's oppression by advocating for the dismantling of existing social structures and norms, often viewing patriarchy as an overarching system that influences all aspects of life. Radical feminists highlight issues such as sexual violence, reproductive rights, and the importance of women's autonomy as central to their activism.