Women’s rights are the social, political, and economic rights that promote equality and guarantee women the same opportunities and protections as men. The movement gained momentum during an era of reform aimed at addressing social injustices, as women actively sought recognition of their rights to vote, work, and participate in public life.