Feminism in the United States refers to a range of movements and ideologies aimed at defining, establishing, and defending a state of equal political, economic, cultural, and social rights for women in the United States.
Quotes
- Feminism in the United States has never emerged from the women who are most victimized by sexist oppression; women who are daily beaten down, mentally, physically, and spiritually; women who are powerless to change their condition in life. They are a silent majority. A mark of their victimization is that they accept their lot in life without visible question, without organized protest, without collective anger or rage.
- bell hooks, Feminist Theory: From Margin to Center (1984), Chapter 1: Black Women: Shaping Feminist Theory, p. 1.
- Women of color are mirrors in which white women are supposed to see themselves but, instead, see themselves as no other mirror can show them—as selves that are plural and who instead of being righteous, moral beings, are also participants and perpetuators of a racist system.
- Mariana Ortega, "Being Lovingly, Knowingly Ignorant: White Feminism and Women of Color", Hypatia, vol. 21, no. 3 (Summer 2006).
This article is issued from Wikiquote. The text is licensed under Creative Commons Attribution-ShareAlike. Additional terms may apply for the media files.