Thank you for reading my post.
I've been thinking about feminism, and more specifically about the ways in which a patriarchal society has influenced women to police one another. Women often try to keep other women "in check" — how to dress, how to impress, how to pacify, and so on — by teaching or embracing conformity to a heteronormative culture developed by men, for men's pleasure. Oddly, in this patriarchy men seem to be the CEOs and women the underpaid managers.
Does anyone know if there is a term for this construct?
Here are a few examples:
1.) Today I learned about a gentleman's club in England that denies membership to women, yet the majority of its behind-the-scenes staff are women.
2.) When women and girls encourage one another to downplay their intelligence in order to make themselves more attractive to men/boys.
3.) When women and girls encourage one another to go to great lengths to change their physical appearance to satisfy men (spending what income they have on plastic surgery, non-surgical cosmetic treatments, anti-aging products, makeup, etc.), but offer less support and encouragement when it comes to standing up to men and demanding equal rights.
4.) Situations in which women convince other women to participate in pyramid schemes, selling beauty products to their friends and recruiting them into the company. Sometimes these businesses lead both the sellers and the women they recruit further from prosperity and toward financial loss.
My point is that women are consciously and/or unconsciously working for patriarchy by teaching/coaching/reinforcing/policing/shaming/encouraging one another to be a man's ideal woman rather than a woman's ideal woman.
Is there a term for this?
Hi there sparklehoof, would you mind introducing yourself first? If you're at a loss as to what to post, there is guidance in the Sticky at the top of this forum, Guide for New and Returning Members.