Topic is in the title, people. How has the United States' SOCIAL system (not legal) affected our society's views on sexism and gender roles? If you don't live in the US, then how has your country's social system (again, not legal) affected your society's views on sexism and gender roles in general (please specify your country)?

Tags: female, gender, male, roles, sexism


Replies to This Discussion

Well, I'm from Germany.
Ever since the '60s and '70s, it has basically been a sin here not to think of men and women as equals.
Even though in the '70s it went the other way (treating men in a more woman-like manner), we are pretty far along in the process.
Nowadays it goes so far that it is pretty much a cardinal sin to say anything opposing the idea that men are oppressing women everywhere and that women need to be freed, which is just so wrong.
Even studies on the subject (e.g. about unfair wages) are represented misleadingly by the media, suggesting women are mistreated even though they no longer are. They even have huge (social AND legal) advantages over men in most fields, which kinda grinds my gears. (A social example is below; a (randomly chosen) legal example is that it's not a felony for women to go out on the street naked, but it is for men (seriously!))

For example, my professors are either so afraid of treating them wrongly, or so attracted to their mini-skirts, that they basically hand out good grades to them for free, whereas I and my male colleagues have to work our asses off. Seriously, I know that sounds exaggerated, but it really is extreme here.



© 2015   Created by Hank Green.