NoMereRanger
Member
I've had this discussion with a female friend of mine before. She contends that most of history demonstrates the existence of an all-powerful patriarchy: men have the final say in everything, men physically impose themselves on women, men have all the fun.

I contend that evolution has equipped both genders with specific attributes that lend themselves to our survival. From there we developed the best ways to do things and thus became the dominant species on this planet. I don't deny that sexism has always existed and will most likely exist for as long as we do. I only posit that, in general, men tend to gravitate toward certain roles and women toward others because we realized it was best for the continuation of our species. For example, men tend to be more assertive and bigger assholes, traits that make them better leaders. Women tend to be more empathetic and understanding, making them better caretakers.

All this is a long-winded way of saying "there are gender roles." Right or wrong?