I dislike the notion that masculinity is deemed toxic while femininity is encouraged. In addition, I don't believe there is a rape culture in the West, as rape is discouraged and punished by our society.
The purpose of this thread is to gather the opinions of others to better understand these notions. Why is masculinity called toxic? Why do people say there is a rape culture in the West (in particular on college campuses)? Do rapists truly not know that rape is wrong, and would teaching men not to rape make a difference? It sounds like teaching a thief not to steal, but if I'm missing something, please inform me.