I've always thought about this, and it's always pissed me off, but I've never actually posted it. My question is: why do so many women feel it's their "God-given" right to talk dirty to men, yet when men talk dirty to those same women, it's automatically shunned and ridiculed, and they're labeled as pigs, sleazy, perverted, etc.? How did this double standard originate?