11-30-2017, 06:18 PM
(11-30-2017, 09:50 AM)GMDino Wrote: So is there a culture of rape in America?
Or does all of this sexual assault and harassment coming out years later mean we don't cover these things up for powerful people?
No, there is not a culture of rape in America. But apparently there is one in Hollywood and American politics.