Feminism. It's basically the idea that you shouldn't be treated a certain way or have to do something just because of your gender (except for things like mammograms, prostate exams and so on). This, if anything, we should be able to agree on, right? A woman shouldn't have to give up wearing pants or getting decent pay for decent work just 'cause she's a girl, and if a boy likes playing with dolls he shouldn't get picked on by all the boys who prefer toy cars, right?
And still we have women saying "I don't consider myself a feminist," which in my mind is like black people saying "I wouldn't say I'm anti-racist" or Jews saying "I can't say I'm against antisemitism." Sometimes their motivation is "I don't see people as men or women, I just see them as people." Sounds nice, but it won't stop injustice. It's not like getting lower pay for the same work is acceptable just because you "see them as people." Other times they disagree with the extremists and think the man-hating fringe is what feminism is, which is like thinking "All Christians want to shoot gay people, right? That's why, even though I believe Christ died for our sins, I wouldn't say I'm a Christian." (And the vast majority of us don't want to shoot gay people, me included. My view of gayness is that it's really cool, and if God disagrees he'll have to accept that I hold a different opinion and consider him wrong.)