Women should hold a place of honour in a healthy society. In the most basic, natural sense, women are essential to passing on human life. There must be a very good reason for the old saying "women and children first".
But it's hard to separate a basic understanding of what we used to call "truth" from what comes across to me as an unspeakable horror. These days, when I think of feminists, I think of their insistence that abortion is some sort of sacred sacrament. I'm not arguing here about whether it should be legal or illegal. I'm just pointing out that this violently abhorrent practice shouldn't be the first thing feminists want associated with them. But there it is. So, when you complain about men staring at your tits, or whatever gripes you have about the "patriarchy", think about what role you stand for in society. It's not one of nurturing life; it's quite the opposite. Why should anyone respect you, if that's your core belief?