I mean on larger scales. Compare some religions to others: you don't see women oppressing men.
Feminism and similar concepts were largely nonexistent throughout history. What it evolved into has led to an unhappy society with bitter men and women, and declining populations in cultures that adopted its ideals. I'm not convinced it grew entirely organically after the first wave either.