Not in my experience. Then again, I am a female raised before "feminism" existed. I saw subtle hints of it even as a teenager, because all the girls I knew expected nothing more than to find a husband and pop out babies. It never occurred to them that they could be anything else. Every mom in my neighborhood was a housewife; every dad went to work. In my own family, I wanted to go to college, but was told by my father that he wasn't wasting money on his daughters, who would just waste their education by getting married. My brother, however, was a different matter.
Years later, after doing what was expected of me, I ended up divorced, raising two children with nothing but a high school diploma. I got crap jobs doing secretarial work and answering phones, dodging grab-handed engineers and spending so much on child care I couldn't pay the mortgage.
Fast forward: I was bright and a quick learner, and I worked my way up to a mid-management position as finance director (honestly, thanks to recently enacted Affirmative Action quotas pushed through by the newly formed "feminist" movement), only to discover that women were dismissed, denigrated, and ridiculed not only by the men who reported to me, but by those on "equal" mid-management footing and by my superiors as well.
So if you mean that "pitting women against men" in pursuit of equal opportunity, equal wages, and equal respect was a bad thing, a "trick", then I vehemently disagree. I've been there. Things are much better now, but there is still work to do. Your post actually proves it.