Obviously one could, hypothetically, believe that "feminism", whatever one intends that word to mean, is harmful or orthogonal to the interests of women collectively.
(I say "whatever one intends that word to mean" because I believe that the English speech community, collectively, uses the word "feminism" to mean a variety of things, and as a good descriptivist, I believe that a word means whatever people use it to mean (and understand by it), and as such believe that it does not have a fixed meaning at this time.)
I don't actually understand how you can ask that question without trolling, though I assume in good faith that you were not trolling.
I think it's a valid question if you are not familiar with extremist feminism and just go by the dictionary definition of "feminism" as "equal rights for men and women".
I wasn't trolling at all. I was genuinely curious. I knew there were many possible answers, but I was curious as to what the person I was replying to meant by his statement. To me, feminism means that men and women should be treated with equal respect and have equal rights. I wanted to know if the poster had a different definition of feminism, or if they agreed on feminism but thought that it wasn't good for women.
Then I'll answer since apparently it's personal :)
I do believe that men and women should be treated with equal respect (when they deserve it) and have equal rights (which is already the case in Western countries). I don't agree that this should be called "feminism", but English is not my native language, so maybe I'm more sensitive to how "feminism" sounds and how obviously it is geared towards women, as opposed to "equalism", for example.
However, feminism (since that's what we'll call it) nowadays has nothing to do with that. Feminism is now, at least for me, associated with laws like "yes means yes", kangaroo courts, "mansplaining", "manspreading", etc. And I do believe that this wave hurts women. First, it spreads the myth that women can "have it all" (and just to be clear, it is a myth for men as well; nobody can have it all, life is made of choices), which in the end makes everyone unhappy. Second, it infantilizes women by constantly presenting them as victims who have no control and no responsibilities.