You can fix that by not clicking on things you don't like, because it shows you more of what you click. You can end up with a feed with fascists or communists or both because free speech applies to all. But again, you can choose what you want to see by training the algo or by using the feed of who you are following.
The problem with those assertions is that the algorithm didn't use to show me any of this content. It used to show me content I was reasonably happy with. Then, over a period when I was hardly active, the content shown to me rapidly shifted to include a huge amount of extreme right-wing content and bigotry, even though I only used the "Followers" tab and clicked on content I enjoyed from people who in no way fit the profile of the accounts I complained about earlier.
At best, the quality of their recommendation system has declined dramatically. At worst, they're intentionally favouring different content. In between are slightly more palatable (than the worst case) options, such as weighting the overall popularity of content more heavily than individual preferences. In any case, all I know is that the experience has turned from a relatively pleasant one into having content pushed in my face that disgusts me.
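That middle-ground scenario, where global popularity is weighted against personal preference, can be sketched as a simple blended score. Everything here (the field names, the blend weight, the scale of the scores) is a hypothetical illustration, not a claim about how any real feed is implemented:

```python
def blended_score(item, user_affinity, popularity_weight=0.8):
    """Score an item by mixing global popularity with per-user affinity.

    A high popularity_weight makes everyone's feed converge on the same
    viral content, regardless of what the individual user clicks on.
    """
    # user_affinity: 0..1 estimate of how much this user likes the item
    # item["popularity"]: 0..1 normalised global engagement
    return (popularity_weight * item["popularity"]
            + (1 - popularity_weight) * user_affinity)

# With popularity weighted at 0.8, a viral item the user dislikes
# outranks a niche item the user loves:
viral = {"popularity": 0.9}
niche = {"popularity": 0.1}
print(blended_score(viral, user_affinity=0.1))  # 0.74
print(blended_score(niche, user_affinity=0.9))  # 0.26
```

Turn that one knob up far enough and individual clicks barely matter, which would produce exactly the kind of drift described above without anyone targeting a specific user.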
I've also written recommendation systems. In fact, I wrote one that I used to run on top of Twitter's API. So I know from first-hand experience that it's trivial to produce better-quality recommendations, more closely aligned with what I click on, than what I'm currently getting. Something is seriously wrong, whether accidentally or intentionally.
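To give a sense of how little machinery a click-aligned recommender needs, here is a minimal bag-of-words sketch: build a term profile from posts the user clicked, then rank candidates by cosine similarity to that profile. This is a generic illustration using only the standard library, not the system described above:

```python
from collections import Counter
import math

def profile_from_clicks(clicked_texts):
    """Aggregate word counts from the posts the user actually clicked."""
    profile = Counter()
    for text in clicked_texts:
        profile.update(text.lower().split())
    return profile

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def recommend(candidates, profile, k=2):
    """Rank candidate posts by similarity to the user's click history."""
    scored = [(cosine(profile, Counter(c.lower().split())), c)
              for c in candidates]
    scored.sort(reverse=True)
    return [c for _, c in scored[:k]]

clicks = ["rust compiler borrow checker", "compiler optimisation passes"]
candidates = [
    "new borrow checker improvements in the rust compiler",
    "celebrity gossip of the week",
    "outrage bait thread",
]
profile = profile_from_clicks(clicks)
print(recommend(candidates, profile, k=1))
# The compiler post ranks first; the unrelated posts score 0.
```

A real system would use embeddings, engagement signals, and freshness decay rather than raw word overlap, but even this toy version only surfaces content resembling what the user clicked, which is the point: staying aligned with clicks is the easy part.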