This is already possible with search engines: there is enough information on the internet that you can substantiate just about any claim, regardless of how much evidence there is to the contrary (see flat-earth, which has plenty of plausible-sounding claims backed by real, albeit cherry-picked, evidence).
Yes, of course, this is already possible with AI writing assistance as well, if you're willing to plug some of the phrases it comes up with into a search engine to figure out where they may have come from. But you still have to do the work of stringing the arguments together into a cohesive structure, and of finding research that may be well outside the domains you're familiar with.
But I'm talking about writing a thesis statement like "eating cat boogers makes you live 10 years longer for Science Reasons" and having the AI string together a completely passable, formally structured argument, along with whatever data is necessary to convince enough people to give your cat-booger startup the revenue to secure its next round, because that seems to be where all these games are headed. The winner is the one who can outrun the truth by hashing together a lighter-weight version of it, and though it won't stand up to a collision with the real thing, you'll be very far from the explosion by the time it happens.
AI criticism is essentially people claiming that broad access to something they don't like will end the world. As you say, we already have a good example of this, and while it is mostly bad and getting worse, it's not world-ending.