Somewhat the opposite: if LLMs keep performing like this, making up information and so on, their credibility will erode over time, and the de facto expectation will become that they don't work or aren't accurate, which would lead to people relying on them less.
Much like self-driving cars, which still haven't had a mainstream breakthrough.