
Somewhat the opposite: if LLMs keep performing like this, with made-up information and so on, their credibility will erode over time, and the de facto expectation will become that they don't work or aren't accurate, which would lead to people relying on them less.

Similar to how self-driving cars still haven't had a mainstream breakthrough.


