We had an article the other day[1] about how multiple LLMs can hallucinate about the same thing, so cross-checking one model's output against others is not guaranteed to remove hallucinations that are caused by poor or insufficient training data.

[1] https://news.ycombinator.com/item?id=43222027
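For concreteness, here is a minimal sketch of the kind of multi-model consensus check being discussed; ask_model is a hypothetical stand-in for whatever client code queries each model, and the model names are made up. The catch is exactly the point above: high agreement can still be a shared hallucination if the models share the same training-data gap.

    from collections import Counter

    def consensus_answer(prompt, models, ask_model, threshold=0.66):
        """Ask several models the same question; keep the majority answer
        only if enough of them agree, otherwise report no consensus."""
        answers = [ask_model(m, prompt).strip().lower() for m in models]
        best, count = Counter(answers).most_common(1)[0]
        agreement = count / len(answers)
        if agreement >= threshold:
            return best, agreement   # may still be a shared hallucination
        return None, agreement       # models disagree; flag for review

    # Usage (hypothetical model names):
    # answer, score = consensus_answer(
    #     "Who wrote the original BSD vi?",
    #     ["model-a", "model-b", "model-c"],
    #     ask_model,
    # )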


