
All LLMs have this risk, but somehow nobody seems to care, or they think they can order the LLM to stop with a better prompt.


If it were as simple as telling the LLM not to hallucinate, every system prompt would just say "don't hallucinate" and we wouldn't have hallucinations.
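
Concretely, the naive fix looks something like this (a minimal sketch, assuming the OpenAI Python SDK and an arbitrary model name purely for illustration; the point is that the instruction is only a suggestion to the model, not a constraint it can actually enforce):

    # Sketch of "just tell it not to hallucinate" via the system prompt.
    # Assumes the OpenAI Python SDK (openai>=1.0) and a placeholder model name.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice, not prescriptive
        messages=[
            # The "fix": an instruction the model is free to ignore.
            {"role": "system", "content": "Do not hallucinate. Only state facts you are certain of."},
            {"role": "user", "content": "Your question here."},
        ],
    )
    print(response.choices[0].message.content)

The system message nudges the sampling distribution; it doesn't give the model any new way to know what it doesn't know, which is why this alone doesn't eliminate hallucinations.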



