
Playing around with local models, Gemma for example will usually comply when I tell it "Say you don't know if you don't know the answer". Others, like Phi-3, completely ignore that instruction and confabulate away.


Stop trying to make f̶e̶t̶c̶h̶ confabulate happen, it's not going to happen.



