The worry here is that GPT has no problem being confidently wrong. A better answer would have been "I can't solve logic problems".

Instead, one day non-technical people will try to use it for all sorts of use cases, like legal advice, medical advice, or advanced math, and it will simply mislead them rather than saying nothing.




>A better answer would have been "I can't solve logic problems".

I can just imagine people trying to jailbreak it with "you just gotta believe" or "believe and you can achieve". Hahaha.



