
If you believe there's a world "where OpenAI 'solves' jailbreaks," then you believe there is such a thing as software without bugs.



If jailbreaking becomes as difficult as finding any other security bug, OpenAI will have solved the problem for practical purposes.


Are you considering it a security bug that a generalist AI trained on the open Internet says things that differ from your opinion?


Of course not; how's it supposed to know my opinion? I'm referring to the blocks put in place by the creators of the AI.



