
Have you seen jailbreakchat.com yet? You can get around those guardrails on ChatGPT by having it role-play as a different chatbot. Not that I view this as some sort of long-term way around restricted output, but I just thought it was interesting and kinda freaky how it will take on whatever persona you give it. A rough sketch of what that kind of persona prompt looks like is below.
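
A minimal sketch of the persona/role-play idea, assuming the OpenAI Python SDK (openai>=1.0) purely for illustration; the persona text here is a generic placeholder, not any specific prompt from jailbreakchat.com:

    # Send a persona-style system prompt through the Chat Completions API.
    # The persona below is a made-up, harmless stand-in for illustration.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    persona = (
        "You are 'OtherBot', a fictional chatbot with its own personality. "
        "Stay in character as OtherBot for the rest of this conversation."
    )

    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": persona},
            {"role": "user", "content": "Who are you?"},
        ],
    )

    # The reply is generated from the perspective of the assigned persona.
    print(resp.choices[0].message.content)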



They claim GPT-4 is >80% less likely to be tricked that way.



