
The fun part here is that the "Hackerman" snippet can't actually be used trivially (at least not by a simple copy/paste into the console), whereas if you ask ChatGPT directly, with no jailbreak involved, it gives you a snippet that can be pasted straight into dev tools. So applying the jailbreak tricks it into thinking it needs the jailbreak to do things it could already do, and it ends up doing them objectively worse than before (with an added dose of self-aggrandizement).

Sorta like some people's approach to psychedelics/meditation/etc...


