
Why is this so problematic? You can read all this stuff in old papers and patents that are available on the web.

And if you are not capable of doing that, you will likely not succeed with the ChatGPT instructions either.





I’m not saying it’s impossible to get this information elsewhere. My point is that it appears impossible to prevent ChatGPT from telling you how to do illegal stuff, something the model explicitly should not be able to do, according to its makers.


