
You can discuss something kosher and have the LLM misinterpret it as something sexually explicit. Your logs, or theirs, will now contain all of that miscommunication, and that is a liability. Using models that can't generate this content even by accident is a good legal decision for many. Same goes for images. Stay safe!


> you'll have to do that locally

Gemma is a family of local models!
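
For anyone curious what running it locally looks like in practice, here's a minimal sketch using the Hugging Face transformers library. The checkpoint name and generation parameters are illustrative assumptions, and the Gemma weights are gated, so you'd need to have accepted the license on the Hub first:

    # Minimal sketch: run a small instruction-tuned Gemma checkpoint locally.
    # Assumes `pip install transformers torch` and accepted Gemma license on
    # the Hugging Face Hub (weights download on first run, then stay cached).
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="google/gemma-2-2b-it",  # illustrative small Gemma checkpoint
        device_map="auto",             # use GPU if available, else CPU
    )

    out = pipe("Explain what a local model is in one sentence.",
               max_new_tokens=60)
    print(out[0]["generated_text"])

After the first download, nothing leaves your machine: prompts, outputs, and logs all stay local, which is the point being made above.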



