
I don't think LLMs are evil either, but I think the real risks are extremely underplayed. This is a mostly innocuous example, but there are a lot of people trying to get LLMs into places where they just aren't ready yet.

The difference with a template is that its behavior is generally deterministic. Even if someone fucks it up, that means it's (usually) trivial to fix.
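A minimal sketch of the determinism point (the template name and values here are made up for illustration): the same input always yields the same rendered output, so a bug is reproducible and easy to pin down, whereas an LLM sampling with temperature > 0 offers no such guarantee.

```python
# Hypothetical template to illustrate determinism: every render of the
# same inputs is byte-for-byte identical.
from string import Template

greeting = Template("Hello, $name! You have $count new messages.")

# 100 renders collapse to a single distinct output.
outputs = {greeting.substitute(name="Ada", count=3) for _ in range(100)}
assert len(outputs) == 1
print(outputs.pop())
```

An LLM call in the same spot could return a different string on every invocation, which is exactly why a broken template is a quick fix and a misbehaving LLM integration often isn't.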



