I don't think LLMs are evil either, but I think the real risks are extremely underplayed. This is a mostly innocuous example, but there are a lot of people trying to shove LLMs into places they just aren't ready for yet.
The difference with a template is that its behavior is generally deterministic. Even if someone fucks it up, it's (usually) trivial to fix.
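To make that contrast concrete, here's a minimal sketch in Python. The template half uses the real stdlib `string.Template`; the LLM half is left as an illustrative comment, since the exact call depends on whatever API you're using:

```python
from string import Template

# A template is a pure function of its inputs: same data in, same text out.
# If the output is wrong, you read the template, spot the bug, fix one line.
greeting = Template("Hello, $name! Your order #$order_id has shipped.")
print(greeting.substitute(name="Ada", order_id=1042))
# -> Hello, Ada! Your order #1042 has shipped.

# An LLM call, by contrast, samples from a distribution. The same prompt
# can produce different text each run, and "fixing" a bad output means
# tweaking prompts or fine-tuning -- there's no single line to patch.
# response = llm.generate("Write a shipping notice for Ada, order 1042")
```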