Right, but LLMs are also consuming AWS product documentation and Terraform language docs, both of which I've read extensively, and they're often badly wrong about both domains in ways that are easy for me to spot.
This isn’t just “shit in, shit out”. Hallucination is real and still problematic.