I've been using ChatGPT a lot, and I'm no longer feeling much anxiety over the future of software engineering. GPT is a great tool, but still just a tool. It helps remind me how to do things I rarely do. To get code that exactly fits my requirements, I'd have to get so specific that I might as well just write the silly code myself. But for shell scripts, SQL, stuff like that, it's pretty decent.
Yesterday I asked it to give me a postgres SQL query to do something I don't often do, but I had a pretty good idea of what it would be. It confidently lied. I replied with "that is plausible but incorrect, I think you need to use X function instead" and it actually said "You're right, I'm sorry, that is the correct way to do X, here is an example". I laughed pretty hard at the casual apology followed instantly by the actually correct result. I'm in no danger of anthropomorphizing an LLM, but still.