Exactly. While there is some merit to the article's argument, I don't think it holds any water in this case: GPT-4 will lie to you [0], and you need a human to figure out when that's the case.

[0] https://simonwillison.net/2023/Apr/7/chatgpt-lies/

> you need the human to figure out when that's the case

Do you? I mean, you and I think you do in order to do a good job, but does that hold water? The last decade or so of technological "progress" in the commercial space should leave few illusions about companies' willingness to obstruct processes and degrade interactions between people to save a few bucks.

As tech gets more capable, human interactions with it are getting worse. And if you hate phone trees now (I do!), GPT-4's contribution is ever-deeper, infinitely patient tools for stalling you when you have to call about something.

Hell, we already have plenty of stans in the HN comment sections actively claiming that no, of course you shouldn't hold LLM purveyors responsible for their hallucinations, the desired universe is liability-free! If you've already bought into societal enshittification for a buck and there are no penalties, why would you worry about being right?

> you need the human to figure out when that's the case.

Yes, but you will need fewer people. And that is with the current level of GPT. If GPT's proficiency increases, fewer and fewer people will be needed. That's a problem.

Try reframing the same argument with computers to see that it doesn't add up. When computers were becoming widespread, you could easily have made the same argument. It isn't a problem; it's a boon that makes our lives easier.
