This is completely true but completely in conflict with how many very large companies advertise it. I’m a paid GitHub Copilot user and recently started using their chat tool. It lies constantly and convincingly, so often that I’m starting to wonder if it wastes more time than it saves. It’s simply not capable of reliably doing its job. This is on a “Tesla autopilot” level of misrepresenting a product but on a larger scale. I hope it continues being little more than a benign or embarrassing time-waster.


One of the only pieces of text on ChatGPT's own website, shown every time you chat with it, is: "ChatGPT may produce inaccurate information about people, places, or facts."


That's a lot of words for just saying ChatGPT's answers can't be trusted.


Where does Github misrepresent their Chat beta? On their marketing website?


Right from https://github.com/features/preview/copilot-x:

“Context aware conversations with your copilot. If you're stuck solving a problem, ask GitHub Copilot to explain a piece of code. Bump into an error? Have GitHub Copilot fix it. It’ll even generate unit tests so you can get back to building what’s next.”

This is almost a Homer Simpson running-for-garbage-commissioner level of over-promising. I think Copilot is an incredible tool; what's possible right now is amazing, and it can save time and offer value. But the degree to which it doesn't just fail but actively misdirects is at serious odds with the breathless marketing.

