
How do you deal with it straight up lying? My problem with this whole system is, if I'm asking those questions it's because I don't understand the field well enough to answer them myself, which means I can't tell whether ChatGPT is lying…



Fair, but not completely true. The Thailand example gives detailed reasoning. You can use those building blocks to check. If it says Thailand is a cold country and uses that in its argument, it's shaky. You don't have to be an expert climatologist to make this judgement.

It's not just one clean answer and we're done. In my experience it is helpful in breaking the problem down into stuff you can Google.


> In my experience it is helpful in breaking the problem down into stuff you can Google.

Yeah, I can see that being useful. I've also seen a lot of non-technical people straight up accept whatever comes out of it, so that's a little worrying. It's true of Google searches too, of course, but at least a Google search gives N results someone can check rather than 1.


They straight up accept it until they discover their first mistakes :)


Fact-check with Google.

With the example questions I provided, it would take many hours to do research on the subject. GPT provided initial answers instantly, and then fact checking was easy.

That's what we did with GPT-3. With plugins you can have GPT fact-check itself.

Also, if you have a system with dedicated knowledge, you can use embeddings. With embeddings, GPT has very little room for hallucinations, and it can provide detailed references.
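The embeddings approach can be sketched roughly like this: embed the question, retrieve the most similar snippet from your own knowledge base, and prepend it (with its reference) to the prompt so the model answers from provided context rather than from memory. This is a minimal toy sketch — the `embed` function below is a bag-of-words stand-in (real systems use a model-based embedding API), and the documents and reference labels are made up for illustration.

```python
import math
from collections import Counter

# Toy embedding: a bag-of-words vector. A real system would call an
# embedding model; this stand-in only illustrates the retrieval mechanics.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical knowledge base: each snippet carries its own reference,
# which is how the model can cite sources in its answer.
docs = [
    ("Thailand has a tropical climate with high year-round temperatures.",
     "climate-handbook sec. 4"),
    ("Norway has long, cold winters and short, mild summers.",
     "climate-handbook sec. 7"),
]

def retrieve(question: str):
    """Return the (snippet, reference) pair most similar to the question."""
    q = embed(question)
    return max(docs, key=lambda d: cosine(q, embed(d[0])))

snippet, ref = retrieve("Is Thailand a cold country?")

# The retrieved snippet and its reference are prepended to the prompt,
# constraining the model to answer from the supplied context.
prompt = (f"Answer using only this context (source: {ref}):\n"
          f"{snippet}\nQ: Is Thailand a cold country?")
```

Because the model is told to answer only from the retrieved snippet, a wrong answer is much easier to spot: you can check the cited snippet yourself instead of researching the whole topic.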





