
People don’t lie (“hallucinate”) in the way that LLMs do. If you’re having a friendly chat with a normal person, they’re not going to start making up names and references for where they learned some fact they just made up.

Edit: Please stop playing devil’s advocate and pay attention to the words “in the way that LLMs do”. I really thought it would not be necessary to clarify that I know humans lie! LLMs lie in a different way. (When was the last time a person gave you a made-up URL as a source?) Also, I am replying to a conversation about a PhD talking about their preferred subject matter, not a regular person. An expert human in their preferred field is much more reliable than the LLMs we have today.




It's not about humans lying. It's about our memory getting corrupted over time: the stuff we think we're sure of is often actually wrong or a misrepresentation of the facts. Our recollection of things is a mix of real memories and hallucinations. Witnesses provide wildly different accounts of the same event all the time.

This applies to PhDs as well, and I don't agree that an expert human is automatically more reliable.


Are you sure about that? I can't count the number of times I've heard people spout marketing copy to me, word for word, while thinking it's 100% true.


Are we talking about a conversation with a PhD in their preferred subject matter or not? That’s the line of argument I was responding to. I feel like as soon as we talk about LLMs, the devil’s advocates come out of the woodwork.


While your basic point here is solid, the difference is that I am fairly sure you could count the number of times, if it actually mattered to you.


Some people do, but we don't consider them to be good members of society.


Yes this is why I specified “having a friendly chat with a normal person.”


People even misremember basic things like who they voted for in the past. Unfortunately I cannot find the study now.


See, that's where ChatGPT would have confidently made up a URL to a made-up story instead of recognizing its limitations.


They definitely do. I do it all the time: I start explaining something, only to realize I'm actually not sure anymore, but by then it's often too late and the best I can do is add a disclaimer. Most people don't even do that.


Humans hallucinate all the time: they consume propaganda or conspiracy theories, then tell you lies while thinking that they are right and everybody else is wrong.



