The vast majority of humans - even intelligent humans - do not "hallucinate like crazy."
Given a list of episode descriptions of Gilligan's Island, the vast majority of humans - even intelligent humans - would either be able to discern the correct answer or say they don't know.
I understand why there is this drive to present the normal human mental and psychological baseline as being just as unstable as LLMs: there is too much money behind LLMs not to want to aggressively normalize their faults (just as with the faults in autonomous driving). But any human being who hallucinated or confabulated with the regularity of an LLM would be considered severely mentally ill.
> any human being who hallucinated or confabulated with as much regularity as LLMs would be considered severely mentally ill.
ie, it is common enough that we have a label for it. And the stats on how many people have a mental illness are not encouraging. If you put a little fence around the people hallucinating and dehumanise them then sure, humans don't hallucinate. The problem with that argument is they are actually still people.
> ie, it is common enough that we have a label for it.
Having a label for something doesn't imply that it's common. We have labels for plenty of rare things as well.
Also, "mental illness" is a far more broad category than what's being discussed, which is specifically symptoms that resemble the hallucinations and confabulations of LLMs, at the frequency with which LLMs display them. Most mental illness doesn't involve hallucinations or confabulations That is not common in humans, in LLMs it's normal.
> If you put a little fence around the people hallucinating and dehumanise them then sure, humans don't hallucinate.
I'm not dehumanizing anyone. This isn't a rational argument; it's just an ad hominem.
> The problem with that argument is they are actually still people.
The problem is that this isn't the argument, and you can't attack the actual argument on its merits.
The simple, plain, demonstrable, non-prejudiced fact is that LLMs confabulate and hallucinate far more than human beings. About 17% to 38% of normal, healthy people experience at least one visual hallucination in their lifetime. But hearing voices and seeing things, alone, still isn't what we're talking about. A healthy, rational human can recognize when they see something that isn't supposed to be there; their concept of reality and their ability to judge it don't change. When that ability does break down, that is schizophrenia, which would more accurately model what happens with LLMs. About 24 million people have schizophrenia - 0.32% of the population. And not even all schizophrenics experience the degree of reality dysfunction present in LLMs.
You are claiming that, in essence, all human beings have dementia and schizophrenia, and exhibit the worst-case symptoms all the time. If that were true, we wouldn't even be able to maintain the coherence necessary to create an organized, much less technological, society. And you're claiming that the only reason to believe otherwise must be bigotry against the mentally ill. Even your assertion upthread, that "a lot of the most intelligent humans turn out to be crackpots," isn't true.
Stop it. Stop white knighting software. Stop normalizing the premise that it isn't worth being concerned about the negative externalities of LLMs because humans are always worse, and thus deserve the consequences. It's the same attitude that leads people to say it doesn't matter how many people autonomous cars kill because humans are categorically worse drivers anyway. I can't think of many attitudes more dehumanizing than that.
> I'm not dehumanizing anyone, this isn't a rational argument, it's just an ad hominem.
Well, you led with "The vast majority of humans - even intelligent humans - do not 'hallucinate like crazy'" and then followed up by identifying a vast category of humans who do, literally, hallucinate like crazy. Unless you want to argue that mental illness is actually the appropriate mindset for viewing the world, you probably want to include an argument for why you think it is OK to exclude them.
Humans hallucinate continuously. If you test them in any way, it is common to get nonsense answers. The difference is that it isn't polite to ask humans questions that expose the madness; people tend to shy away from topics that others routinely get wrong.
It is quite hard to explain a typical scholastic test without hallucinations, particularly the mistakes made in maths, spelling, and the sciences. It isn't like there is some other correct answer to a math problem that someone could be confused by; people just invent operations that don't exist when questioned.
> The simple, plain, demonstrable non-prejudiced fact is LLMs confabulate and hallucinate far more than human beings.
That isn't true; the opposite is true. Humans couldn't answer the breadth of questions an LLM does without making up substantially more garbage. The only reason it isn't more obvious to you is that we structure society around not pressuring humans to answer arbitrary questions that test their understanding.