The superficial view: "they hallucinate"

The underlying cause: 3rd order ignorance:

3rd Order Ignorance (3OI)—Lack of Process. I have 3OI when I don't know a suitably efficient way to find out I don't know that I don't know something. This is lack of process, and it presents me with a major problem: If I have 3OI, I don't know of a way to find out there are things I don't know that I don't know.

- not from an LLM

My process: use LLMs and see what I can do with them while taking their output with a grain of salt.

But the issue of the structural fault remains. Stating the phenomenon (hallucination) is not "superficial"; in this context, pointing to the root cause adds no value.

Symptom: the response was "Use the `solvetheproblem` command."

Cause: the model has no way to know that there is no `solvetheproblem` command.

Alarm: this suggests it guesses at a plausible world wherever it lacks knowledge and data.

Fault: it should hold a store of what it takes to be facts, and it should have been built to predict the world more faithfully to those facts.
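
A minimal sketch of the kind of fact check this implies, assuming the suggestion is a shell command: before trusting the model, verify that the command it proposes exists at all. The `validate_suggestion` helper here is hypothetical, not part of any real tool.

    import shutil

    def validate_suggestion(suggestion: str) -> bool:
        # Ground the model's output against one checkable fact:
        # does the proposed command exist on this system?
        tokens = suggestion.split()
        return bool(tokens) and shutil.which(tokens[0]) is not None

    # The hallucinated suggestion from the comment above.
    print(validate_suggestion("solvetheproblem --fix"))  # False: no such command
    print(validate_suggestion("ls -la"))                 # True on most systems

Even this crude check would catch the `solvetheproblem` case; validating flags and arguments against real documentation is the harder part.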
