Apparently, the rabbit hole goes deeper. The Wayback Machine puts the "AI Generated" description on the cheese website at August 7th, 2020. The AI didn't hallucinate anything because it didn't generate anything; the entire premise is simply fake.
> The AI didn't hallucinate anything because it didn't generate anything, the entire premise is simply fake.
The article literally says it's not a hallucination and that the detail came from real websites.
"Google executive Jerry Dischler said this was not a “hallucination” – where AI systems invent untrue information – but rather a reflection of the fact the untrue information is contained in the websites that Gemini scrapes..."
> The article literally says it's not a hallucination and that the detail came from real websites.
The "hallucination" term generally refers to any made-up facts. Harsh as it may be to put this weight of responsibility on LLMs, users generally work with them in the expectation that what they say is true and has been (in some magic hand-wavy way) cross-checked or confirmed as factual. Instead, they simply print out whatever is most likely to follow the user's input, based on the training data.
Unfortunately, a vast amount of that training corpus is social media posts, which can't be relied upon to be true. But if something gets repeated enough, it's treated as true, in the sense that "what does 'salary' mean?" is generally followed by a billion social media posts saying "it refers to the time when Roman soldiers were paid in salt, because salt was a currency at the time".
Ok. None of what you said points to the commenter having discovered something novel or the other article being better. It's already stated in the OP article that the problem is caused by the internet containing false information.
> Apparently, the rabbit hole goes deeper.
But it doesn't go deeper than what's already in the article. The article already talks about how the problem is that the internet contains a bunch of misinformation and the LLM is as credulous as the average human, which is to say extremely so.
> The article literally says it's not a hallucination and that the detail came from real websites
The article says that's what a Google exec claims, not that it's actually the case. They haven't pointed to any of those websites, and we don't have to take them at their word.
Someone further down pointed to a source on cheese.com, which says Gouda makes up 50% to 60% of all global consumption of Dutch cheese. If that source is accurate, the AI hallucinated an incorrect response.
https://web.archive.org/web/20200807133049/https://www.wisco...
The (edited) cheese ad: https://www.youtube.com/watch?v=I18TD4GON8g
What probably should be the target link: https://www.theverge.com/news/608188/google-fake-gemini-ai-o...