> Jerry Dischler said this was not a “hallucination” – where AI systems invent untrue information – but rather a reflection of the fact that the untrue information is contained in the websites that Gemini scrapes.
So LLMs, like any other computer system, suffer from "Garbage In, Garbage Out".