
You may be interested in "Deterministic Quoting" [1]. It doesn't completely "solve" hallucinations, but I would argue it gets us to "good enough" for several applications.

Disclosure: I'm the author of [1].

[1] https://mattyyeung.github.io/deterministic-quoting
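A minimal sketch of the core idea, with illustrative names and placeholder syntax (not the actual implementation from the post): the LLM never writes quoted text itself, it only picks which retrieved chunk to cite, and the application substitutes the verbatim text from the database, so anything that appears between quote marks is guaranteed to exist word-for-word in the source.

    import re

    # Source chunks retrieved for the prompt, keyed by ID. The LLM never
    # copies their text itself; it only refers to them by ID.
    CHUNKS = {
        "C1": "The patient was prescribed 5 mg of warfarin daily.",
        "C2": "Dosage was reduced to 2.5 mg after the INR exceeded 4.0.",
    }

    def render(llm_output: str, chunks: dict[str, str]) -> str:
        """Replace {QUOTE:ID} placeholders emitted by the LLM with the
        verbatim chunk text from the database. Quoted text therefore
        cannot be hallucinated -- at worst the LLM picks an irrelevant chunk."""
        def substitute(match: re.Match) -> str:
            chunk_id = match.group(1)
            return f'"{chunks[chunk_id]}"'
        return re.sub(r"\{QUOTE:(\w+)\}", substitute, llm_output)

    # Hypothetical LLM output: free prose plus placeholders, no quoted text.
    llm_output = "The records show the initial dose ({QUOTE:C1}) and the later adjustment ({QUOTE:C2})."
    print(render(llm_output, CHUNKS))

So the prose around the quotes can still be wrong or misleading, but the quoted evidence itself is verbatim, which is the property we rely on in the applications discussed in [1].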




I've seen this approach before.

It's the "yes, we hallucinate, but don't worry, we provide the sources for users to check" argument.

Even though everyone knows users will never check unless the hallucination is egregious.

It's such a disingenuous way of handling this.



