
The elephant in the room here is that these LLMs still have problems with hallucinations. Even if it's only 1% or even 0.1% of the time, that's still a huge problem. You could have someone go their whole life believing something they were confidently taught by an AI that is completely wrong.

Teachers should be very careful about using a vanilla LLM for education without some kind of extra guardrail or verification step.
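To make that concrete, here's a minimal sketch of what one such guardrail could look like for arithmetic: never surface the model's number without recomputing it deterministically. This is just an illustration, and ask_llm() is a hypothetical stand-in for whatever model API you actually use:

  # Minimal verification guardrail for arithmetic answers.
  # ask_llm() is a hypothetical stand-in for your model API.
  import ast
  import operator

  OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
         ast.Mult: operator.mul, ast.Div: operator.truediv}

  def safe_eval(expr):
      # Evaluate a basic arithmetic expression without eval().
      def walk(node):
          if isinstance(node, ast.BinOp) and type(node.op) in OPS:
              return OPS[type(node.op)](walk(node.left), walk(node.right))
          if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
              return node.value
          raise ValueError("unsupported expression")
      return walk(ast.parse(expr, mode="eval").body)

  def checked_answer(expr):
      claimed = float(ask_llm("Compute %s. Reply with only the number." % expr))
      actual = safe_eval(expr)
      if abs(claimed - actual) > 1e-9:
          return "model said %s, but %s = %s -- flag for review" % (claimed, expr, actual)
      return str(actual)

Obviously this only works for questions with a checkable answer; for open-ended material you'd need a human in the loop.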




This is also the case when students are taught by any educator who happens to trust whatever source they looked up. The internet, textbooks, and even scientific articles can all be factually incorrect.

GNNs (of which LLMs are a subclass) have the potential to be optimized in such a way that all the knowledge contained within them remains as parsimonious as possible. This is not the case for a human reading some internet article in a field where they have not built up extensive context.

There are plenty of people who strongly believe in strange ideas that were taught to them by some 4th grade teacher and were never corrected over their lives.

While your statements are correct in this minuscule snapshot of time, it's exceedingly short-sighted to assert that language modeling is to be avoided due to some issues that exist this month, and to disregard the clear improvements that will come very soon.


Damn, I'd have loved it if my teachers had only hallucinated 1% of the time. Instead we had southern Baptist football coaches attempting to teach us science... poorly.


> The elephant in the room here is that these LLMs still have problems with hallucinations. Even if it's only 1% or even 0.1% of the time, that's still a huge problem.

If you heard the bullshit that actual teachers say (both inside and outside of class), you would think that “1% hallucinations” would be a godsend.

Don’t get me wrong, some teachers are amazing and have a “hallucination rate” that is 0% or close to it (mainly by being willing to say they don’t know or they need to look something up), but these folks are the exceptions.

Education as a whole attracts a decidedly mediocre group of minds who sometimes (often?) develop god complexes.


My middle school history teacher hallucinated much more than 1%. Much more than 10%, really. He was so bad that I needed to "relearn" history in high school.


In my experience, it's sometimes 100% of the time, even after repeated attempts to correct it with more specific prompts. Even on simple problems involving division or multiples of numbers from 1 to 10, with one additional operation.


What does "sometimes 100% of the time" mean exactly? You seem to be taking the "30% of the time it works every time" joke a bit literally.


The parent post probably means that hallucination isn't a random chance independent of the question: while for some (many!) types of questions the hallucination rate is low, there exist questions, or groups of questions, for which it will systematically provide misleading information.
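A toy sketch of the shape of the problem, with numbers entirely made up for illustration:

  # How a low aggregate error rate can hide a category the model
  # fails on almost every time. Numbers are invented for illustration.
  from collections import defaultdict

  # (question_category, answer_was_correct) pairs from some hypothetical eval
  results = ([("history", True)] * 990 + [("history", False)] * 10
             + [("unit conversion", False)] * 9 + [("unit conversion", True)])

  totals, errors = defaultdict(int), defaultdict(int)
  for category, correct in results:
      totals[category] += 1
      errors[category] += (not correct)

  overall = sum(errors.values()) / sum(totals.values())
  print("overall error rate: %.1f%%" % (100 * overall))   # ~1.9%
  for category in totals:
      rate = errors[category] / totals[category]
      print("%s: %.1f%%" % (category, 100 * rate))  # history 1.0%, unit conversion 90.0%

With per-type failure modes like that, a headline accuracy number tells a teacher very little about whether the model is safe for their particular lesson.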





