Hacker News

Having a world model


If you read some of the studies of these new LLMs, you'll find pretty compelling evidence that they do have a world model. They still get things wrong, but they can also correctly identify relationships and real-world concepts with startling accuracy.


No, they don't. They fail at arithmetic, ffs.


They fail at _some_ arithmetic. Humans also fail at arithmetic...

In any case, is that the defining characteristic of having a good enough "world model"? What distinguishes your ability to understand the world from an LLM's? From my perspective, you would prove it by explaining it to me, in much the same way an LLM could.



