Self-sustained and totally independent mental capacity in an IT system... The ability to create and store memory and to reason on its own... This definition is not mine, and it's also a lot more vast... If you look up Spielberg's AI, I, Robot, Terminator, or any of the other films or books on the matter, the definition is out there.
Use of the word "Intelligence" in Artificial Intelligence implies that humans are not involved in the equation past the point of initial creation, and that the system sustains itself and grows on its own after a point... So far the various GPT models rely solely on human intervention and updates, which is why it's bewildering to some, like me, that they're being marketed as AI.
If you read some of the studies of these new LLMs you'll find pretty compelling evidence that they do have a world model. They still get things wrong but they can also correctly identify relationships and real world concepts with startling accuracy.
It fails at _some_ arithmetic. Humans also fail at arithmetic...
In any case, is that the defining characteristic of having a good enough "world model"? What distinguishes your ability to understand the world from an LLM's? From my perspective, you would prove it by explaining it to me, in much the same way an LLM could.