I don’t know that it needs to experience the world in real time, but when the brain thinks about things it’s updating its own weights. I don’t think attention is a sufficient replacement for that mechanism.
Reasoning LLMs feel like an attempt to stuff the context window with additional thoughts, which does influence the output, but it’s still a proxy for the plasticity and aha-moments that actual weight updates can generate.
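To make the contrast concrete, here's a toy PyTorch sketch (purely illustrative, with a stand-in model, not any real system's internals): in-context "reasoning" only grows the input while the weights stay frozen, whereas a gradient step folds the new information into the weights themselves.

```python
import torch
import torch.nn.functional as F

# Stand-in for an autoregressive LM over a toy vocabulary (hypothetical, for illustration).
vocab_size, dim = 100, 16
model = torch.nn.Sequential(
    torch.nn.Embedding(vocab_size, dim),
    torch.nn.Linear(dim, vocab_size),
)

def in_context_reasoning(prompt_ids: torch.Tensor, n_thought_tokens: int) -> torch.Tensor:
    """'Reasoning LLM' style: greedily sample extra thought tokens and append
    them to the context. The weights never change; only the input grows."""
    ctx = prompt_ids
    with torch.no_grad():
        for _ in range(n_thought_tokens):
            logits = model(ctx)[-1]                      # next-token logits
            ctx = torch.cat([ctx, logits.argmax().view(1)])
    return ctx  # longer context, identical weights

def plastic_update(prompt_ids: torch.Tensor, target_ids: torch.Tensor, lr: float = 1e-3):
    """Plasticity style: a gradient step writes the 'thought' into the weights,
    so it persists even after the context is gone."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss = F.cross_entropy(model(prompt_ids), target_ids)
    opt.zero_grad()
    loss.backward()
    opt.step()  # the model itself is now different

prompt = torch.randint(0, vocab_size, (5,))
longer_ctx = in_context_reasoning(prompt, n_thought_tokens=8)
plastic_update(prompt, target_ids=torch.randint(0, vocab_size, (5,)))
```

In the first function the "aha" evaporates as soon as the context window is cleared; in the second it's baked into the parameters, which is the distinction the comment is gesturing at.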