Hacker News

I don’t know that it needs to experience the world in real-time, but when the brain thinks about things it’s updating its own weights. I don’t think attention is a sufficient replacement for that mechanism.

Reasoning LLMs feel like an attempt to stuff the context window with additional thoughts, which does influence the output, but is still a proxy for plasticity and the aha-moments it can generate.
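To make the distinction concrete, here's a toy sketch (all names and numbers are illustrative, not any real LLM API) of the two mechanisms: a reasoning LLM keeps its weights frozen and only grows the context, while plasticity changes the parameters themselves, so the change persists even after the context is cleared.

```python
# Toy contrast: "reasoning in context" (frozen weights, longer input)
# vs. plasticity (the parameters themselves change).
# Everything here is made up for illustration.

def frozen_model(weights, tokens):
    # Output depends on the weights and everything in the context window.
    return sum(w * t for w, t in zip(weights, tokens))

weights = [0.5, -0.2, 0.1]
base_context = [1.0, 2.0]

# Reasoning-style: stuff an extra "thought" token into the context.
with_thoughts = base_context + [3.0]      # context grows, weights don't
out_reasoning = frozen_model(weights, with_thoughts)

# Plasticity-style: an aha-moment updates the weights themselves.
learning_rate = 0.1
gradient = [0.2, 0.1, 0.0]                # made-up gradient for illustration
weights = [w - learning_rate * g for w, g in zip(weights, gradient)]
out_after_update = frozen_model(weights, base_context + [3.0])
```

The first path only changes what the model is conditioned on; the second changes what the model *is*, which is the part the comment argues attention alone doesn't replace.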



>I think this is true only if there is a novel solution that is in a drastically different direction than similar efforts that came before.

That's a good point; we don't do that right now. It's all very crystallized.



