Hacker News

>the human brain is just an LLM

Never have I said such a thing. Also, BPEs aren't a natural limitation of transformer-based models; they're a trick to save compute for LLMs.
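For context, byte-pair encoding builds its vocabulary by repeatedly merging the most frequent adjacent pair of symbols, starting from characters. A minimal sketch of that greedy merge loop (the example string and merge count below are made up, not taken from any real tokenizer):

```python
from collections import Counter

def most_frequent_pair(tokens):
    """Count adjacent symbol pairs and return the most common one."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get)

def merge_pair(tokens, pair):
    """Replace every occurrence of `pair` with one merged symbol."""
    merged, out, i = pair[0] + pair[1], [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            out.append(merged)
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

def bpe(text, num_merges):
    """Start from characters; greedily merge the most frequent pair."""
    tokens = list(text)
    for _ in range(num_merges):
        if len(tokens) < 2:
            break
        tokens = merge_pair(tokens, most_frequent_pair(tokens))
    return tokens

# One merge on "aaabdaaabac": ('a', 'a') is most frequent, so runs of
# "aa" collapse into a single symbol.
```

The point being that this is a preprocessing choice (fewer, coarser tokens means fewer forward passes per document), not something the transformer architecture itself requires; a transformer can just as well attend over raw characters or bytes.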



So what is this supposed to mean?

> We have no good definition for what sentience is either; maybe our brain is word association shenanigans;

To me, it sure sounds like you're suggesting the brain is an LLM, and I can't blame bonsaibilly for thinking that.


Keeping track of different usernames is hard, and I can't blame you for confusing me with the other guy :)

Snark aside, I think he does have a point: afaik, we don't know what intelligence is, so it's kinda hard to make any argument about fundamental differences between "true" intelligence and LLM intelligence. I do feel like there should be something else. I saw somebody describe their mind as consisting of a "babbler" and a "critic" (a GAN, basically). The LLM would be the babbler, while the critic is not yet implemented, and this sounds intuitively right to me.

Then again, my intuition could be completely wrong, and we may be able to get to human-level intelligence with further scaling; I haven't seen any solid counterarguments yet. Even the biggest LLM believers don't deny that an LLM isn't exactly the same thing as a human brain; the question is whether it captures the gist of it.
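The babbler/critic split could be sketched as a generate-then-rank loop: one component free-associates candidates, the other scores them and keeps the best. Everything here is hypothetical (the filler phrases, the toy scoring heuristic); it's only meant to show the shape of the idea, not any real system:

```python
import random

def babbler(prompt, n_candidates, rng):
    """Hypothetical 'babbler': free-associates continuations of a prompt.
    Stands in for an LLM's raw next-text proposals."""
    fillers = ["so it goes", "which proves nothing", "and that settles it",
               "because reasons", "as everyone knows"]
    return [f"{prompt} {rng.choice(fillers)}" for _ in range(n_candidates)]

def critic(candidate):
    """Hypothetical 'critic': scores a candidate. This toy heuristic just
    penalizes overclaiming phrases; higher is better."""
    overclaims = ("settles it", "proves", "everyone knows")
    return -sum(phrase in candidate for phrase in overclaims)

def babble_and_criticize(prompt, n_candidates=5, seed=0):
    """Generate-then-rank: the critic picks the babbler's best proposal."""
    rng = random.Random(seed)
    candidates = babbler(prompt, n_candidates, rng)
    return max(candidates, key=critic)
```

In GAN terms the babbler plays the generator and the critic the discriminator; in current practice the closest analogues are reranking sampled outputs or training against a learned reward model, rather than a second network wired in at inference time.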



