My first answer was a bit hasty, so let me try again:
We are clearly a product of our past experience (in LLMs this is called our dataset). If you go back to the beginning of our experiences, there is little identity, consciousness, or ability to reason. These things are learned indirectly (in LLMs this is called an emergent property). We don't learn indiscriminately; evolved instinct, social pressure, and culture guide and bias our data consumption (in LLMs this is called our weights).
I can't think of any other way our minds could work; on some level they must function like an LLM, with language perhaps supplemented by general data, but the principle being the same. Every new idea has been an abstraction or supposition built on someone's current dataset, which is why technological and general societal advancement has not been linear but closer to exponential.
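To make the mapping concrete, here's a toy sketch (entirely my own illustration; a bigram counter is nowhere near a real LLM, but the terms line up): the weights are nothing but compressed statistics of the dataset, and the generation behavior isn't spelled out anywhere in the code.

```python
# Toy "language model" illustrating dataset -> weights -> behavior.
# Not a real LLM; just the smallest thing the analogy fits.
from collections import defaultdict
import random

corpus = "the cat sat on the mat the dog sat on the rug".split()

# "Weights": purely a product of the dataset, nothing hand-coded.
weights = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    weights[prev][nxt] += 1

def generate(word, n=5):
    # "Emergent" behavior: nobody wrote a rule that cats sit on mats;
    # it falls out of whatever regularities the corpus contained.
    out = [word]
    for _ in range(n):
        options = weights[out[-1]]
        if not options:
            break
        out.append(random.choices(list(options), options.values())[0])
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the mat"
```

Scale the dataset and the parameter count up by many orders of magnitude and, as far as I can tell, the principle stays the same.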
Genes encode a ton of behaviors; you can't just ignore that. Tabula rasa doesn't exist among humans.
> If you go back to the beginning of our experiences, there is little identity, consciousness, or ability to reason.
That is because babies' brains aren't properly developed yet. There is nothing preventing a fully conscious being from being born; you see that among animals. A newborn foal, for example, is a fully functional animal. Genes encode the ability to move around, identify objects, follow other beings, avoid collisions, etc.
> Genes encode a ton of behaviors; you can't just ignore that.
I'm not ignoring that; I'm just saying that in LLMs we call these things weights. And I don't want to downplay the importance of weights; they're probably a significant part of the difference between us and other hominids.
But even if you considered some behaviors to be more akin to the server, interface, or preprocessing in an LLM deployment, it still wouldn't detract from the fact that the vast majority of what makes us autonomous, logical, sentient beings comes about through a process that is very similar to the core workings of LLMs. I'm also not saying that all animal brains function like LLMs, though that's an interesting thought to consider.
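To sketch the distinction (all names here are made up for illustration, not any real system's API): some fixed preprocessing and serving code wraps a learned core, and even if certain innate behaviors lived in the fixed part, the interesting work would still happen in the weights.

```python
# Illustrative split between fixed machinery and a learned core.

def preprocess(text):
    # Hardwired and never learned: the analogue of innate reflexes.
    return text.lower().split()

# Stand-in for trained parameters; in a real model these are billions
# of numbers produced by training, not a hand-written table.
learned_weights = {"hello": "world", "good": "morning"}

def model(tokens):
    # The learned core: its behavior comes entirely from training data.
    return learned_weights.get(tokens[-1], "?")

def serve(request):
    # Fixed serving/interface layer: plumbing, not cognition.
    return model(preprocess(request))

print(serve("Hello"))  # -> "world"
```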