> I'd be prepared to argue that most humans aren't guessing most of the time.
Research suggests otherwise[1]. Action seems to be driven largely by intuition or other non-verbal processes in the brain, with rationalization happening post-hoc.
I've figured for an age that this is because consciously reasoning through anything using language as a tool takes time, whereas survival requires me to react to the attacking tiger immediately.
If you believe that physics describes the rules by which the universe operates, then there's literally nothing in the universe that a large and fast enough computer can't emulate.
Intuition is a guess based on experience. That sounds an awful lot to me like what LLMs are doing. They've even been shown to rationalize post-hoc, just as humans do.
Humans have incorrectly claimed to be exceptional among all of creation since forever. I don't expect we'll stop any time soon, as there's no consequence to suffer.
It's perfectly obvious to me that physics (excluding quantum physics, which is only scratching the surface) can't describe consciousness. If it could, it would have by now.
> I'd be prepared to argue that most humans aren't guessing most of the time.
Almost everything we do is just an educated guess. The probability of it being correct is a function of our education (for whatever kind of education is applicable).
For example: I guess that when I get out of bed in the morning, my ankles will support my weight. They might not, but for most people, that answer is only ever going to be their best guess.
It's easy to see this process in action among young children, as another example. They're not born knowing that they won't fall over when they run: first they start assuming they can run safely, then they discover skinned knees and hands.
> I'd be prepared to argue that most humans aren't guessing most of the time.
Honestly interested in your arguments here. While unprepared, I'd actually guess the opposite: that most people are guessing most of the time.
There are plenty of things I know that have nothing to do with guessing.
I understand the incentives to pretend these algorithms are even approaching humans in overall capability, but reducing human experience like this is embarrassing to watch.
Go do some hallucinogens, meditate, explore the limits a tiny bit; then we can have an informed discussion.
> I understand the incentives to pretend these algorithms are even approaching humans in overall capability, but reducing human experience like this is embarrassing to watch.
Seems like you were very much guessing what I believe. And you were not right.
I don't agree with the people who think LLMs are close to human-level anything. But I do believe that many people smarter than me, like you, who I agree with for the most part, underestimate how much of what we do and believe is the result of massive, yet still mere, information processing, and that most of what brought us this far is instinct. The brain is good at providing stories to make us feel in control. But given enough human experience and time, one will be surprised at what artificial systems can emulate. Not to mention how much of human behaviour is emulation itself.