Going from "oh my goodness, this is intelligent" fade to "oh, it's just predicting text responses"

Eventually your father will reach the third stage: "Uh, wait, that's all we do." You will then have to pry open the next niche in your god-of-the-gaps reasoning.

The advent of GPT has forced me to face an uncomfortable (yet somehow liberating) fact: we're just plain not that special.



Haha, I think he's already at that point with respect to humanity. All my childhood he impressed upon us that we're not special, that only hard work and dedication will get you somewhere in life.

It's a small leap to apply that to general intelligence, I would think.

You are right though, we are coming closer and closer to deciphering the machinery of our psyches. One day we'll know fully what it is that makes us tick. When we do, it will seem obvious and boring, just like all the other profound developments of our time.


We reflect, we change, we grow. We have so many other senses that contribute to our "humanness". If you listen to and enjoy music, tell me how those feelings are just "predictive text responses".

Communication is one part of being human. A big part for sure, but only one of many.


What is the qualitative difference between one type of perception and the other?

“Text” is tokens. Tokens are abstract and can be anything. Anything that has structure can be modeled. Which is to say, all of reality.
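
To make "tokens are abstract" concrete, here's a toy sketch (the vocab entries are invented purely for illustration): a token is just an integer ID, and the ID can stand for a word, an image patch, an audio frame, anything you can enumerate.

    # Toy illustration: a "token" is just an integer ID over any symbol set.
    # The symbols could be words, image patches, audio frames, etc.
    vocab = {"the": 0, "cat": 1, "<img_patch_37>": 2, "<audio_frame_9>": 3}

    def tokenize(symbols):
        # Map a sequence of arbitrary symbols to abstract token IDs.
        return [vocab[s] for s in symbols]

    print(tokenize(["the", "cat", "<img_patch_37>"]))  # -> [0, 1, 2]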

We have a lot of senses indeed. "Multimodal", I believe it's called in ML jargon.

I don’t know where enjoyment itself comes from. I like to think it’s some system somewhere getting rewarded for predicting the next perception correctly.

Qualia are kind of hard to pin down, as I’m sure you know.


Yes, wholly agree. The special part is in language. Both humans and AIs rely massively on language. No wonder AIs can spontaneously solve so many tasks. The secret is in those trillion training tokens, not in the neural architecture. Any neural net will do; even RNNs work (RWKV). People are still hung up on the "next token prediction" paradigm and completely forget the training corpus. It reflects a huge slice of our mental life.
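
For anyone who hasn't seen it spelled out, here's what "next token prediction" boils down to, as a toy bigram model in plain Python (purely illustrative; real LLMs learn this distribution with a neural net over trillions of tokens):

    # Toy "next token prediction": a bigram model learned from counts.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat the cat ran".split()

    # "Training": count which token follows which.
    counts = defaultdict(Counter)
    for cur, nxt in zip(corpus, corpus[1:]):
        counts[cur][nxt] += 1

    def predict_next(token):
        # Return the most frequently observed next token.
        following = counts.get(token)
        return following.most_common(1)[0][0] if following else None

    print(predict_next("the"))  # -> "cat" (seen twice after "the")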

People and LLMs are just fertile land where language can make a home and multiply. But it comes from far away and travels far beyond us. It is a self-replicator and an evolutionary process.



