Hacker News

Then you are simply rejecting any attempt to refine the definition of AGI. I already linked to the Google DeepMind paper; the definition is actively debated in the AI research community. As I explained, that definition is too limited because it doesn't capture the intermediate stages. It may describe the end goal, but obviously there will be stages in between.

> No, you can't, unless you're pre-supposing that LLMs work like human minds.

You are missing the point. If you reduce LLMs to "word autocompletion", you completely ignore the attention mechanism and the internal conceptual representations. These systems are deep learning models with hundreds of layers and trillions of weights. If you ignore all of that, then by the same reasoning (ignoring the complexity of the human brain) we could say that people are just auto-completing words when they speak.
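To make the "not just autocompletion" point concrete: at the core of these models is scaled dot-product attention, where every token's representation is updated as a weighted mixture of all other tokens' representations. This is a minimal NumPy sketch of that one operation (the full models stack hundreds of such layers with learned projections); the shapes and random inputs here are illustrative, not from any real model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays of query/key/value vectors."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # pairwise query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: rows sum to 1
    return weights @ V                               # each output mixes all values

# Self-attention over 4 toy "tokens" with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (4, 8)
```

Every output row depends on the entire input sequence at once, which is already qualitatively different from a next-character lookup table.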






> I already linked to the Google DeepMind paper. The definition is being debated in the AI research community

Sure, Google wants to redefine AGI so it looks like things that aren’t AGI can be branded as such. That definition is, correctly in my opinion, being called out as bullshit.

> obviously there will be stages in between

We don’t know what the stages are. Folks in the 80s were similarly selling their expert systems as a stage to AGI. “Emerging AGI” is a bullshit term.

> If you reduce LLMs to "word autocompletion" then you completely ignore the the attention mechanism and conceptual internal representations. These systems have deep learning models with hundreds of layers and trillions of weights

Fair enough, granted.


> Sure, Google wants to redefine AGI

It is not a redefinition; it's a refinement: a classification scheme for levels of AGI systems.

Other researchers are also trying to classify AGI systems; it's not just Google. And there is no universally agreed definition of AGI to redefine in the first place.

> We don’t know what the stages are. Folks in the 80s were similarly selling their expert systems as a stage to AGI. “Emerging AGI” is a bullshit term.

Generalization is a formal concept in machine learning, and generalized learning performance comes in degrees: it is measurable, so the performance of different systems can be compared.
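One standard way generalization is made measurable: compare error on the training data with error on held-out data; the difference is the generalization gap. This toy polynomial-regression sketch (synthetic data, arbitrary degrees, not tied to any AGI benchmark) shows the gap growing as a model fits its training set more tightly.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 40)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(40)
x_tr, y_tr = x[:30], y[:30]   # training split
x_te, y_te = x[30:], y[30:]   # held-out split

results = {}
for degree in (1, 3, 9):
    coeffs = np.polyfit(x_tr, y_tr, degree)                  # least-squares fit
    err_tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)  # training MSE
    err_te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)  # held-out MSE
    results[degree] = (err_tr, err_te)
    print(f"degree={degree}  train={err_tr:.3f}  test={err_te:.3f}  "
          f"gap={err_te - err_tr:.3f}")
```

The same train-vs-held-out comparison underlies the benchmark suites used to rank today's large models, which is why "degrees of generalization" is a quantitative claim rather than marketing.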



