It's also possible it isn't AGI-hard, and all you need is the ability to experiment with code along with a bit of agentic behavior.
An AI doesn't need embodiment, an understanding of physics or nature, or a lot of other things. It just needs to analyze and experiment with algorithms and get us that next 100x in effective compute.
LLMs are still missing enough of the spark of creativity for this to work, but that could be right around the corner.
It'll probably sit in the human-hybrid phase for longer than it did with chess, where the AI tools make the humans better and faster. But as long as the tools keep getting better at that, there's a strong flywheel effect.
From there, things will probably go very fast. Self-driving cars can't design themselves, but once AI gets good enough, it can.