> The best information we have now is that if we create AGI/ASI at this time, we all die.
We can still unplug or turn off these things. We are still very far away from a situation where an AI controls factories and a full supply chain and could take physical control of the world.
Meanwhile, every giant AI company: "yeah we're looking at robotics, obviously if we could embody these things and give them agency in the physical world that would be a great achievement"
Our rush into AI and embodiment reminds me of the lily pad exponential growth parable.
>Imagine a large pond that is completely empty except for 1 lily pad. The lily pads will grow exponentially and cover the entire pond in 3 years. In other words, after 1 month there will be 2 lily pads, after 2 months there will be 4, etc. The pond is fully covered in 36 months.
We're all going to be sitting around at month 34, when the pond is only a quarter covered, saying "Look, it's been years and AI hasn't taken over that much of the market."
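The parable's arithmetic is easy to check. A minimal sketch (the `coverage` helper is illustrative, not from the original): if the patch doubles monthly and fills the pond at month 36, coverage at month m is 2^(m − 36).

```python
def coverage(month, full_month=36):
    """Fraction of the pond covered at a given month, doubling monthly."""
    return 2 ** (month - full_month)

# Even two months before the end, three quarters of the pond is open water.
print(coverage(34))  # 0.25 -- a quarter covered at month 34
print(coverage(35))  # 0.5  -- half covered with one month left
print(coverage(36))  # 1    -- fully covered
```

That is the point of the parable: under exponential growth, "mostly empty" and "completely full" are only a couple of doublings apart.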
I mean, you spend a lot of time in your own life denying the inevitable; humans spend a lot of time and effort avoiding their own personal extinction.
>The best chance is still
The best information we have now is that if we create AGI/ASI at this time, we all die. The only winning move is not to play that game.