It did take more than a few decades to design, however. I wouldn't really expect a new model to ship inside of forty years. At best, that's like expecting the next version of Windows to ship in the next three hours.
That's true only to the extent that machine intelligence evolves via the same processes that produced human intelligence. It almost certainly won't, so if we ever get to AGI, it will come from some other kind of evolution or design, quite possibly one that runs on a timescale far faster than biological evolution's.
IMO the reason the Singularity concept is popular these days is that we're the first generation that, barring a massive disaster that both derails Moore's Law and stops us from scaling parallel processing power, will own reasonably priced machines with more raw compute than the human brain. At that point it's only a matter of discovering the right algorithms to run.
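To put rough numbers on that claim (these are commonly cited Moravec/Kurzweil-style ballpark estimates plus an assumed round GPU figure, not anything from the post above):

```python
# Back-of-envelope comparison of brain vs. machine compute.
# All figures are rough, commonly cited estimates, not facts
# from this thread: ~1e11 neurons, ~1e4 synapses per neuron,
# and a generous ~100 Hz average firing rate.

neurons = 1e11
synapses_per_neuron = 1e4
firing_rate_hz = 1e2

# Upper-end estimate of "synaptic operations per second".
brain_ops = neurons * synapses_per_neuron * firing_rate_hz
print(f"Brain, rough upper bound: {brain_ops:.0e} ops/s")  # ~1e17

# A single modern GPU delivers on the order of 1e14 FLOP/s
# (assumed round number), so a small cluster is in the ballpark.
gpu_flops = 1e14
print(f"GPUs to match that bound: ~{brain_ops / gpu_flops:.0f}")
```

On those assumptions the remaining gap is a constant factor of hardware, which is exactly why "only a matter of algorithms" has started to sound plausible.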
Whether we're making progress on that is debatable, but don't mistake the slow progress at cracking "the" AI algorithm for the speed at which the intelligence explosion will take off afterward. It took many long, slow years to figure out how to build a nuclear weapon, but once we figured out the trick and started a nuclear reaction, the thing blew up in a fraction of a second. AI is likely to be very similar: it may take us a long time to get there, but once we're there, watch out...
once we figured out the trick and started a nuclear reaction, the thing blew up in a fraction of a second
Please, never use this metaphor again. This level of magical thinking is just embarrassing for all of us. It's like saying that building a house must be a really fast process, because burning the house down goes really quickly.
Nuclear weapons blow up easily because, for (e.g.) plutonium, "blowing up" is thermodynamically favorable. A brick of plutonium has much less entropy than a giant fireball and an expanding cloud of radioactive fallout. In a thermodynamic sense, the plutonium nuclei want to be fissioned. All you have to do is find a way to coax them out of their metastable state.
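To quantify "favorable": here's a quick sketch using the standard textbook figure of roughly 200 MeV released per Pu-239 fission (the numbers are mine, not the parent's):

```python
# Energy released if 1 kg of Pu-239 fissions completely.
# ~200 MeV per fission is the standard textbook figure.

AVOGADRO = 6.022e23
MOLAR_MASS_PU239_G = 239.0        # g/mol
MEV_TO_J = 1.602e-13              # joules per MeV
ENERGY_PER_FISSION_MEV = 200.0
KILOTON_TNT_J = 4.184e12

atoms_per_kg = (1000.0 / MOLAR_MASS_PU239_G) * AVOGADRO
energy_j = atoms_per_kg * ENERGY_PER_FISSION_MEV * MEV_TO_J

print(f"{energy_j:.1e} J ~= {energy_j / KILOTON_TNT_J:.0f} kt TNT")
# ~8.1e13 J, roughly 19 kt per kg: the energy is already sitting
# in the nuclei; the weapon only has to release it.
```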
A human brain is massively thermodynamically unfavorable. If you put a bunch of proteins in a box, the odds that they will self-assemble into Einstein's brain are, literally, astronomically small.
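To make "astronomically small" concrete, here's a deliberately crude toy model; every number in it is hypothetical and chosen only for scale, not real biophysics:

```python
import math

# Toy model: a "brain" of N components, each of which must land
# in one specific state out of M. Both numbers are hypothetical,
# chosen only to show the scale; this is not real biophysics.
N = 1e11   # on the order of the brain's neuron count
M = 10     # absurdly generous: only 10 states per component

# P(random assembly) = (1/M)**N, so log10(P) = -N * log10(M).
log10_p = -N * math.log10(M)
print(f"log10(probability) ~ {log10_p:.0e}")  # ~ -1e11

# For comparison, the ~1e80 atoms in the observable universe
# amount to log10 ~ 80. "Astronomically small" undersells it.
```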
That's why the brain is a miracle and a nuclear reaction is, frankly, pretty commonplace. There are a lot of stars. Stars are easy to explain. Brains are hard.