Where singularity = something comes along that's so advanced we can't understand or predict or keep up with it, because it's so far beyond us and changing far faster than our ape brains can perceive, and (hopefully) it brings us along for the ride.
By that definition, I wonder if we've already surpassed that point. Things on the horizon certainly feel hazier to me, at least. I think a lot of people were surprised by the effectiveness of the various GPTs, for example. And even hard science fiction is kinda broken: humans piloting spaceships seems highly unlikely, right? But it's a common occurrence there.
The idea is that eventually we build something that, when it plateaus, builds its own successor. That's the singularity: when the thing in question builds its successor, and that builds its successor, and this happens far outside our ability to understand or keep up.
Can GPT9 build GPT10, with zero human input?
I’d give 50/50 odds it can.
Can GPT15 build something that isn’t a large language model and is far superior in every way?
I’d give 50/50 odds it can.
Can both of the above steps happen within one trip around the sun of each other?
I’d give 50/50 odds they can.
Because at some point these models won't need humans to interact with them. Humans are very slow; that's the bottleneck.
They’ll simply interact with their own previous iterations or with custom-instantiated training models they design themselves. No more human-perceptible timescale bottlenecks.
Well, for Homo sapiens the odds are probably a hundredth or a thousandth of that.
It’s 50/50 that in 150 years some version of our descendants will exist, i.e. something to which you can trace a direct line from Homo sapiens. Say, a Homo sapiens in a different substrate, like “human on a chip”.
The thing is, if you can get “human on a chip”, then you can probably also get “something different and better than a human on a chip”, so why bother.
By the 24th century there’ll be no Homo sapiens Captain Picard exploring the quadrant in a gigantic ship that needs chairs, view screens, artificial gravity, oxygen, toilets and a bar. That’s an unlikely future for our species.
More likely whatever replaces the thing that replaces the thing that replaced us won’t know or care about us, much less need or want us around.
I honestly don't think it will be quite like that, at least not terribly soon. There is so much work being done to hook LLMs up to external sources of data, let them build longer-term memories of their interactions, etc. Each of these areas is going to have massive room to implement competing solutions, and even more room for optimization.
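To make the "longer-term memory" part concrete, here's a toy sketch of what that layer can look like: store past exchanges and pull the most relevant ones back in before each new prompt. This is my own illustration, not anyone's actual product; the call_llm function is a hypothetical stand-in for whatever model API you use, and the keyword-overlap retrieval is a crude placeholder for a real embedding/vector store.

```python
# Toy long-term memory for a chat loop: keep past exchanges and retrieve
# the most relevant ones to prepend to the next prompt.
from collections import Counter

memory: list[str] = []  # past exchanges, stored as plain text

def score(entry: str, query: str) -> int:
    """Crude relevance score: number of words the entry shares with the query."""
    return sum((Counter(entry.lower().split()) & Counter(query.lower().split())).values())

def recall(query: str, k: int = 3) -> list[str]:
    """Return the k stored exchanges that best match the query."""
    return sorted(memory, key=lambda e: score(e, query), reverse=True)[:k]

def call_llm(prompt: str) -> str:
    """Hypothetical model call; replace with a real API client."""
    return f"(model reply to: {prompt[:60]}...)"

def chat(user_message: str) -> str:
    # Build the prompt from recalled history, call the model, then remember the exchange.
    context = "\n".join(recall(user_message))
    prompt = f"Relevant history:\n{context}\n\nUser: {user_message}\nAssistant:"
    reply = call_llm(prompt)
    memory.append(f"User: {user_message}\nAssistant: {reply}")
    return reply
```

The point isn't the specifics; it's that every piece of this (what to store, how to retrieve, how to fit it into the context window) is still wide open, which is why I think there's so much optimization left before the "no humans needed" stage.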
> He was an uninformed crackpot with a poor understanding of statistics.
There's a lot you can say about Kurzweil being inaccurate in his predictions, but that is way too demeaning. Here's what Wikipedia has to say about him and the accolades he received:
> Kurzweil received the 1999 National Medal of Technology and Innovation, the United States' highest honor in technology, from then President Bill Clinton in a White House ceremony. He was the recipient of the $500,000 Lemelson-MIT Prize for 2001. He was elected a member of the National Academy of Engineering in 2001 for the application of technology to improve human-machine communication. In 2002 he was inducted into the National Inventors Hall of Fame, established by the U.S. Patent Office. He has received 21 honorary doctorates, and honors from three U.S. presidents. The Public Broadcasting Service (PBS) included Kurzweil as one of 16 "revolutionaries who made America" along with other inventors of the past two centuries. Inc. magazine ranked him No. 8 among the "most fascinating" entrepreneurs in the United States and called him "Edison's rightful heir".
I’ve been a Kurzweil supporter since high school, but to the wider world he was a crackpot (an inventor who should stick to his lane) who had made a couple of randomly lucky predictions.
He wasn’t taken seriously, especially not when he painted a future of spiritual machines.
Recently, on the Lex Fridman podcast, he himself said as much: his predictions seemed impossible and practically religious from the late 90s up until fairly recently, but now experts in the field are pulling in their estimates every year for when the Turing test will be passed.
Half of their projections are now coming in line with the guy they had dismissed for so long, and every year this gap narrows.
That would be my response but without the /s. Of course, depending on the definition it can always be said to be "happening", but to me it feels like the angle of the curve is finally over 45 degrees.