Take a look at the curve of flight speed over time sometime, and ask yourself what you think of it now, and what you would have thought of it during the initial era of powered flight.
Maybe a different analogy will make my point better. Compare rocket technology with jet engine technology. Both continued to progress across a vaguely comparable time period, but at no point was one a substitute for the other except in some highly specialized (mostly military-related) cases. It is very clear that language models are very good at something. But are they, to use the analogy, the rocket engine or the jet engine?
I doubt that's sustained exponential growth. As far as I know, there is no power law that would explain it, and from a computational complexity standpoint it doesn't seem possible.
See https://www.lesswrong.com/posts/J6gktpSgYoyq5q3Au/benchmarki.... The short answer is that Elo grows roughly linearly with evaluation depth, but since the game tree is exponential in depth, linear Elo growth requires exponentially growing compute. The main algorithmic improvements are things that let you shrink the effective branching factor, and as long as you can keep shrinking it, you keep getting exponential improvements. SF15 has an effective branching factor of roughly 1.6. Sure, the exponential growth won't last forever, but it's been surprisingly resilient for at least 30 years.
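To make the arithmetic concrete, here's a back-of-the-envelope sketch (the 1.6 figure is the one quoted above; the other branching-factor values are just my own illustrative assumptions):

    # Back-of-the-envelope: if Elo gains are roughly linear in search depth,
    # and the searched tree grows like b^depth, then each fixed Elo increment
    # costs a multiplicative factor of b in compute.

    def nodes_searched(depth: int, branching_factor: float) -> float:
        """Approximate node count for a search to the given depth."""
        return branching_factor ** depth

    # 35 ~ raw chess branching factor, 1.6 ~ the effective figure quoted
    # above for SF15; 6 is just an illustrative in-between value.
    for b in (35.0, 6.0, 1.6):
        print(f"b={b:>4}: depth 20 ~ {nodes_searched(20, b):.2e} nodes,"
              f" depth 30 ~ {nodes_searched(30, b):.2e} nodes")

The point of the toy numbers: cutting the effective branching factor is worth far more than raw hardware, because it changes the base of the exponent rather than buying a constant factor.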
It wouldn't have been possible if there hadn't been exponential growth in computing resources over the past decades. That growth has already slowed, and the prospects for the future are unclear. And regarding the branching factor, the improvements must eventually converge towards an asymptote.
The more general point is that you always end up with an S-curve rather than the limitless exponential growth suggested by Kaibeezy. And with AI we simply don't know how far off the inflection point is.
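A toy way to see why the distinction matters: a logistic curve with a growth ceiling looks just like an exponential until you approach the inflection point (the ceiling and rate below are arbitrary assumptions, purely for illustration).

    # Minimal sketch: a logistic (S-) curve is indistinguishable from an
    # exponential while you're well below the inflection point.
    import math

    def exponential(t: float, rate: float = 1.0) -> float:
        return math.exp(rate * t)

    def logistic(t: float, rate: float = 1.0, ceiling: float = 1e6) -> float:
        # Same early growth rate, but capped at `ceiling` (an assumed limit).
        return ceiling / (1.0 + (ceiling - 1.0) * math.exp(-rate * t))

    for t in (1, 5, 10, 15, 20):
        print(f"t={t:>2}: exp={exponential(t):.3g}  logistic={logistic(t):.3g}")

Early on the two columns track each other almost exactly; only near the ceiling do they diverge, which is exactly why you can't tell from inside the curve how far away the inflection point is.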
For how long can we keep upping our game? GPT-4 came less than half a year after ChatGPT. What will come in 5 years? What will come in 50?