This assumes that software products in the future will remain at the same complexity as they are today, just with AI building them out.
But they won’t. AI will enable building even more complex software, which, counterintuitively, will create a need for even more human jobs to deal with the added complexity.
Think about how, despite a growing number of free open-source libraries making powerful things easy over time, developer jobs have only increased, not decreased.
What part of the "general" in AGI do you not understand? There will be no new style of development that AGI is poorly suited for and that all the displaced developers can move to.
For true AGI (whatever that means; let's say something that fully replicates human abilities), discussing only "developers" is a drop in the bucket compared to all the knowledge-work jobs that will be displaced.
More likely they will tailor/RL-train these models to go after coders first, using RLHF with coders hired where labor is cheap. There are a number of reasons for this, of course:
- Faster product development on their side as they eat their own dogfood
- Devs are the biggest market for this tech during the transition period. That gives you revenue from direct and indirect subscriptions that the general population does not need.
- Fear in leftover coders is great for marketing
- Tech workers are paid well, which to VCs, CEOs, etc. makes it obvious where the value of this tech comes from: not from new use cases/apps that would greatly benefit society, but from effectively making people redundant and saving costs. New use cases/new markets are risky; not paying people is something any MBA/accounting type can understand.
I've heard some people say "it's like they are targeting SWEs." I say: yes, they probably are. I wouldn't be surprised if it takes SWE jobs while, for quite some time, most other people see it as a novelty that barely affects their lives.
I've made a similar argument in the past, but now I'm not so sure. It seems to me that developer demand was linked to large expansions in software demand: first from PCs, then the web, and finally smartphones.
What if software demand is largely saturated? It seems the big tech companies have struggled to come up with the next big tech product category, despite lots of talent and capital.
There doesn’t need to be a new category. Existing categories can just continue bloating in complexity.
Compare the early web to the complicated, JavaScript-laden single-page-application web we have now. You need way more people now. AI will make it even worse.
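As a rough, hypothetical sketch of that complexity jump (the component, the /api/users endpoint, and the user shape below are all made up for illustration): what the early web served as one line of static HTML now routinely arrives as a stateful component behind a fetch, a framework, and a build toolchain.

```tsx
// Hypothetical illustration only: the early-web version of this feature was
// a single line of static HTML, e.g. <p>Hello, Alice</p>, with no build step.
// The modern SPA equivalent typically looks more like this.
import React, { useEffect, useState } from "react";

// Assumed shape of the (made-up) /api/users/:id response.
type User = { name: string };

export function Greeting({ userId }: { userId: string }) {
  const [name, setName] = useState<string | null>(null);

  useEffect(() => {
    // Fetch the user's display name from a hypothetical API endpoint.
    fetch(`/api/users/${userId}`)
      .then((res) => res.json())
      .then((user: User) => setName(user.name))
      .catch(() => setName(null));
  }, [userId]);

  if (name === null) return <p>Loading…</p>;
  return <p>Hello, {name}</p>;
}
```

Multiply that by routing, state management, bundling, and deployment config, and the headcount difference follows.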
Consider that in the AI-driven future, there will be no more frameworks like React. Who is going to bother writing one? Instead, every company will have its own little custom framework built by an AI that works only for that company. Joining a new company means you bring generalist skills and learn how their software works from the ground up, and when you leave for another company, that knowledge is instantly useless.
Sounds exciting.
But there are also plenty of unexplored categories that we still can't access because the technology for them is insufficient. Household robots with AGI, for instance, may require instructions for specific services sold as "apps" that have to be designed and developed by companies.
The new capabilities of LLMs, and of large foundation models generally, expand the range of what a computer program can do. Naturally, we will need to build all of those things with code, which will be done by a combination of people with product ideas, engineers, and LLMs. There will then be specialization and competition on each new use case, e.g., who builds the best AI doctor.
This is exactly what will happen. We'll just up the complexity game to entirely new baselines. There will continue to be good money in software.
These models are tools to help engineers, not replacements. Models cannot, on their own, build novel things no matter how much the hype suggests otherwise. What they can do is remove a hell of a lot of accidental complexity.
> These models are tools to help engineers, not replacements. Models cannot, on their own, build novel things no matter how much the hype suggests otherwise.
But maybe models + managers/non-technical people can?