I like to joke with people that we programmers automated our jobs away decades ago: we just tell our fancy compilers what we want and they magically generate all the code for us!
I don't see LLMs as much different, really. Our jobs becoming easier just means there's more we can do now, and with more capabilities comes more demand. Not right away, of course.
What's different is that compilers do deterministic, repetitive work that's correct practically every time. AI takes the hard part, the ambiguity, and gets it sorta OK some of the time.
The hard part is not the ambiguous part, and it never was. You just need to talk with the stakeholders to sort it out. That's the requirements phase, and all it requires is good communication skills.
The hard part is building a consistent system that can evolve without costing too much. And the bigger the system, the harder it is to get this right. We have principles like modularity, cohesion, and information hiding to help us on that front, but no clear guideline on how to achieve it. That's the design phase.
Once you have the two above done, coding is often quite easy. And if you have a good programming ecosystem and people who know it, it can be done quite fast.
No, he's right: compilers pretty much do the same thing every time. It's very rare that there are bugs in compilers, and even if the generated assembly is different, it doesn't matter as long as it behaves the same.
100% agree with this thread, because it's the discussion of why no-code (and cloud/SaaS to a lesser degree) failed to deliver on its utopian promises.
Largely, because there were still upstream blockers that constrained throughput.
Typically imprecise business requirements (because someone hadn't thought sufficiently about the problem) or operating-at-scale issues (a poorly generalizing architecture).
> our jobs becoming easier just means there's more things we can do now and with more capabilities comes more demand
This is the repeatedly forgotten lesson from the computing / digitization revolution!
The reason they changed the world wasn't that they were more capable (versus their manual precursors) but that they were economically cheaper.
Consequently, they enabled an entire class of problems to be worked on that were previously uneconomical.
E.g. there's no company on the planet that wouldn't be interested in more real-time detail of its financial operations... but that wasn't worth enough to pay bodies to continually tabulate it.
>> The NoCode movement didn't eliminate developers; it created NoCode specialists and backend integrators. The cloud didn't eliminate system administrators; it transformed them into DevOps engineers at double the salary.
Similarly, the article feels around the issue here but misses two important takeaways:
1) Technologies that revolutionize the world decrease total cost to deliver preexisting value.
2) Salary ~= value, for as many positions as demand supports.
Whether there are more or fewer backend integrators, DevOps engineers, etc. post-transformation isn't foretold.
In recent history, those who upskill their productivity reap larger salaries, while others' positions disappear. I.e. the cloud engineer supporting millions of users, instead of the many bodies it used to take to deliver less efficiently.
It remains to be seen whether AI coding will stimulate more demand or simply concentrate the value of the same, or fewer, positions.
PS: If I were career plotting today, there's no way in hell I'd be aiming for anything that didn't have a customer-interactive component. Those business-solution-formulation skills are going to be a key differentiator whichever way it goes. The "locked in a closet" coder, no matter how good, is going to be a valuable addition for fewer and fewer positions.