Ask HN: Will AI put programmers out of work?
68 points by thisiswrongggg on Dec 11, 2022 | 110 comments
There's a lot of news regarding copilot and openAI and what have you. I'm not familiar with AI so I cannot really tell hype from substance here.

Should I worry? Do you think that some form of AI will be able to do the job of an average programmer any time soon? If yes what is your estimate? And how would you try to AI-proof your career?




Almost all the work of programmers is already automated in the form of everything you get via `pip`, `apt-get`, etc.

A very large part of what remains is the bit which cannot be automated: modelling real-world (business) processes in terms of the systems of automation which are available.

Programming is a modelling activity: it is about phrasing sequences of available actions to represent a process. If AI systems generate code, then programming becomes the sequencing of AI prompts -- which are then just a more natural-language-like version of programming.
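
To make that concrete, here's a toy sketch of "programming as prompt sequencing" in Python -- ask_model is a hypothetical stand-in for whatever code-generating model you use, not a real API:

    # Prompt sequencing: each step's output feeds the next prompt.
    def ask_model(prompt: str) -> str:
        # Hypothetical stand-in for a real code-generating model call.
        return f"-- model output for: {prompt.splitlines()[0]}"

    schema = ask_model("Design a SQL schema for a book-lending library.")
    query = ask_model(f"Given this schema:\n{schema}\nwrite the overdue-books query.")
    tests = ask_model(f"Write tests that check this query:\n{query}")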

Even in that world a significant amount of technical skill is required to ensure commands are sequenced correctly, the code is correct, etc.

For "AI" to replace this process it would not only have to be AGI, but also AGI fully embeded in the human social world of the processes were are modelling.


In another life, I designed workflows that could be picked up by offshore teams to execute the actual work.

My observation was that a lot of my colleagues had no appetite for reasoning about processes, much less thinking through various edge cases to make sure the work was done correctly and covered enough cases to be a useful workflow with low incidents.

Colour me skeptical, but I'm not convinced we will see an AGI that can solve business problems without killing the proverbial cat, at least not without lots of babysitting.

So is programming the business of copy-pasting from Stack Overflow, or is it the business of solving problems?


> So is programming the business of copy-pasting from Stack Overflow, or is it the business of solving problems?

Both, but what you’ve missed is that you’re still putting some devs out of work. And solving business problems is absolutely on the burner for AI right now, so give it a few years and it will solve that too.


I remain unconvinced. There's a huge swathe of legacy industries that the pandemic showed up as woefully stuck in the past, with "digitalisation" being a buzzword thrown around during 2020-2021 for older industries that thought working in the 21st century entailed emailing CSVs to each other and working on physical desktops.

I don't think these types of slow-moving companies will be the ones to leapfrog by jumping all in on AI. Moreover, I believe the TAM for software development is still growing strongly, AND I'm actually quite interested in what happens to the nature of work 10-20 years down the line, when most of today's kids will be able to sort of code and become hybrid workers (similar to how traders went from shouting on a floor to being quite numerate, or accountants went from physical books to Excel).

TL;DR: my bet is that AI might displace labour but won't lead to a net reduction in software labour demand for the next 10-20 years at least.


This comment aptly summarizes why I hate most "modern" programming jobs.

This is not programming. This is not what I had in mind when I signed up for Computer Science school.

> modelling real-world (business) processes in terms of the systems of automation which are available.

In other words, programming by analogy.


Programming is always a modelling of real-world processes. If you want to program directly for the CPU, it is that device which provides the operation primitives you sequence. Perhaps you mean you'd prefer to model more interesting processes.

But if you mean algorithm design, that isn't programming. Algorithms aren't programs, and the "operations" that they "sequence" are abstract. CS algorithm design is more like geometry.

Programming is an empirical discipline; it uses the "geometry" of CS to build applications.


Programming is almost never about modelling a real-world process - unless you're doing some kind of physics simulation.

Programming is mainly about data transformations.


All "real world" here means is that the program's semantics are correct given empirical conditions. Here, since "data" just means columns which are measures of the world, the program is modelling a real-world process.

Contrast that with algorithms whose semantics are abstract and are correct given essentially mathematical laws, rather than empirical conditions.


My immediate guess is that these kinds of tools will greatly enlarge the productivity gap between junior and senior developers.

Senior devs will be able (they already are) to generate code, at first boilerplate and gradually more complex code, and effectively work as planners and passive reviewers, in a similar way to how some companies just hire legions of juniors with some architects/seniors guiding and reviewing their work.

The problem with that flow, I think, is that it completely disrupts the junior to senior pipeline. Senior roles might be valued even more than today, but reaching that stage or simply entering the market might become much more difficult.

I feel my career is pretty safe, but I’m not sure about someone joining the industry 5 years from now.


That's an interesting take. ChatGPT seems to do very well at doing what it's told, but it needs very explicit requirements, and supervision to discover missing ones. It's a junior developer, basically.

That could put junior developers out of a job just like glass terminals put many data entry and server room operators out of jobs.


I agree with this. These AI tools have caused my (not a junior) productivity to skyrocket as a result of eliminating almost every instance of needing to check documentation/stackoverflow/etc. for easy questions in addition to providing most of my annoying boilerplate code. Since I am my own expert opinion on whether the output is correct/satisfactory, I can quickly breeze through verification of output & make corrections as necessary.


Yup. And if you’re a newcomer, the path to becoming able to provide that expert opinion involves years of performing precisely the tasks that AI is getting rid of.

My guess is that we’ll eventually see some kind of “higher degree bootcamps” that accept people with junior baseline skills and take them to the senior level, since learning on the job might become less feasible.


The Innovator's Dilemma book popularized the idea that it's frequently newcomers who are the beneficiaries of breakthrough changes for mostly structural reasons. Could it apply to individual people too in this case?


In my experience, non-programmers lack not only the ability to translate concepts to code, but to formulate those concepts in the first place.

Ask someone who isn't a software engineer to guide you in developing a simple program. You do all the actual coding, they just tell you what they want to happen. You will find that most people are unable to describe even high-level actions in the form of a coherent, procedural sequence.

Even the best code-generating AI won't solve this problem, since it cannot generate useful code if the operator cannot articulate what they actually need.


Agreed. The mark of a senior developer is their ability to decouple and synthesize concepts appropriately depending on the context.

You need to have highly developed critical thinking skills in order to be able to formulate appropriate solutions to problems. What a developer does isn't so different from what an entrepreneur does; they spend most of their time explaining their vision and goals; the difference is that the developer's vision and goals are much more granular and detailed.


No, but if AI is able to understand what a non-programmer is trying to accomplish, it will do it.


Then we'd automate the non-programmer first.


Of course not.

For the same reason why companies pay $2000 per day for an experienced consultant when an employee could theoretically build the same stuff at minimum wage. Sometimes, mistakes are expensive. And then you need people who can reason about why they are doing what they are doing. AI can maybe churn out CRUD better than other generators, but when you have any significant amount of money depending on the software working, nobody is going to use ChatGPT without a human code review.

But ChatGPT code is typically overly lengthy and complicated, just like what a beginner would produce. And that makes for expensive and slow code reviews. That's why in the end it's cheaper overall to skip all that and just hire a professional.


It's excellent for menial programming tasks like filling a database with dummy data or generating boilerplate code.
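
For instance, a minimal sketch (schema and values entirely made up) of the dummy-data chore it handles well:

    # Fill an in-memory SQLite table with random dummy users.
    import random
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)")
    rows = [(random.choice(["alice", "bob", "carol"]), random.randint(18, 80))
            for _ in range(100)]
    conn.executemany("INSERT INTO users (name, age) VALUES (?, ?)", rows)
    conn.commit()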


For now. 2-3 years ago it couldn’t even generate code; 2-3 years from now will be very different.


Historically, many approaches to AI have shown excellent early results but then plateaued. There's no guarantee that this approach will continue to have major improvements in results.


There’s also no guarantee it won’t. Not sure how this is supposed to give someone comfort.


The question wasn't just about ChatGPT though. If the SOTA improves as much in the next decade as it has in the last decade (far from a guarantee, of course), it will be hard for any tech company to justify having thousands of developers.


Yes.

But in my experience, a small and highly skilled team already outcompetes these armies right now.

Working as an "IT consultant" after a 2-week bootcamp has always been unsustainable. We will likely get rid of 90% of the current "software engineers" without any meaningful reduction in productivity.


> Should I worry? Do you think that some form of AI will be able to do the job of an average programmer any time soon?

Yes.

> If yes what is your estimate?

5 to 10 years will see a noticeable decrease in the number of programming jobs as we know them today.

> And how would you try to AI-proof your career?

Historically, tech changes end up leaving a small rump of niche practitioners, e.g. blacksmiths servicing riding stables and racing yards, while the majority either exit the industry or take up the skillsets for the new technology. To future-proof against AI, it's either finding a niche in a shrinking market or changing skillsets.

In terms of those skillsets:

Without Artificial General Intelligence there is going to be a need for someone to translate human requirements into something that the "machine" - however sophisticated - understands _and_ verify the results afterwards. That sounds very much like some form of Behaviour Driven Development.
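
As a hedged sketch of that shape of work: the human writes the Given/When/Then spec, which doubles as the verification step; apply_discount below is just a stub standing in for whatever the machine generates.

    # Human-written behavioural spec, runnable with pytest.
    def apply_discount(customer, total):
        # Stub standing in for machine-generated code under test.
        return total * 0.9 if customer["orders"] > 10 else total

    def test_loyal_customers_get_ten_percent_off():
        # Given a customer with more than 10 previous orders
        customer = {"orders": 12}
        # When they check out a 100.00 basket
        total = apply_discount(customer, 100.00)
        # Then they pay 90.00
        assert total == 90.00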

As to niches: there are a lot of complex, ill-defined but essential COBOL, Perl, PHP and Python systems floating around. Verifying new translations is going to be expensive. QED: specialists keeping those existing systems ticking along is likely to be a thing long enough to make a career from.


No, but tech consolidation probably will. Tech is following in the same footsteps as the American car industry in the 50s.

Just as vertical integration in the auto industry killed off the auto startup ecosystem, vertical integration in the tech industry will kill off tech startups. This isn't because there won't be demand for innovative new tech or that startups won't be able to innovate, but because control of core platforms will give the bigger players more leeway to crush and swallow smaller companies, as well as to siphon their profit margins.

Think of what AWS is doing to Elastic, on a larger scale.

Once the tech startup ecosystem dies (which could be soon; high interest rates will suck capital away from startups), the behemoths will probably stop innovating and slash headcounts.

Once that happens, I'm pretty sure that the stewards of capital and captains of industry will scapegoat AI and the Economist and Time magazine and the like will dutifully believe them and so will most of the people who read them.


As I see it, it generates fakes, or potentially correct solutions that require review and alteration. So it’s best employed by a professional who can judge the generated result in a wider context.

Not too different from interpreting lighthouse scores.

It can make anyone with the means to access it a bad coder, but the value of a professional has always been in picking the best solution from the possible ones.


> but the value of a professional has always been in picking the best solution from the possible ones.

When you actually decide on the best solution, you list pros and cons, weigh risk and cost, etc. These are all easily automated through least-cost optimization, not even using AI.
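
A toy sketch of that kind of least-cost selection -- every option, number and weight here is invented for illustration:

    # Score each candidate solution on weighted cost/risk, pick the cheapest.
    options = {
        "build in-house": {"cost": 8, "risk": 3},
        "buy SaaS": {"cost": 4, "risk": 5},
        "do nothing": {"cost": 1, "risk": 9},
    }
    weights = {"cost": 0.6, "risk": 0.4}  # assumed business priorities

    def score(name):
        return sum(weights[k] * v for k, v in options[name].items())

    print(min(options, key=score))

The hard part, of course, is getting honest numbers into that table in the first place.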


Having toyed around with ChatGPT for a week now, I already use it as a sort of "code assistant", much like how copilot works.

How good the results are depends on how good you are at constructing the queries. It's a bit like using Google - some people have only the most basic knowledge, while others can find pretty much anything, because they're really good at writing queries.

Now, imagine how it's going to be 10, 15, 20 years from now. The future models will probably cull a chunk of devs, while the good devs will be even more efficient.

But who knows, maybe this will actually help some of the mediocre programmers to focus their energy on other things? Like taking on other roles? Take me for example - I'm no rock-star programmer. I see programming as a means to an end; it's just a tool. I would much prefer to focus on the actual business logic and features of a product, maybe even long-term strategy. If most of the tedious coding was removed, that would make me happy.


We created ChatGPT, Copilot et al. But are they able to generate the code for a more capable version of themselves? Until we're at that point I think most jobs are safe; whether we get to that point is, in my view, still a philosophical question.


I used to reply with a joke answer to questions like these along the lines of

> wake me up when it starts to delete bad code instead of writing more of it

I too don't follow the AI space closely, but I have a hard time imagining it doing anything of the sort. It probably can't even remove a bug in a trivial program.

Correct me if I'm wrong of course, if there's a video that demonstrates something important/impressive related to this then I would certainly watch it.


You really should try it out.


Probably the opposite. Technology drives hunger for more complex systems


The current crop of generative AI all seems like it could produce useful tools for various practitioners, programmers included, but I don't yet see evidence for replacing jobs. It's hard to extrapolate, though.

That said, the best way to anything-proof a career this lucrative is to be frugal and set yourself up to retire early.


The current crop of AI didn’t exist 5 years ago. Generative AI art was still a pipe dream. Now it has a command prompt.


I don't know how the latest stuff from OpenAI will impact the career of average programmers.

But regarding your last question:

> how would you try to AI-proof your career?

Learn to program from first principles.


Yeah, it is likely that AI can learn to glue together code from Stack Overflow pretty soon. Harder programming that requires you to do a bit of thinking seems more future-proof; these models aren't good at thinking at all, they just pattern-match and glue in previous solutions.


We’re just the next new model away. AI researchers seem set on burning the world down to see what comes out of it, so this absolutely is the goal.


How do you learn to program from first principles?


In my humble opinion, Mike Acton's talks are the best introduction. Although he doesn't use the term "first principles", and they seem - on the surface - to be about game engines, they generalize to everything.

Data Oriented Design (2014):

https://www.youtube.com/watch?v=rX0ItVEVjHc

Solving the Right Problems (2017):

https://www.youtube.com/watch?v=4B00hV3wmMY


"waste" a lot of time reinventing wheels (the lower level the better, ultimate example would be writing your own OS) and other stuff that has no practical outcome for your immediate problems.


This is probably the best advice one could give ^


In 5-10 years, generally-available tools resembling ChatGPT will be able to do about 90% of what a programmer does, but the remaining 10% will remain AI-complete.

The AI-complete bit is stuff like reading a set of instructions and realising that one of the instructions doesn't make sense given the other ones, and pushing back. It's looking through an UPGRADING.md file to see how an API changed, realising that the thing you want isn't there, then looking at the actual source code of a library to understand an undocumented breaking change. It's understanding how to write a program such that it's easy to parallelise in the future.


Funnily enough, I was just asking ChatGPT what it thought about this, so here it goes:

>It is unlikely that AI will completely replace programmers. While AI and machine learning technology has advanced significantly in recent years, there are still many tasks that require human creativity and intuition, such as coming up with new ideas, solving complex problems, and making decisions that involve subjective judgement. Additionally, as AI and machine learning technology continues to advance, it is likely that new job opportunities will be created in fields related to these technologies, such as developing and managing AI systems.


Altering the prompt a bit (how will AI put programmers out of work?)

>It is unlikely that AI will completely put programmers out of work. However, it is possible that certain tasks currently performed by programmers, such as routine and repetitive tasks, could be automated by AI in the future. This could potentially lead to job displacement for some programmers. It is also possible that the increasing use of AI could lead to a shift in the types of skills and expertise that are in demand in the job market, potentially making some programming skills and knowledge less valuable. However, as AI and machine learning technology continue to evolve, it is likely that new job opportunities will be created in fields related to these technologies, such as developing and managing AI systems.


Which is about what Skynet would say :)


I’ve downvoted this because the question asks what the community thinks, not what ChatGPT says. This is creating plausible-sounding spam.


I just thought it was interesting to post because:

ChatGPT has been featured quite a lot on HN

I had literally just asked the same question to ChatGPT

and it's an interesting/relevant question to ask ChatGPT

I do get your point about spam though


I'm working at a startup's engineering & product team as an engineer. I think it's not fundamentally different from how most software engineers work.

We work on:

- The why: business/leadership, and to some extent, product determines key goals (KPIs, OKRs etc.) and markets we are addressing. Us engineers are only informed parties here.

- The what: product, and to some extent, engineering determines which features to build or improve to attain these goals in a user-friendly and sustainable way. We create projects and design sketches, assign time constraints to them.

- The how: engineering, and to some extent, product ships these features in a maintainable, scalable, performant and supportable way without disrupting existing user experience.

A very large part of this work involves creative processes and logical reasoning about business, UX, software engineering problems. (Of course, that's why I love it :))

Only a tiny amount of this work is "writing boilerplate code" or copying code from Stackoverflow - which ChatGPT is presumed to automate.

Of course, more senior engineers are faster at writing boilerplate, but their speed mostly comes from 1) knowing the existing codebase 2) using the right tools & abstractions.

Moreover, most of the risk involved in the process is not the time taken to write boilerplate i.e. working on something for too long - but rather working on the wrong thing, or doing an implementation that's too slow or hard to maintain (change, test, fix, extend, reason about).

All in all, when I think about software engineering from this perspective, I don't see AI automating it away anytime soon.

I could, though, imagine AI being your TDD coding companion. You write some unit or acceptance tests for a service module and the AI generates the code for it. You'd still need to thoroughly review and test the code, though. This would work well for basic CRUD/boilerplate modules, but not for anything involving business logic.
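
As a sketch of that loop (all names hypothetical): the human supplies the test, and the companion is expected to produce a create_user that makes it pass; the stub below stands in for that generated code.

    # Human-written acceptance test for a hypothetical user module.
    class DuplicateEmailError(Exception):
        pass

    def create_user(repo, email, name):
        # Stub standing in for the AI-generated implementation.
        if email in repo:
            raise DuplicateEmailError(email)
        repo[email] = name

    def test_rejects_duplicate_email():
        repo = {}  # in-memory stand-in for the real user store
        create_user(repo, email="a@example.com", name="Ada")
        try:
            create_user(repo, email="a@example.com", name="Bob")
            assert False, "expected DuplicateEmailError"
        except DuplicateEmailError:
            pass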

Nevertheless, this would still remain just a small part of a software engineer's work, in my opinion. What do you think?


There was a post on Twitter comparing this question to the effects of calculators on mathematicians, which makes some sense.


I believe that AI will reduce the amount of code software developers write, but writing code is the least important thing we do. The most important is taking nebulous, sometimes contradictory requirements and restating them in such a way that it can be executed by a machine. In my experience, most people don't have the wherewithal or desire to do this. So, I'm guessing coding might go away soon, but software developers will be around for a long while.


Some hunches:

- AI will hypercharge us, but we'll still need to know what we're doing.

- Polyglots will be more common (since nailing syntax won't take nearly as long)

- Increased importance of understanding framework conventions, architecture and 'the system'

- More time debugging crappy AI-written code!

- Demand for software development will increase (not decrease) since time to market and cost will go down, making many more projects viable.


Programmer would be the last job to go, but it's possible that the number of programmer jobs would start decreasing in the meantime.


> Programmer would be the last job to go

Nope. Experience has shown that tasks that require peak intellectual abilities in humans are actually very easy for computers to do. Computers were outperforming humans at calculation 80 years ago, and have been crushing grandmasters at chess since the 90s.

Meanwhile, controlling a robot to move efficiently, or reliably distinguishing everyday objects like cats and dogs, is still extremely challenging for current AIs, which require more data than any human could see in a thousand lifetimes to perform at a remotely adequate level.

It's "menial" jobs that will be the last to go. Because those rely on innate abilities grown over hundreds of millions of years of evolution. A task like programming that was only invented three generations ago is trivial by comparison.


I agree with you for the most part; however, the fact that humans can distinguish a cat from a chair after being shown one cat proves that humans receive more data than some AI with sensors.

Perhaps I’m misunderstanding your statement.


I believe the fact that humans can distinguish a cat from a chair after being shown just a single cat actually demonstrates that humans have much deeper insight into what a "concept" is than current AIs do.

If sensor data were the problem, computers could easily outperform humans since we have sensors that generate much more detailed data than the human senses: High-resolution cameras, multi-spectral and thermal imaging, x-rays, radar, etc.

The actual difference is that when shown a picture and told "this is a cat", humans already know what to look for. Even if a human has never seen a cat before, they will not, for example, examine the background of the photo, or the floor the cat is lying on. They will also instinctively derive analogies from similar animals they already know, and deduce lots of correct information about that "cat" without needing to be told explicitly.


> The actual difference is that when shown a picture and told "this is a cat", humans already know what to look for.

Yes, exactly. You’ll look for four legs, a tail, pointy ears and graceful movement. All of that is data you’ve registered through your senses (sensors), primarily one of them. You’re receiving more data, and processing it faster, than a program.

Humans are fundamentally pattern matchers, and we’re great at it. What you call concept I call pattern.


Have you read Ray Kurzweil's How to Create a Mind? His definition of our mind is something along those lines: pattern matching and prediction are all that we do, according to him. I didn't think we'd come this far when I read it a few years ago. AGI seems like a real possibility now, and I think I should consider developing some skills besides being able to write PHP.


> If sensor data were the problem, computers could easily outperform humans since we have sensors that generate much more detailed data than the human senses

I believe you vastly underestimate the amount of information the human brain processes continuously. Computers outperform humans by performing extremely narrow, focused computations at a high rate of speed. Despite years and years of research, I don't believe humans have even scratched the surface of understanding the human brain. In fact, I don't believe humans are capable of fully understanding it, since it was created by Someone so much greater than them.


I wonder if AI can do something like this: here is a new programming language that provides better error handling and concurrency. Anyway, it's the hot buzz now and everyone is using it. So do this for me: here is the source code of the language and its documentation.

Understand all that and create an app for me that does task X.


> Will AI put programmers out of work?

Yes. The questions are: "How soon?" and "Which programmers?"

> Should I worry?

Are you satisfied doing a job that could be done by a simple script?

> Do you think that some form of AI will be able to do the job of an average programmer any time soon?

This is already the case.

Additionally, consider that so many programmers are so bad at their jobs that the average is dismal, and also that computer programming isn't that hard.

> what is your estimate?

I have a comment here on HN predicting it would occur by approximately last year, so we are slightly behind schedule from my POV.

> How would you try to AI-proof your career?

I wouldn't. The very idea is counterproductive. I want to maximize my effectiveness, not lock-in a dead-end "career" of make-work that could be done by a machine.

I think what you're really asking is, "How can I earn a living once I can no longer do anything that is more economically valuable than what machines can do?"

Personally, my answer to that is to change the entire economy to a Star Trek mode or something like that. Or go live in a cave in the woods. Not actually great options, eh?

But yeah, broadly speaking, if you as a working programmer could write a script to replace yourself I feel you're obliged to do it, eh?

Not only is that your job: to write software to solve problems, to economically benefit the company you're working for, but it's also the way to avoid becoming a "zombie", working a pointless job just for an excuse to take home a paycheck.

Stay home and go on welfare (along with everybody else), we can call it Universal Basic Income so nobody's ashamed that machines are better at everything than they are.


I think in this context, there exists a much more significant uncanny valley than any we know of right now. The usual example from visualization completely pales in comparison. There is no "almost" solution for any non-trivial task. And on this opposite side of the valley only humans can act.


> Do you think that some form of AI will be able to do the job of an average programmer any time soon?

No. AI is not a greater threat than stack overflow.

The skill of a programmer is not generating 20 lines of code to solve a well-known problem. And even a 99% AI is next to useless, since finding errors is exceedingly hard.


I respectfully disagree with you, based on my personal use of Copilot. I find that having boilerplate code for making web service calls, database connections, etc. is an overall time saver. A factor of 10X? No. Obviously not a silver bullet, but a real time saver.

For a while I only had Copilot configured for VSCode and PyCharm, but I mostly use Emacs. The day that I took a little time to configure Emacs to use Copilot, it really hit me how useful Copilot is, once I always had it available. Also, the ability to tab through multiple code completions lets me choose what I think is a good completion in a few seconds, or hit ESCAPE to discard them. I have been programming since 1964 (my Dad gave me access to a timeshared BASIC when I was a kid), so I can read code very quickly after almost 60 years of work and play.

I also find Copilot works well with my own bottom up, and REPL based development style.

I understand that many developers don’t like Copilot, but, we are all free to choose our own tools.

Anyway, I appreciate your comment even though my experience is different.


I totally believe you that Copilot is useful; my contention is that it is useful in the same sense that Stack Overflow is useful.

No developer is competing with Stack Overflow. These tools are enabling developers to quickly generate code for certain problems, which works especially well for boilerplate. But this isn't actually the main skill of a programmer; it is just a mechanical necessity of writing software.

Much of what developers do is modifying existing code, fixing bugs, designing architecture and solving novel problems. If an AI could reliably do any of these tasks, jobs would be endangered, but certainly that is not the case yet, and AI would have to come quite a long way before it is.


> I find that having boilerplate code for making web service calls, database connections, etc. is an overall time saver.

Why? If you set up service calls correctly, you have a client you instantiate and call some method on. Your service call should be one line plus error handling (1-3 more lines). Databases are initialized once, again 2-3 lines of code.
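
Roughly that claim in code (Python; the endpoint and file names are hypothetical):

    # One-time DB init, then a service call: one line plus error handling.
    import sqlite3
    import requests

    db = sqlite3.connect("app.db")  # database initialized once at startup

    try:
        resp = requests.get("https://api.example.com/users/42", timeout=5)
        resp.raise_for_status()
        user = resp.json()
    except requests.RequestException:
        user = None  # log and fall back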

If you think any of this is a time saver, or is difficult, your job is at risk from Copilot. You are training your successor.


Well, I am in my 70s so job security is not really anything I think about.

Funny, but about 35 years ago I blocked my boss’s boss from buying a company that wrote an “AI coding tool”. I change my mind about things, and what I found ridiculously simple and un-useful 35 years ago is very different from Copilot.

I don’t think it takes a lot of imagination to fully conceptualize how much AI tools will change knowledge work.

I have been a paid AI practitioner since 1982 and I find it exciting how fast the field is now progressing. I worked as a consultant at Google in 2013 with their Knowledge Graph and that opened my eyes to the possibilities of so much structured and organized (they had a very good Ontology team) knowledge. Six years ago I managed a deep learning team at Capital One and mostly because of the strong team, I was surprised how effective deep learning is for practical problems.

One last example: in the 1980s I spent a fair amount of time trying to write code manually for anaphora resolution - a problem that BERT models now solve “simply.”


> Well, I am in my 70s so job security is not really anything I think about.

Many of us are not 70, so we have to be concerned about this. You seem gleeful; I’m concerned. Concerned for everyone about to lose their job, concerned about the pay drop those who don’t will see. All around I see this as a bad idea, but then people like you come in and push it forward.


It's hard to predict the future.

One aspect that's being slightly neglected is that programming isn't that special - tools like ChatGPT can potentially impact any kind of knowledge work. So it's not like there's a safe white-collar career track that you can easily move to.


I'd say it's more likely to impact legal industry or any kind of data based job. Programming is more on the compute side.


As a matter of fact, most software is produced in-house, tailor-made for companies. As AI rises, it will make this software less costly, so the demand for custom-made software should increase, since it makes money for the company. So if you are doing something very simple in terms of AI capabilities, you might be in trouble, but otherwise you will simply be asked to do more work in the same time. In any case, the profession will change drastically in the near future, so you should be alert and willing to learn.


Putting aside AGI types of actual "intelligence", this is evolution at work, more or less. As soon as someone finds a useful means of automating the "fitness function" testing that the output has desirable qualities (correctness or entertainment or whatever), everything outside the extremes will rapidly leave human hands. The extremes fall soon thereafter.

The problems are mostly sociocultural, and I am reminded of "Agent Smith" commenting on humans rejecting a Matrix simulation that didn't suck for them.


These tools are incredibly impressive, and they will probably only get better as time goes on.

However, most software being developed and maintained out there is not part of greenfield projects. It is new pieces being built upon the pieces that came before: years and decades of different styles, formats and conventions.

Sure, in a few years these tools will speed up how quickly some parts of the UI or some internal logic will be created.

But I am not seeing these tools take over the biggest part of my job: connecting vastly different systems together.


I define a programmer as a human who mediates the gateway between the world of people and the world of machines. We are like lawyers hired to talk to genies on behalf of people who have been offered wishes.

AI is a dangerous and deceptive tool that requires wise and subtle nudging to make it work.

Results I have gotten from OpenAI are all terrible, for instance. I have to learn “prompt engineering” now. It's just high-level programming.


I like to phrase this as "We are the managers of the automated workforce."


It's too early to know for sure, but for now I'm not that impressed. Yes, these can be great tools for helping developers, but replacing jobs? Maybe the most boring jobs possible could be automated with that, and I'm not even sure.

Who will review and maintain the code produced, apart from a developer, anyway?

If you think about it, it's much less revolutionary in terms of reducing coding jobs than WordPress.


When/if we reach that point of development, most other jobs will have been automated away by us, the programmers. Only fair that ours gets automated away too, then.

I think the true question is, whether we are going to treat human beings like trash, useless, now that their job has been automated away, or we as a society find a good way to deal with this and steer into a happier future.


I submitted an example[1] of a programmer trying to use ChatGPT to generate code for two different tasks: downloading and extracting text from a PDF file, and then parsing the PDF using regex etc.

It doesn't always work, and the reasons are interesting.

[1]: https://news.ycombinator.com/item?id=33940515


There are many other things you could choose to worry about:

    - nuclear conflict erasing 95% of humanity
    - deadly pandemic erasing 67% of humanity
    - devastating climate crisis erasing 80% of humanity
    - impact event erasing 100% of humanity

IMHO any of these events has a higher probability than some AI taking over our jobs.


As with all new technology, it will change the jobs we do and how we do them. AI seems likely to have a large impact on jobs heavily involving communicating with computers. However, is it likely that fewer people will be communicating with computers as a result? I would say no. Did the shovel make it less likely you would employ someone to dig?


> Did the shovel make it less likely you would employ someone to dig?

That's a limited analogy.

If you replace shovel with robotic excavator, it gets closer to what we have with current AI. It's not replacing jobs _yet_, but as soon as those excavators become fully automated, a single worker will be able to do the job of dozens, at a fraction of the time and cost.

And, yes, AI-powered excavators are a thing[1].

A closer analogy would be the trucking industry. Truckers are losing jobs _today_ as self-driving technology improves.

The same will eventually happen with software. Programmers will still be needed to drive the AI, but the productivity of one will be greatly increased, and human teams will be much smaller. Programmers won't be needed for simple tasks at first, until eventually only "prompt engineers" are left.

So I wouldn't say this is an existential risk yet, but our field will radically change within the next decade.

[1]: https://asirobots.com/mining/excavator/


My take is that it will decrease the "value" of developers by reducing the demand for them. If a company can be 10x more productive, it will probably need fewer developers. If you can do your job 10x faster, your team will need fewer members.

There's no shortage that is not followed by a glut...


> There's no shortage that is not followed by a glut...

There is a shortage of smart humans and has been for a long time. I've never heard of a glut of smart humans in history.


There’s no shortage of smart humans. Interesting; how did you come up with this? Perhaps you are smarter than everyone around you, and have trouble finding people on your level?


You’re not addressing how a company would be 10x more productive. You are just making things up.


I said, “If”. Please read carefully and don’t make things up. I am already writing code faster. And I am a business owner as well (90 people), so I know how team dynamics work.


No way. Just look at problems like translation. It's a nice tool, but you can't rely on it to do a proper translation; it's actually very bad. How long have they been working on the translation problem?

Same with self-driving. We get some nice tools to assist drivers, but replacement won't come in my lifetime.


No. Our work might morph into refactoring AI's spaghetti, but a lot of a programmer's job is comforting the person that hired them (whether in-house or contract), knowing they have someone on hand to solve whatever problem may arise. This simply won't be the case with AI.


I don't think that's the case, at least for the near future.

One, it gives only small blocks of code, and only for the most common use cases. Two, the code often contains a few errors (doesn't compile) or has a few security vulnerabilities.


Nope. But one must learn to use these new tools to improve. Competition will not come from AI, it will come from other humans who leverage AI to increase output.


If anything, there have to be people in the loop to act as checks and balances (write good prompts to generate code and fix the output) and take responsibility for downtime.


Even if AI automates 90% of my work, I could just move full-time into products / business. Programming is just dealing with systems and abstractions.


For now, things just get more high-level and less tedious.


Some programmers? Maybe. All programmers? Definitely not. In fact, AI has only created more job opportunities so far...


Most programming is just plumbing. AI will probably replace that.

That leaves algorithm research, and so we can then spend 100% of our time on hard CS.


This feels wrong in many different directions.

- Replacing plumbing work isn't trivial for AI at all.

- AI can also do algorithms research.

- Most developers aren't qualified or interested in doing algorithm research.

- There isn't that much demand for algorithms research, not enough to match the pool of developers anyway.


I think your first two points are in contradiction.

The last two points, I agree with.


It will replace the programmer in exactly the same way as Google and Stack Overflow did. Or like calculators replaced mathematicians.


I found solutions like ChatGPT, Copilot, and CodeWhisperer too basic to replace a programmer.

But they can help with around 20% of the coding part.


(i) AI does not exist.

(ii) You can already automate low-hanging fruit with Python + Excel, but it's not done (see the sketch below).

It will increase their demand, if anything.
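
As a hedged example of (ii) -- the file name and column layout here are invented -- the kind of chore that could be a few lines of Python but usually stays manual:

    # Sum an "hours" column from a spreadsheet using openpyxl.
    from openpyxl import load_workbook

    ws = load_workbook("timesheets.xlsx").active
    total = sum(row[2] for row in ws.iter_rows(min_row=2, values_only=True)
                if row[2] is not None)
    print(f"total hours: {total}")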


No, absolutely not. I think it will even boost the need for devops and frontend work.


If you ask most workers if their jobs are vulnerable to automation, they will say no


It will put everyone else producing ppts and excel sheets out of work first.


Kind of ironic that we are putting ourselves out of work.


If AI technology continues to advance and is able to take over most jobs, it will fundamentally change the way society functions.

In a capitalist economy, people are expected to work hard in order to produce goods and services, which drives economic growth. But if AI can do all of the work, there may be no need for people to work at all.

This could lead to a new economic model, one where prosperity is not tied to the labor of individuals. Instead, the focus could shift to ensuring that everyone has access to the resources and opportunities they need to thrive, regardless of whether they are working or not.


I couldn't agree more. I always thought that when AGI is achieved, it will change the fundamental paradigms of the economy, whether capitalist or communist. For example, Marx held that value comes from human labor, according to the labor theory of value (modifying nature, creating goods and services through intelligence in a conscious and deliberate manner). If an AGI replaces the human and can create labor value, all of these theories will become false, paving the way for a new era, hopefully one based on solidarity and the equality of humanity, with a new abundance of resources and better distribution.


Did compilers put programmers out of work?


No - neither compilers nor interpreters did.


No - as long as it's not open source.


The glue work will remain. Maybe we will become "AI whisperers".



