
I think you'd have to be stupid to expect productivity gains from your software developers using LLMs

edit: a lot of articles like this have been popping up recently to say "LLMs aren't as good as we hyped them up to be, but they still increase developer productivity by 10-15%".

I think that is a big lie.

I do not think LLMs have been shown to increase developer productivity in any capacity.

Frankly, I think LLMs drastically degrade developer performance.

LLMs make people stupider.


The thing you're likely missing is that you've forgotten what programming is at a high level.

A program is a series of instructions that tell a computer how to perform a task. The specifics of the language aren't as important as the ability to use it to get the machine to perform the tasks you intend.

We can now use English as that language, which allows more people than ever to program. English isn't as expressive as Python wielded by an expert, yet. It will be. This is bad for people who used to leverage the difficulty of the task to their own advantage, but good for everyone else.

Also, keep in mind that today's LLMs are the worst they'll ever be. They will continue to improve, and you will stagnate if you don't learn to use the new tools effectively.


> you will stagnate if you don't learn to use the new tools effectively

I've been going the other way, learning the old tools, the old algorithms. Specifically, teaching myself graphics and mastering the C language. Tons of new grads know how to use Unity; how many know how to throw triangles directly onto the GPU at the theoretical limit of performance? Not many!
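
To make that concrete, here's roughly what "throwing triangles directly onto the GPU" looks like in C. This is only a minimal sketch under a couple of assumptions of my own: it uses GLFW for the window and the glad loader for the GL function pointers, and it skips error handling entirely.

    /* Minimal sketch: put one triangle into a GPU buffer and draw it with
     * modern OpenGL. Assumes GLFW and glad are installed; error handling
     * is omitted for brevity. */
    #include <glad/glad.h>
    #include <GLFW/glfw3.h>

    static const char *vs_src =
        "#version 330 core\n"
        "layout (location = 0) in vec2 pos;\n"
        "void main() { gl_Position = vec4(pos, 0.0, 1.0); }\n";

    static const char *fs_src =
        "#version 330 core\n"
        "out vec4 color;\n"
        "void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }\n";

    int main(void) {
        glfwInit();
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        GLFWwindow *win = glfwCreateWindow(640, 480, "triangle", NULL, NULL);
        glfwMakeContextCurrent(win);
        gladLoadGLLoader((GLADloadproc)glfwGetProcAddress);

        /* Compile and link the two tiny shaders. */
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vs_src, NULL);
        glCompileShader(vs);
        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fs, 1, &fs_src, NULL);
        glCompileShader(fs);
        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glLinkProgram(prog);

        /* Three 2D vertices, uploaded once into GPU memory. */
        float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
        GLuint vao, vbo;
        glGenVertexArrays(1, &vao);
        glGenBuffers(1, &vbo);
        glBindVertexArray(vao);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), (void *)0);
        glEnableVertexAttribArray(0);

        while (!glfwWindowShouldClose(win)) {
            glClear(GL_COLOR_BUFFER_BIT);
            glUseProgram(prog);
            glBindVertexArray(vao);
            glDrawArrays(GL_TRIANGLES, 0, 3);  /* one draw call, three vertices */
            glfwSwapBuffers(win);
            glfwPollEvents();
        }
        glfwTerminate();
        return 0;
    }

On Linux this builds with something like "gcc triangle.c glad.c -lglfw -ldl" (assuming a glad-generated glad.c); the exact flags vary by platform. The snippet itself isn't the point, knowing what's happening under the Unity-style abstractions is.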


I did some of that when I was younger. I started with assembly and C, even though everyone told me to skip it and start with at least C++ or something further up the abstraction ladder. Ignoring them and gaining that knowledge has proven invaluable over the years.

Understanding a "deeper" abstraction layer is almost always to your advantage, even if you seldom use it in your career. It just gives you a glimpse behind the curtain.

That said, you have to also learn the new tools unless you tend to be a one-man band. You'll find that employers don't want esoteric knowledge or all-knowing wizards who can see the matrix. Mostly, they just want a team member who can cooperate with other folks to get things done in whatever tool they can find enough skilled folks to use.


I think this guy is smarter than every LLM user in the thread

> you will stagnate if you don't learn to use the new tools effectively.

This is the first technology in my career where the promoters feel the need to threaten everyone who expresses any sort of criticism, skepticism, or experience to the contrary.

It is very odd. I do not care for it.


"you will stagnate if you don't learn to use the new tools effectively."

This hostile marketing scheme is the reason for my hostile opposition to LLMs and LLM idiots.

LLMs do not make you smarter or a more effective developer.

You are a sucker if you buy into the hype.


Are you arguing that you can work in technology without learning new things?

Have you considered a career in plumbing? Their technology moves at a much slower rate and does not require you to learn new things.


No... nobody has ever argued that.

There's a debate to be had about what any given new technology is good for and how to use it, because they all market themselves as the best thing since sliced bread. Fine. I use Sonnet all the time as a research tool; it's kind of great. I've also tried lots of stuff that doesn't work.

But the attitude towards everyone who isn't an AI MAXIMALIST does not persuade anyone or contribute to this debate in any useful way.

Anyway if I get kicked out of the industry for being a heretic I think I'll go open an Italian restaurant. That could be fun.


> There's a debate to be had about what any given new technology is good for and how to use it

Fair enough. It's reasonable to debate it, and I'll agree that it's almost certainly overhyped at the moment.

That said, folks like the GP who say that "LLMs do not make you smarter or a more effective developer" are just plain wrong. They've either never used a decent one, or they've never learned to use one effectively and are blaming the tool instead of learning.

I know people with ZERO programming experience who have produced working code that they use every day. They literally went from 0% effective to 100% effective. Arguing that it didn't happen for them (and the thousands of others just like them) is just factually incorrect. It's not even debatable to anyone who is being honest with themselves.

It's fair to say that if you're already a senior dev it doesn't make you super-dev™, but I doubt anyone is claiming that. For "real devs" they're claiming relatively modest improvements, and those are very real.

> Anyway if I get kicked out of the industry for being a heretic I think I'll go open an Italian restaurant.

I doubt anyone will kick you out for having a differing opinion. They'll more likely kick you out for being less productive than the folks who learned to use the new tools effectively.

Either way, the world can always use another Italian restaurant, or another plumber. :)


I'm arguing that LLMs are overhyped garbage that frankly seems like a dead end for someone pursuing a career in software development

How old is your career, then? I've been hearing some variation on "evolve or die" for about 30 years now, and it's been true every time... Except for COBOL. Some of those guys are still doing the same thing they were back then. Literally everything else has changed, and the people who didn't keep up are gone.

I've seen interns (with an academic background) build advanced UIs for projects despite not having a background in coding. This would not have been possible without LLMs.

Can they do it without LLMs?

If they can't, did they really do it in the first place?

Are they actually literate in the programming languages they're using?


I don't write any front-end code at work anymore. I use Figma MCP and Cursor, and it can implement the design nearly perfectly on the first try.

To be fair, this is presumably because a skilled human spends time properly making the design for you in Figma.

It doesn't really matter how "skilled" the designer is. Figma's MCP already provides HTML and CSS that's basically ready, and all the AI needs to do is translate that into React or whatever. Or if you mean that the AI wouldn't be able to make a proper interface without the human, that's also not true. The only reason I use Figma MCP is that my company uses Figma and has a dedicated Figma person. My opinion is that that's just a bottleneck, and it would be easier to prompt the AI to make whatever interface is needed.

> The only reason I use Figma MCP is that my company uses Figma and has a dedicated Figma person. My opinion is that that's just a bottleneck, and it would be easier to prompt the AI to make whatever interface is needed.

Here's where our opinions differ: I think replacing that Figma person with AI prompts will negatively affect the product in a way that is noticeable to the end user and affects their experience.

It does, of course, depend on what kind of product you're making, but I'd say this holds most of the time.


I'm not even arguing that you should replace the Figma person with AI. I am arguing that even without AI, having Figma persons is a bottleneck. It is much faster to just use some kind of component library, like shadcn, and let the developer figure it out. And with AI that would be even faster, as the developer wouldn't have to write any code, just check the AI output and prompt to make changes if needed. Unless of course you need one of those really fancy landing pages, but even then, you would likely need a specialized developer, and not a Figma person.

If you work in B2B SaaS, sure, I guess. That's a lot of HN by virtue of being a lot of SF VC, but only a tiny part of all tech. Elsewhere shadcn isn't a realistic option.

I'm curious, where is "elsewhere"?

...literally everything that isn't a recent, up-and-coming B2B SaaS. So >90% of the software written today.

To give but one example, effectively all of the >$300B mobile app market. Or all enterprise software that can't run on Electron. Or any company that cares about image/branding across their products, which is every single company past a certain size (and don't come at me with "but hot AI startup uses shadcn and is valued at X trillion").


I will come at you and say that >90% of the software written today is garbage, and >90% of companies are run by incompetent people. My hypothesis is that that's the reason we "need" Figma persons and project managers.

My mom could do that with Dreamweaver 25 years ago.

This is such a tired argument.

Could people write scientific code without python? If they can't, did they really do it in the first place?

Could people write code without use-after-free bugs without using a GC'd language? If they can't, did they really do it in the first place?

Could people make a website without WYSIWYG editor? If they can't, did they really make a website?


I think LLMs have aggressively facilitated the rise of illiteracy in people attending software development university programs.

I think graduates of these programs are far, far worse software developers than they were in the recent past.

edit: I think you mean "irrelevant", not "irreverent". That being said, my response is an expansion of the point made in my comment that you replied to.


> I think LLMs have aggressively facilitated the rise of illiteracy in people attending software development university programs.

But this subthread is about interns who did not study CS, and are able to create advanced UIs using LLMs in the short time they had left to finish their project.


I'll start by saying that this seems irreverent to my previous comment.

That being said, I half agree, but I think we see things differently. Based on what I've seen, the "illiterate" are those who would have otherwise dropped out or done a poor job previously. Now, instead of exiting the field or slowly shipping code they didn't understand (because that has always been a thing), they are shovelling more slop.

That's a problem, but it's at most gotten worse rather than come out of thin air.

But there are still competent software engineers, and I have seen with my own eyes how AI usage makes them more productive.

Similarly, some of those "illiterate" are those who now have the ability to make small apps for themselves to solve a problem they would not be able to before, and I argue that's a good thing.

Ultimately, people care about the solution to their problems, not the code. If (following the original anecdote) someone with an LLM can build a UI for their project, I frankly don't think it matters whether they understood the code. The UI is there, it works, and they can get on with the thing that is actually important: using the UI for their bigger goal.


Does it matter?

Do you think IDEs, type checking, refactoring tools, and autocomplete make developers stupider too? Serious question.

not at all, I think these are valuable tools

would you agree that LLMs make developers stupider?

edit: answer my question


So what about Cursor's tab autocomplete? Seems like there is a spectrum of tooling from raw assembly all the way to vibe coding, and I'm trying to see where you draw the line. Is it "if it uses AI, it's bad," or are you more against the "hey, build me something and I'm not even gonna check the results" approach?

... they are not going to give you a satisfying answer to your totally reasonable line of inquiry.

Looking at the brief history of their account, I don't think anything they are saying or asking is in remotely good faith.


Can you describe anything about the difference between using ChatGPT with GPT-4o to write code in January 2025 and using Opus 4.5 with Claude Code in December 2025?

[flagged]


This isn't the smart, snappy reply that you believe it to be.

As a comment reader this exchange with Simon translates directly to "no, but you have forced me to try and misdirect because I can't reply in good faith to an expert who has forgotten more about LLMs than I'll ever know".


what reads to you as "an expert who has forgotten more about LLMs than I'll ever know" reads to me as a crack cocaine smoker.

just write the code


The only person coming off as unhinged and out of touch with reality here is you.

Yeah, I don't think you've used any of the technology you are criticizing here.

well you don't have to "think" anything lol, if you've never tried it yourself that's step one. step two is to not assume everyone out there is a shill or lying, because that's awfully convenient. also, it's not black or white.

developers can exist in a small team, solo, or in a large enterprise, all with their own mandates and cultures, so just saying LLMs increase/decrease productivity is reductive.

have a feeling i'm being trolled tho.


I'm not trolling, this is my sincere opinion

I think LLM addicts are particularly susceptible to flattery.


You're alienating a lot of people just by calling them "LLM addicts," and I suspect you're not arguing in good faith given your language.

There are a lot of sad people who have developed parasocial relationships with ChatGPT, but that's entirely a separate issue from whether agents are a good tool for software engineering.


"Real" woodworkers often have similar reactions when they see people incorporating CNC machines into their creative process as you appear to have when it comes to your engagement style on this topic.

They don't emerge looking credible, either.


equating a CNC router with Claude Code or ChatGPT is an egregious false equivalence.


