Hacker News

Working in big tech it's pretty wild to see how integral AI has become to our work internally, vs the public perception of it. People are NOT prepared.


1. Hyperbolic statement about LLM capabilities with no concrete examples

2. Wild claim that the companies that sell LLMs are actually downplaying their capabilities instead of hyping them


Personal experience here at a FAANG: there has been a considerable increase in: 1. Teams exploring how to leverage LLMs for coding. 2. Teams/orgs that have already standardized some of the processes for working with LLMs (MCP servers, standardized agents.md files, etc.). 3. Teams actively using LLMs for coding new features, documenting code, increasing test coverage, doing code reviews, etc.

Again, personal experience, but on my team ~40-50% of the PRs are generated by Codex.
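For readers unfamiliar with the convention mentioned above: agents.md (often AGENTS.md) is a plain-markdown file checked into a repo that tells coding agents how to build, test, and style changes. A minimal sketch of what a standardized one might contain (contents are illustrative, not from any specific company):

```markdown
# AGENTS.md

## Build & test
- Install dependencies: `npm install`
- Run the full test suite before every PR: `npm test`

## Conventions
- TypeScript strict mode; avoid `any`.
- One feature per PR; update docs alongside code.

## Restrictions
- Agents must not modify files under `infra/`.
```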


“Teams exploring how to leverage [AI] for [anything]” has been true for about a decade now in every large multinational company, at every level. It’s not new at all. AI has been the driving buzzword for a while, even well before ChatGPT. I’ve encountered many people who just wanted the stamp of using AI, no matter how, because my team was one of the main entry points for achieving this at that specific company. But before ChatGPT and co., you had to work for it a lot, so most of them failed miserably, or immediately backtracked when they realized this.


I'm sure the MBA folks love stats like that; there's plenty of them that have infested big tech. I mean, Pichai is an MBA and McKinsey alumnus.

Ready for the impending layoff, fella?


There are places that offer Copilot to any team that wants it, and then behind the scenes they have informed managers that if a team (1+ people) adopts it, they will have to shed 10%+ of human capacity (lose a person, move a person, fire a person) in the coming quarters.


Yup, he's totally lying. Not happening. Just carry on.


Agreed, but why are they lying?


That was sarcasm. He's not lying.


I didn't read any sarcasm in what he said.


Sorry, I meant that my comment was sarcastic. The original comment was sincere, I'm quite certain. And they are right: there are some companies that really are getting a lot of value out of LLMs already. I'd guess that the more folks at a company actually understand how LLMs work, the more that company can do. There just isn't a neat abstraction layer to be had, so folks who don't have a detailed mental model get caught up applying them poorly or to the wrong things.


I've heard of one study that said AI slows developers down, even when they think it's helping.

https://www.infoworld.com/article/4061078/the-productivity-p...


AI may slow coding a bit but dramatically reduces cognitive load.

The real value of AI isn't in helping with coding. It's in having a human-like intelligence to automate processes. I can't get into details, but my team is doing things that I couldn't have dreamed of three years ago.


It does dramatically reduce cognitive load. I think that part is understated and lost to the headline of how it writes two thousand lines of code in 30 seconds.


The slowdown is real sometimes, but other times AI saves hours. We're all still in the learning stage of how best to use these new tools, and their capabilities are growing constantly.


Not prepared for what? Seems like the rest of the world is desperate to be shown the way to unlock something of value?


I think at this point it's software devs looking for the value unlock.

Non-software devs are actually making functional programs for themselves for the first time ever. The value is crazy.


It’s not the first time ever. People did the same with Access and HyperCard in the 90s.


Sure, but in the real world, do you think businesses are going to deploy piles of code generated this way into production? No, non-technical people will continue to whip up MS PowerApps. AI-generated code has no value to many businesses.


The value of AI is not in generating code. That's just a "nice-to-have."

The value of AI is in having a scalable, human-like decision maker that you can plug into anything, anywhere. This has unlocked countless use cases for my team that we could scarcely have imagined a few years ago.
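The "pluggable decision maker" pattern the comment describes can be sketched in a few lines: ask a model for a constrained decision, then validate the answer in ordinary code before acting on it. Everything here is illustrative; `llm_complete` is a hypothetical stand-in for any chat-completion API (OpenAI, Anthropic, a local model), stubbed out so the sketch runs on its own.

```python
# Sketch of using an LLM as a pluggable decision step in a pipeline.
# llm_complete is a hypothetical stand-in for a real chat-completion call.

VALID_ROUTES = {"refund", "escalate", "auto_close"}

def llm_complete(prompt: str) -> str:
    # Stubbed for illustration; a real system would call a model API here.
    return "escalate"

def route_ticket(ticket_text: str) -> str:
    """Ask the model for a decision, but constrain and validate the output."""
    prompt = (
        f"Classify this support ticket as one of: {sorted(VALID_ROUTES)}.\n"
        f"Ticket: {ticket_text}\n"
        "Answer with exactly one word."
    )
    answer = llm_complete(prompt).strip().lower()
    # Never trust free-form model output: fall back to a safe default.
    return answer if answer in VALID_ROUTES else "escalate"
```

The key design choice is that the model never acts directly; deterministic code owns the allowed action set and the fallback, which is one common way the "unpredictability" objection raised downthread gets managed in practice.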


"Human-like decision maker" except it's just as if not more unpredictable than a human, has no understanding of what it's actually outputting or the impact of it, and it isn't concerned with losing their job or facing legal repercussions for their actions.


There are plenty of ways to manage those drawbacks, and a mind-boggling number of use cases where it's "good enough" already.

But it's not my job to convince you, my lived experience working with the tech is enough to convince me, and that's all I care about, to be honest. Everyone else will get there sooner or later.


You don't need production level code to make your life easier.

You're missing the forest for the trees. Most people can't even make a block diagram, but they can explain what they have and what they want to do with it.


I think the market reveals itself. Perhaps you're right, but it's been years, and where's the value? No offense, and it might seem cool to build an app, but that's been possible for decades.


Not everyone has given in to the crutch.


That's why I still use an abacus.


Abacus skills are safely obsolete; the skills of general thinking and creativity must not become that. This analogy couldn't be more specious.

Meme thinking like this, repeating something you've heard as reflex without regard to whether it fits a situation, is the exact kind of unoriginality we can't allow to become the default mode of thinking.


I am not the one being unoriginal here. You are assuming that AI will obsolete critical thinking, so there's no point in developing with it.

However, in your moral crusade against using AI, you are missing the big picture. No one is making you code with AI. But there are many things that you can only build if you use AI as a component.

The ability to plug a human-like decision maker into anything, anywhere, massively expands what we can build. There are applications and use cases that you cannot even conceptualize without the ability to plug AI in. This does not impact critical thinking whatsoever.

Be original. Put your engineer hat on and think about what this new tool lets you build that you couldn't before.


I find that AI can make me more creative. I don't have to waste mental energy on boilerplate or straightforward stuff, like typing through some event-processing loop. I can extract out and reuse components more easily and focus on big-picture design. Or build more bespoke admin tools that I wouldn't have wanted to waste time hand-writing JS for before.



