In articles like this I’m always surprised that the author didn’t take five minutes to consider that maybe they’re living in a bubble and there are lots of people who are actually excited about AI.
I read more sceptical takes about AI on Hacker News than anywhere else (since I stopped following Gary Marcus, at least). My hunch is that some people here might feel professionally threatened by it, so they want to diminish it. This is less of an issue with some of the 'normies' that I know. For them AI is not professionally threatening, but they use it to translate stuff, ideate about cupcake recipes, use it as a psychologist (please don't shoot the messenger), or to help them plan lessons for teaching kids.
> My hunch is that some people here might feel professionally threatened about it so they want to diminish it.
I don't think it's this. At least, I don't see a lot of that. What I do see a lot of is people realizing that AI is massively overhyped, and a lot of companies are capitalizing on that.
Until/unless it moves on from the hype cycle, it's hard to take it that seriously.
Speaking as a software engineer, I'm not at all threatened by it. I like Copilot as fancy autocomplete when I'm bashing out code, but that's the easy part of my job. The hard part is understanding problems and deciding what to build, and LLMs can't do that and will never be able to do that.
What I am annoyed by is having to tell users and management "no, LLMs can't do that" over and over and over and over and over. There's so much overhype and just flat out lying about capabilities and people buy into it and want to give decision making power to the statistics model that's only right by accident. Which: No.
It's a fun toy to play with and it has some limited uses, but fundamentally it's basically another blockchain: a solution in search of a problem. The set of real world problems where you want a lot of human-like writing but don't need it to be accurate is basically just "autocomplete" and "spam".
I disagree with the characterisation of AI as "another blockchain: a solution in search of a problem". The two industries have opposite problems: crypto people are struggling to create demand, AI people are struggling to keep up with demand.
> crypto people are struggling to create demand, AI people are struggling to keep up with demand.
Ten years ago crypto was what everyone wanted, too: you can see how bitcoin soared, and crypto scams were everywhere and made many billions.
And no, AI is not struggling to keep up with user demand; it is struggling to keep up with free, not paid, demand. So what you mean is that AI is struggling to keep up with investor demand: more people want to invest in AI than there is compute to buy. But the same was true of bitcoin; bitcoin mining massively raised GPU prices because of how much investors put into it.
But investor-driven demand can disappear really quickly; that is what people mean by an AI bubble.
Google has built a multibillion dollar business on top of "free" users. ChatGPT has more than 400 million weekly active users and this is obviously going to grow. You are overlooking how easily that "free" demand will be monetized as soon as they slap ads on the interface.
"Obviously" is doing a lot of heavy lifting there. They have a lot of competition and no killer use case and no IP rights. From the consumer point of view, they're fungible.
That's not even considering the probability that demand could slow as people lose interest.
Yep. Ten years ago HN was hyping blockchain as the future and there were a million blockchain startups. Just look at the list of startups funded by YC around then lol
HN is a highly technical audience, and AI is showing the most benefit on highly technical tasks, so it seems logical to me that HN would be more excited than "the real world". (What is the real world, btw? Do people on HN not exist in the real world?)
My sister, who is a pretty technical kinesiology PhD student, does not know how to input Alt+F4 and insists that it is esoteric knowledge. There's a litmus test for how out of touch HN users may be with the way normal people use computers.
I don't think so. None of them seemed to be a tech enthusiast before, if you don't consider using social media a trait of a tech enthusiast.
I think people who are interested in how things work and AI users are two entirely different cohorts. ChatGPT from user's perspective is more like a search engine or an autotranslator rather than some sophisticated technical gizmo.
Is that true? I have three kids now, two of them in high school, who are perhaps more AI-savvy than me (in both good and bad ways). I think the article, and my limited professional view, is informed by software development, IT infrastructure, and enterprise technology. I think a lot of younger people are happily plugging AI into their lives.
ChatGPT is the number one free iPhone app on the US App Store, and I'm pretty sure it has been the number one app for a long time. I googled to see if I could find an App Store ranking chart over time... this one[0] shows that it has been in the top 2 on the US iPhone App Store every month for the past year, and it has been number one for 10 of the past 12 months. I also checked, and ChatGPT is still the number one app on the Google Play Store too.
Unless both the App Store and Google Play Store rankings are somehow determined primarily by HN users, then it seems like AI isn't only a thing on HN.
Close to 100% of HN users in AI threads have used ChatGPT. What do you think the percentage is in the general population, is it more than that, or less than that?
I was at a get-together last weekend with mostly non-tech friends and the subject was brought up briefly. Seemed to be a fair amount of excitement and use by everyone in the conversation, minus one guy who thought it was the "devil"...only slightly joking.
If I were to write a "hard" sci-fi story of how the devil might take over the world in the near future, AI would be my top choice, and it would definitely fit with The Usual Suspects' "The greatest trick the devil ever pulled was convincing the world he didn't exist".
It's because we're excited about the possibilities. It's potentially revolutionary tech from a product perspective. Some claim that it increases their speed of development by a not insignificant amount.
The average consumer does not appear to be particularly excited about products w/ AI features though. A big example that comes to mind is Apple Intelligence. It's not like the second coming of the iPhone, which it should be, given the insane amount of investment capital and press in the tech sphere.
I don't know, I know many people (including non-technical people) that use a lot of the chatbots. (And I even heard some parents at the playground talk about it to each other. Parents that I didn't know, it was a random public playground.)
Not sure if they are 'excited', but they are definitely using it.
Exactly. People are actually paying to use ChatGPT: 10 million subscribers, plus 1 million on Business and Enterprise plans, and it's the number one productivity download on the App Store. My ten-year-old niece uses ChatGPT to do all sorts of things, and she told me her whole class uses it. I have heard a few real-life conversations about ChatGPT being what Google (as in the search engine) should have been all along.
And these people don't know a thing about C, Java, CPUs or RAM. They are not tech people.
Over the decades, the moment I hear non-tech people in public talking about a piece of tech with some enthusiasm is the moment that tech has reached escape velocity and will go mainstream. Somewhat strangely, I only started using ChatGPT more because non-tech people were starting to use it, and they use it much more than I do.
Just like people laughed at the smartphone in the early iPhone era. Lots of tech people, including I believe MKBHD, only got their first smartphone with the iPhone 4, and most consumers came even later, while I had watched the iPhone introduction keynote a dozen times before the thing even shipped. The adoption curve of any tech will never be linear.
The example pointed out at the start of the article is somewhat bizarre. AWS is only pausing colo. Apple Intelligence's failings have more to do with Apple themselves than with AI. Intel (or PCs) not selling AI-enhanced chips is because consumers don't buy AI hardware, they buy AI functions, and so far nothing in Windows seems to be AI-enhanced in a way that specifically requires Intel's AI CPUs.
And I am not even pro-AI or an AI optimist, yet I can see all that.
Amazing. Presented with a study that found that the general public isn't excited about AI-enshittifying everything around them, you ask "What if people who aren't excited about AI are in a bubble?"
I think you should read the study before jumping to this conclusion. The fact that people are not excited about AI being incorporated in a shitty way into some apps does not imply that people are not excited about AI.
Having a smart-but-no-smartass intern by my side, that is always eager to help, that is at the very least superficially knowledgeable about most things, that autonomously gets better at absolutely everything non-physical (yet), that is loyal yet immediately replaceable should I get bored with it or should a better intern pop up overnight (literally happened yesterday with qwen3), that never tires or gets annoyed at me.... well that's pretty exciting.
What a weird and dehumanizing thing to say. Slaves are definitely people; a machine definitely isn't.
Moreover, slaves aren't productive over the medium/long term. You don't get useful and lengthy performance from raw coercion, the same way you don't get truthful and actionable intelligence from torturing prisoners.
So no, an LLM has very little in common with a slave.
Google Gemini for what used to be Amazon Alexa tasks, ChatGPT image filters (that Studio Ghibli one?), YouTube AI channels putting out content like "if Danny DeVito was the Little Mermaid" or "if Linkin Park sang song X", etc.
The novelty wears off quick, though. I think it's really technically interesting and fascinating that these models can produce what they produce. I love learning about them. But as far as the videos themselves and such, even with that interest, I know that whoever produced the video didn't do much. That doesn't make me feel really engaged with it. It feels discardable. I was watching a claymation dark fantasy piece someone put together recently, shot on their iPhone, which required a lot of work. They are an amateur, but did a good job. And I felt a lot more engaged with it in its jankiness than any of the AI produced videos I've seen. I still think about it from time to time. All the AI entertainment is momentarily interesting at best but I don't think about it much afterward.
No one is going to get a billion dollar investment due to that. It's why all the corporate speak and marketing is harping on about productivity, robot takeover and deus ex machina in corporate language.
Normal people will use it for creative writing aid, scrapbooking and other extremely unprofitable non-technical stuff.
I mean those were the items my non technical friends shared with me so far...
Personally, I was able to get mockups of interior designs based on a photo of a building under construction using ChatGPT; this would've cost me both time and money if I'd gone to a real designer.
Gemini has been summarising meetings I could not attend (scheduling conflicts, etc.) and saving me from hours of watching meeting recordings.
I was really skeptical because of the horrible results I had two years ago with Copilot and ChatGPT, but things have improved drastically, to the point that it's already empowering certain people and jobs while having the opposite effect on others.
Is it perfect? Nope. The mockups did have weird glitches, but they were 75% there and good enough for the task I wanted. The meeting notes were as good as a real human's.
So it's definitely eroding more of these kinds of jobs and so we are
AI excitement is all supply side. Lots of people are excited to automate their own labor and smooth out production. Very few people want to accept raw AI generated slop.
That isn't pure doomerism: there's plenty of room for AI assist, and people like using AI experiences themselves. AI as a product is here to stay, but the second order of products openly using AI is showing its limits.
No, that's not meaningless. I've tracked my time, and it turned out I'd spent much more time than I thought on one website that only made me angry. I blocked it, saved myself an hour a day, and don't miss the website at all. So yeah, from my experience, tracking can be very useful and meaningful.
efficiency is for machines
if you're grinding when you have to, then it makes no difference what, when, where, and how much you put into unwinding
anger is good, if it's not dysfunctional, and directed at actual bad things that affect you now. anger is your signal to grind, right through the objectionable thing; if it's remote, like the web page, then any amount of effort is waste, and self harm
so grind, or luxuriate in the knowledge that, whew!
no more grind power left just now
timing is for refining your thing, and streamlining, but if you got no thing
grind
it's meaningless to optimise treading water, especially when there is a beach resort that you can see and hear
I would reverse the whole thing and say that if you were to time your very best moments of work, or creativity, you will find them to be shockingly brief, and remembering the specific conditions that led to those moments, is the thing to focus on and foster.
barring that
grind
Ukraine already surrendered a chunk of its territory a few years ago. Do you think this time will be different, and Russia won't try to take Ukraine again once it rebuilds its military potential?
No of course I don't. Russia will certainly try again. Ukraine is absolutely right to not want to do this! I'm just pointing out what Trump's motivation is.
The difference is that in the future (assuming the mineral deal goes through), there would be US citizens operating mineral franchises on Ukrainian territory. So if Russia harmed them in the future, we would be drawn into an actual war.
There were US citizens and businesses in Ukraine last time. Russia will just go around them like they did then. Having some US businesses will provide zero protection.
Assuming mining those minerals would actually make economic and strategic sense for US companies, considering significant long-term investments require stability. And that a significant US workforce would even be required for that. And that the US administration doesn't just make a bargain with Putin about leaving these mining operations alone while doing whatever the fuck else they want.
Could Ukraine maneuver around Trump by instead signing a minerals deal with UK+EU? Better to give the $600b (optimistically) to friendly allies. The problem is UK+EU does not have equivalent defense contractors. US gets what it wants as well, by disconnecting from the conflict.
> Name one American who would volunteer to fight because a Thai ship got attacked by Yemeni rebels while travelling through an Ethiopian strait to an Egyptian canal.
Apparently, the entire US Navy was okay with that volunteer assignment.