
Microsoft has been driving users away for at least 10 years now, and nothing much is changing, IMO, because Windows is what most people think of as "the computer". Non-technical users think they have no other choice, so they shrug the defects off and try to get their job done.

Though the article kind of piqued my interest by claiming that Microsoft "is in a panic" -- any proof that they are indeed bleeding users more than what is deemed normal?

I have noticed people around me getting recommended Linux Mint and being happy with it, but I'd think that's a minority; many offices rely on bespoke local network setups with printers and scanners and people putting files in shared directories and what-not... modernizing that and making it better is no easy feat.

EDIT: Oh, and as other users are saying, the Linux desktop is absolutely not going to eat Windows' lunch. There are still too many confusing things there, and "you can hack it and make it your own" appeals to an infinitesimally small audience.



> any proof that they are indeed bleeding users more than what is deemed normal?

They are, but not to other OSes. More and more people associate computers with something you use for work (and work provides) or for school (and school provides). All personal computing happens on a phone.

My partner, for example, uses her personal laptop about once a year to do taxes. Otherwise it sits in the closet.

Even I use an iPad for most personal work these days. Laptop is mostly for personal coding projects for which I have less and less time as life adultifies.

I know a lot of people who don’t even have a personal computer anymore and just use the work laptop the few times their phone isn’t good enough.


> All personal computing happens on a phone

This makes me sad, and is scary, etc. However, putting emotions aside, can we extrapolate from this? Could the next "killer app" actually be an even tinier device? A smartwatch? Or have we reached a local extremum (phones)? And what does that say about attempts to regress towards larger screens (AR/VR headsets)?


Phones dominate not because of screen size but because the operating systems they run are fundamentally more user friendly. iOS/Android were the first time experienced OS engineers were able to go back to the drawing board and rethink the whole stack from scratch. They learned a lot of lessons from what worked and what didn't on generalized desktop computers and were able to build something people prefer using.

That's a rare opportunity. I hope someone gets a chance to do it for professional workstation computing one day. The Windows codebase has IMHO reached end-of-life. The Linux community isn't going to stray far outside the bounds of the 1970s-era UNIX design. Apple does OK, but they can't/won't change the macOS UI in any meaningful way, and not enough people write native Mac apps to justify trying to do anything interesting there. ChromeOS is basically just a large-screen smartphone OS experience with some Linux virtualization on the side. We're caught in a local minimum.


It's mostly screen size... Life happens, and of the two, the phone is the one with you. Wherever I go, I carry my phone, but not my laptop. I take pictures with it, so they're stored there. I take screenshots and share stuff with friends, and it all gets saved on my phone. I browse on the phone on the bus or subway, wherever. Browser history gets saved there, open tabs, etc.

Bringing photos and saved stuff over to the computer takes an extra step. For most of the time I need a digital device, the phone is good enough, and most importantly, it's with me. Not just when I get home.

When high productivity is needed, then yes, the laptop is king. But if somehow we could have the same productivity on a device 10x smaller that I can carry with me, why wouldn't I start doing everything on that device? That's why the smartphone is king.


Really, a lot of people under 25 see "the phone" as the computer, and treat using a desktop as a burden any time it happens.


This I find fun to watch, as they sit down at a desktop and try to work together through the desktop interface.


Can you share any anecdotes, or some more details? I've been watching this "app generation", and I've read Prensky's nonsense about "digital natives", so I'm always interested in concrete examples of how it's playing out. Thanks!


For example, students now regularly turn up at universities unable to use files or folders. They have to be given basic IT lessons before they can start to work with data.


It's not just them. Even software engineers under 35 don't use desktop computers any more, except at work. At home, for personal use, they just use phones, unless they're into the latest video games.


The netbook I kept referring to in some comments has finally died; a tablet has replaced it for all practical travel purposes.


> unless they're into the latest video games.

The fact that there can exist a sizable population of software engineers who are not into video games is disturbing in itself.


There's nothing disturbing about not being interested in the latest ultra-violent and ultra-realistic FPS. Similarly, there's nothing disturbing about not being interested in the latest Hollywood movies, which are all Marvel comic-book movies and uninspired Star Wars franchise installments.

Personally, I love playing some classic video games, such as through MAME, once in a while, but new games are nothing like this.


The appeal of the Linux desktop for the majority of people is that it runs Chrome or Firefox just fine and is pretty much secure if you do your updates. That's how the majority of people use their computers. The underlying OS is irrelevant as long as it's not actively nagging its users.


They better not try to enjoy YouTube though.


Huh? Any issues with YouTube on Linux?


Depends on how lucky one feels getting VAAPI to work with their open source driver.


You don't have to use open source drivers. I only use them with Intel integrated graphics, where they work great IMO. AMD and Nvidia ship drivers, and most distros package them somehow. No issue there.


I know, that has been the usual answer since I started with Slackware Linux in 1995: it always works for someone else.


We don't have to go into details, but I really wonder what kind of bad experience you had. I have used Linux for ~20 years now and have literally wasted weeks getting graphics to work the way I want. I know all about these pains.

However, that was a long time ago. Open source drivers were shit, and official drivers were horrible to install and maintain. Today, all end-customer-facing Linux flavours feature a button to directly enable proprietary drivers at install time and then keep them updated and maintained as part of your normal update routine.

There is literally zero friction other than a checkbox in the installer for a majority of setups.
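
And if you want to sanity-check whether VAAPI hardware decode is actually being picked up afterwards, something along these lines is usually enough. This is just a rough sketch: it assumes the vainfo tool from libva-utils is installed, and the helper name is made up for illustration.

    import shutil
    import subprocess

    def vaapi_decode_available() -> bool:
        """Rough check: does vainfo report any hardware decode entrypoints?"""
        if shutil.which("vainfo") is None:
            return False  # libva-utils not installed
        proc = subprocess.run(["vainfo"], capture_output=True, text=True)
        # vainfo prints lines like "VAProfileH264Main : VAEntrypointVLD"
        # when the driver exposes hardware decoding for a codec profile.
        return proc.returncode == 0 and "VAEntrypointVLD" in proc.stdout

    if __name__ == "__main__":
        print("VAAPI hardware decode available:", vaapi_decode_available())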


Now try that on a random laptop with a dedicated GPU.

It isn't as if I haven't used any other distribution since 1995; in fact, I have probably used more of them, in sheer numbers, than the average age of HNers, just to use a random metric.

Hence my Linux-based media devices nowadays are Android- and webOS-powered instead; I leave GNU/Linux for servers.



