
> Linux desktop is getting better but I still wouldn't daily drive it,

I'm genuinely interested: what is Linux missing for you? I've been daily driving it for years and do all my work and gaming on it. Is it specific software, or something else?






It's just general polish. I was daily driving Fedora last year and:

- fractional scaling did not work in Gnome with Wayland for X11 apps

- I still cannot use my LG C4 as a monitor at full capacity because AMD on Linux does not support HDMI 2.1

- Screen sharing was very buggy - in Slack especially it would constantly crash the app during calls, ditto for the camera, but even in Google Meet and Chrome I've had desktop crashes

- When I switched to KDE/Plasma 5 to get fractional scaling it was extremely unstable

- Right now I upgraded my GPU to a 9070 XT - I'm still not sure if it will work on Linux yet because of the driver support delay

- The guitar amp simulator software I use does not support Linux, and neither does Ableton (which supposedly can run under Proton, but with many glitches)

- The audio DAW situation was way too complicated and buggy

- I spent days getting the distro functional and usable with Ardour, and it would still crash constantly - I just wanted to run some amp sims :(

It's just the little things and rough edges. For example, the fractional scaling situation has already improved because more of the apps I use added Wayland support. And the emulation is getting better; with more users I could see larger DAWs supporting Linux as well. Not sure about the audio progress - JACK was a complete mess.


> Right now I upgraded my GPU to a 9070 XT - I'm still not sure if it will work on Linux yet because of the driver support delay

You can install AMD's driver from their repo directly; it works just fine (I'm using it every day).

> I still cannot use my LG C4 as a monitor at full capacity because AMD on Linux does not support HDMI 2.1

That will never be possible. To prevent pirates from breaking it (lol), the HDMI Forum has decided to keep the HDMI 2.1 spec secret. No open-source implementation of HDMI 2.1 can exist.

That said, AMD's driver repo includes both the open source drivers and some proprietary versions of the driver, maybe that'll work for you.

Another option would be using a DisplayPort output and a DP-to-HDMI converter, as e.g. Intel does for their GPUs.


- Fractional scaling: That's because X11 itself does not support it. Many older Windows apps also have problems with fractional scaling.

- HDMI 2.1: The HDMI Forum blocked it, as they don't want the details of HDMI 2.1 publicly available. If you can, use DisplayPort, which is an actual open standard, and is better anyway. Nvidia works because they implemented it in closed-source firmware instead. https://www.phoronix.com/news/HDMI-2.1-OSS-Rejected


> That's because X11 itself does not support it.

Strangely enough, Plasma was able to handle this regardless (I guess it was misreporting the resolution to X11 apps or something like that to make it work?) - it was a Gnome/Wayland thing.

DisplayPort isn't an option - the TV only has HDMI in, and converters suck (they crash constantly, even the expensive ones).


You can also buy active DP to HDMI 2.1 adapters now - if you already have an HDMI KVM for instance. Cable Matters makes one.

If you're happy to dip your toes into another DAW, Reaper has excellent first-class Linux support, works with all your plugins, and has a 60-day trial* for you to get used to it.

* The free trial is enforced as heavily as WinRAR's, and it's pretty cheap (~$60) to buy a licence if the nag screen makes you feel bad enough


I tried that first but had trouble getting it to launch, so I decided I might as well go with the OSS option. Boy, was I in for a fun ride getting the whole jackd and audio subsystem running.

The problem is not only DAW support, but also support for low-latency audio interfaces on Linux. Audio interface makers rarely create a Linux driver, and a low-latency setup on Linux is its own hell, with real-time kernel patches. On macOS and Windows, it works out of the box.

The RT patches have been upstream for about a year now. You might need to enable the RT option in Kconfig, sure, but no out-of-tree patches.
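(If you want to check whether the kernel you're already running has it - a quick Python sketch, assuming your distro exposes the kernel config at /proc/config.gz via CONFIG_IKCONFIG_PROC:)

    # Sketch: detect a PREEMPT_RT kernel from the in-kernel config.
    # Assumes /proc/config.gz exists; otherwise check /boot/config-$(uname -r).
    import gzip

    with gzip.open("/proc/config.gz", "rt") as f:
        is_rt = any(line.strip() == "CONFIG_PREEMPT_RT=y" for line in f)

    print("PREEMPT_RT kernel" if is_rt else "standard kernel")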

Alternatively Bitwig has Linux support and wouldn't be such a big jump.

> When I switched to KDE/Plasma 5 to get fractional scaling it was extremely unstable

KDE Plasma 6 made major improvements and has excellent fractional scaling, the best I've seen in a Linux desktop environment and comparable to scaling in Windows 10-11. I encourage you to give it a try.


Sorry I misspoke - I was using Plasma 6, as that was the only way to get fractional scaling in X11 apps

You can use Carla on Linux to run Windows VSTs; I do it all the time. Works great. MIDI and audio routing is also quite good. Ableton also runs with Wine.

Our 12-year-old recently switched from Windows to Ubuntu…

and now I’m constantly getting these complaints “I can’t get screen capture to work under Wayland… I switched from lightdm to sddm and I can’t work out how to switch back… I accidentally started an i3 session and I can’t work out how to log out of it.”

It makes me kind of miss Windows, in a way. It is good he's learning so much. But the downside is that Linux gives him lots more ways to break things and then ask me to fix them for him. And a lot of this stuff I then have to learn myself before I can fix it, because most of my Linux experience is with using it as a server OS, where desktop environments aren't even installed.


Well, don't help him. People (me included) grew up without the Internet or smartphones and broke Windows on the family PC all the time. In 2000, when I got SuSE, it only slowed down the breakdowns. He can always fix things himself by reinstalling the OS. As long as he doesn't format the /home partition he won't lose data. And he will learn his lessons.

12-year-old me installed Linux on an old desktop tower and I also broke things constantly. The difference is my parents were both humanities majors and I knew full well there was no point in asking them for help. Even at the time, the resources were all there for me to teach myself Linux. Sure, I spent many, many hours troubleshooting things instead of doing whatever it was I had as my end goal, but I was a kid - learning is the point!

It's harder as a parent to know that you're capable of solving their problem and still say no, but by age 12 that's pretty much your primary job: to find more and more things that they can start doing for themselves, express your confidence in them, and let them figure out how to adult bit by bit. Breaking a Linux install and fixing it again is among the lowest stakes ways that dynamic will play out from here on.


> Ubuntu

Well, there’s your problem ;-)

This is great, though, really. I broke our computer so many times growing up, I couldn’t possibly count. I don’t think I ever lost anything of import, other than some savegames of mine. I keep telling people who ask, “how do I learn Linux?” that they need to use it, tinker with it, break it, and fix it, ideally without anything other than man pages and distro docs. It is a shockingly effective way to learn how things work.


There is more to learn / do than anyone has time for. My kid is supposed to spend an hour on his violin, half an hour on fitness, then some time on chess, then eat - including cleaning up and/or cooking. Somehow he needs to fit some free play in too. He doesn't have time for more.

It isn't that he couldn't do it - but what else would he give up?


I think for a lot of us who learned Linux this way, it was firmly in the "fun time" category. We would rather have been tinkering than doing most other things.

I'd say that screen capture probably works under X11 no matter what your graphics card is. However, this kind of confirms your general feeling: there is no single blessed and enforced way to do things, so everything can break because of combinations.

Examples (I've been on desktop Linux since 2009): shutdown actually reboots, except for a few months with some lucky combination of kernel and Nvidia driver. The brightness control keys didn't work for at least half of those years; they currently work. All of that has workarounds, but I understand that some people legitimately fold and go use another OS.


> i3 session

Oh he'll figure it out eventually. This kid might be going places.


My Linux desktop experience...

I started with Linux, installing it from floppy disks, in the mid-'90s.

In 1995, I was back on Windows 95 within a week because I needed to get something done.

In 2000, I was back on Windows 2000 within a week because I needed to get something done.

In 2005, I was back on Windows XP within a week because I needed to get something done.

In 2012, I was back on Windows 7 within a week because I needed to get something done.

In 2015, I was back on macOS within a week because I needed to get something done.

In 2020, I worked out I'm wasting my time on this.

I watch my colleagues and friends struggling with it. Lots of small papercuts. Lots of weirdness. Lots of regressions. Plus many years of server-side experience says to me "I should probably just use FreeBSD" in that space.


I've wasted like 8 hours in the last two days trying to upgrade Windows 10 to 11 so my motherboard's WiFi drivers can be installed.

It just worked in Linux. I don't get where this comes from, because every time I hit a problem in Linux, there's a solution.

In Windows, you get a vague hex error code that leads you to a support page where the error could be caused by any one of a dozen reasons.

And on top of that, MS is constantly hostile to any user who just wants a basic OS to use their computer with.


So, a couple of issues there. Never upgrade Windows; fresh install only. I've never had a good day upgrading it.

Secondly, there isn't always a solution in Linux. I've got one now where something is utterly broken and it's 5 layers of maintainers down and no one gives a shit.


Windows 11 has made it a lot harder to install with a local login. They just disabled the typical method for enabling local-only accounts.

I want to upgrade in order to retain that local account.


Install LTSC: https://massgrave.dev/windows_ltsc_links

Then use massgrave hwid activator.


My experience is the opposite. I got hold of a bunch of floppies in 1991. I dual-booted so I could play Diablo. Some time around '98/'99 I got tired of dual booting.

Steam getting Proton was a godsend; all those years of games became playable, so now I have a huge back catalog.


> Plus many years of server-side experience says to me "I should probably just use FreeBSD" in that space.

Not a bad idea. This is exactly what I do on my daily driver.


Meanwhile, people who actually get stuff done all use Linux :D

That sounds amazing - well, not for you, but for your kid :) It has been very valuable for me that I messed around with Windows and Linux as a kid.

Put him on Debian stable with XFCE and no sudo if it's such a bother. Sounds to me like this is a people problem, not a Linux problem. Do you miss Windows, or do you miss not having to spend time with your kid on things that bother you?

Not the OP, but hibernate support is one thing that sent me back to Windows on my Framework laptop.

In Windows, I can just shut the lid and not worry about it, because it will sleep first and eventually hibernate. Ubuntu would just sleep until the battery dies.

I found instructions for enabling hibernate in Ubuntu, and they did make it show up in the power menu, but it didn't seem to work. (Which is presumably why it was hidden to begin with.)

I also tried NixOS, but I couldn't even get it to boot the installer.


> In Windows, I can just shut the lid and not worry about it, because it will sleep first and eventually hibernate. Ubuntu would just sleep until the battery dies.

It's really funny, because this is one of the things I absolutely do not like about Windows. I hate that I put the computer to sleep and when I come back the next day it has hibernated. That said, I agree that hibernation has always been finicky on Linux; however, I would say Ubuntu is not the best distro for this use case. I have been using Fedora, and they even publish official guides for it [0] - that's how seriously they take it.

0: https://fedoramagazine.org/update-on-hibernation-in-fedora-w...


Just have it suspend to disk and shut down on lid close.

I do this with Arch Linux on my Framework and it's fine. Startup time is under ten seconds, essentially zero battery drain, and you're right back in your session with all apps/docs open.

Hibernate is definitely better, but still finicky even on Mac/Windows: machines can and do fry themselves, or require a hard reset if you unplug a device at the wrong time. Or they unexpectedly continue draining the battery.

It's a terrible, funky, poorly documented, exception-filled world down in the low-power states for hardware.
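(If anyone wants the lid-close-to-hibernate behaviour, it's a logind setting - a minimal sketch, assuming your swap is big enough and the kernel's resume parameters are already configured:)

    # /etc/systemd/logind.conf - hibernate (suspend to disk) on lid close
    [Login]
    HandleLidSwitch=hibernate
    HandleLidSwitchExternalPower=hibernate
    # takes effect after restarting systemd-logind or rebooting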


Never had an issue on a Mac; never got it to work properly on a single one of my many Linux laptops.

Try unplugging an external USB-C monitor while it's transitioning between states.

It won't come back up ok, as recently as 6 weeks ago.

Or the rampant reports of things like this: https://discussions.apple.com/thread/255642823?sortBy=rank


The number 1 Linux user accessory is the word "just".

A computer is a tool - learn to use it like a tool. People spend far more time learning to drive than they do learning to use computers these days, but I'd wager the computer matters more.

But too many companies have discovered that a docile "user" who's fed constant dopamine hits and has no actionable way to use a device other than opening their wallet and forking over cash to watch more cats dance, or to shop at more stores, is exactly what they want.

Why don't you just click here and pay for OneDrive? Or just click there and accept Apple's new ridiculous terms?

If you just want to watch cats dance... you do you. I'll just keep doing me over here.


Anti-cheat systems are not really compatible with Linux, IIRC. Maybe there have been improvements on this front, but I think this was the main issue for a lot of gamers. That, and there were cases where people were getting banned for playing through Wine.

I once tried to set up GPU passthrough to a Windows VM to play WoW, but there were a ton of reports that Blizzard just banned players for using QEMU VMs because they were flagged as cheaters.


Could some game programmer say whether it's true that kernel-level anti-cheat is just bad programming?

The Primeagen recently said that in a video commenting on PewDiePie's "I switched to Linux" video. While he's apparently a good programmer (he worked at Netflix), he uses Vim, so I don't trust him. Edit: the part about Vim is an edgy joke.


Weird reason not to trust someone, and I think Prime is a decent programmer.

I work in AAA gamedev and have deployed kernel-level anti-cheats before, and I'm aware of how unpopular they are; so, sorry for that… you would also accuse us of "bad programming" if there were an overabundance of cheaters that went undetected and/or uncorrected.

The answer is unfortunately complicated. The kernel-level anti-cheats themselves aren't necessarily poorly written, but what they are trying to do is poorly defined, so there's a temptation to put most of the logic into userland code and then share information with the kernel component - but then it's dangerous for the same reason that CrowdStrike was.

Not doing endpoint detection is also a problem, because some amount of client trust is necessary for a good experience with low input latency. You get about 8 ms in most cases to make a decision about what you will display to the user; that's not enough time to round-trip to a server to ask whether what is happening is OK. Movement in particular will feel extremely sluggish.
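To put rough numbers on that (illustrative figures of my own, not the parent's exact ones):

    # Back-of-the-envelope: per-frame budget vs. a server round trip.
    fps = 120                     # a common competitive refresh rate
    frame_budget_ms = 1000 / fps  # ~8.3 ms to decide what to draw
    server_rtt_ms = 40            # an optimistic regional-server round trip

    print(f"frame budget: {frame_budget_ms:.1f} ms")
    print(f"a server round trip costs ~{server_rtt_ms / frame_budget_ms:.0f} frames of latency")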

So, it's a combination of kernel-level code being harder in general (malloc, file access, etc. are things the kernel gives you in userland, after all), the problem space being relatively undefined (find errant software packages and memory manipulation), not being able to break out of the kernel-level environment for an easier programming and iteration experience, and trying not to affect performance.

Lots of people think they can do it better. I'm happy to hire anyone who actually thinks they have a clue; it's a really hard problem, honestly, and the whole gamedev industry is itching for something better. Even us gamedevs don't like kernel-level anti-cheat: it makes debugging harder for ourselves too and introduces hard-to-reproduce bugs.

PS: sorry if I'm not being eloquent, I am on vacation and typing from my phone.


This is well written and quite easy to understand. (I only have cursory knowledge of programming.)

However, what if the Primeagen meant that HAVING to IMPLEMENT kernel-level anti-cheat is a symptom of bad programming, and not the anti-cheat per se? (That is, with good enough programming, it could somehow be avoided.)

And kudos to you. I appreciate people in game dev; they can get a lot done in a short time. I haven't played an MMO FPS since Battlefield 3, and it wasn't that bad then. But I've heard that without kernel-level anti-cheat they would be unplayable.

Thank you for your time!


The reason you need kernel-level anti-cheat for it to be meaningful is that it necessarily needs to sit at a level lower than the cheats themselves; and cheats can be very advanced these days.

Long term I'm kinda hopeful that this is something that will be mitigated through AI-based approaches working to detect the resulting patterns rather than trying to detect the cheat code itself. But this requires sufficiently advanced models running very fast locally, and we're still far from that.


The cheaters are very good these days. They will happily sit in the kernel space to hide from the game if needed, because people pay a lot of money to cheat developers to be able to cheat.

> so there's a temptation to put most of the logic into userland code and then share information with the kernel component - but then it's dangerous for the same reason that CrowdStrike was.

I don't understand - how could CrowdStrike have avoided their issues by putting more code in the kernel? Or am I misreading your statement?


The crash was caused by a data-parsing issue in the kernel-resident code (the heuristics database).

If they had not tried to parse data inside the kernel, it would not have been an issue.


Good faith question: why is the server not the source of truth? With local interpolation for things like character movement, reconciled in heartbeat updates?

FD: Still on a phone on vacation. :)

The reason is mainly the round-trip time.

Server corrections will feel like "floaty" or "banding" behaviour; we used to do that, and people get upset because it "feels" wrong.


Not all cheating sends "bad data" to the server. Cheats like wallhacks or aimbots are purely client-side and can't be detected on the server.

The opposite is true. He uses Vim, therefore I trust him.

The two most widely used anti-cheat applications, BattlEye and Easy Anti-Cheat, both natively support Linux, but game developers have to check a box to enable it.

About 40% of games that use anti-cheat currently work on Linux. Getting banned for using Wine is very rare, because an anti-cheat that doesn't support Linux will complain about not running and prevent you from even joining a game, so there's nothing to get banned for.

https://areweanticheatyet.com/


Frontend stability.

I've been through enough KDE, Qt, and Gnome API changes. It's just not where I want to burn my limited time.

My first GDI programs still compile.


Compile, sure, but they really had some bad ideas in those days. Remember MDI (Multiple Document Interface)? Having windows within windows. It was a terrible idea.

OLE? Sure, let every application talk to the DLL components of every other application! What could go wrong? Data wants to be free, right? Spread the love.

Making the desktop into a live webpage? And of course, let any webpage happily load whatever binaries it wants from the internet. Super handy stuff. For some people more handy than others (really, how this did not cause a mega WannaCry-style event back in the day, I don't understand).

There is a reason this stuff is legacy. The only reason it still compiles is that some companies spent millions on custom development 20 years ago that nobody remembers how it works. Not because you should still be using it :)


You and I must have had very different experiences in those times.

> Remember MDI (Multiple Document Interface)? Having windows within windows. It was a terrible idea.

It was definitely overused - nobody needs Microsoft Word to be a window manager for every doc file. But it ends up growing into something really nice where you get to build out the sub-windows of your IDE wherever you want them.

> OLE? Sure, let every application talk to the DLL components of every other application! What could go wrong? Data wants to be free, right? Spread the love.

This was also incredible. I built one of the first tabbed web browsers by embedding instances of the IE 4 DLL into my tabs. OLE and OCX extended object-oriented programming across program and language boundaries.

> Making the desktop into a live webpage? And of course, let any webpage happily load whatever binaries it wants from the internet. Super handy stuff. For some people more handy than others (really, how this did not cause a mega WannaCry-style event back in the day, I don't understand).

This was terrific for pranks. Yeah what were they thinking on this one.

> There is a reason this stuff is legacy. The only reason it still compiles is that some companies spent millions on custom development 20 years ago that nobody remembers how it works. Not because you should still be using it :)

Maybe as I've aged the novelty of building programs has worn off. Some things I don't want to have to port or even recompile; I just want them to run. If Win32 is the only stable Linux ABI, GDI is the only stable Linux GUI toolkit.


For me it's the UX. It just feels off - amateurish, messy. I can't really put my finger on it. I think the frankly crap fonts a lot of distros choose as defaults don't help. And then there's the very "designed by a developer" feel to a lot of the UI.

And I know someone's frantically typing away right now - yes, I am fully aware you can customise things, but out of the box it should be pretty damn well polished so that you don't need to.

Ubuntu's probably got the closest, but it still doesn't quite feel like they've nailed the experience.


One of the things I wonder about recently is whether there are too many distros. It divides effort, and there's less drive to find consensus on certain issues when everyone has the freedom to do things their own way and experiment to explore their niche. That freedom is the point of free software to a large extent, but there are costs to it. It also divides the user base, so when something doesn't work you may need to dive deeper into the details than you'd like, to see if there's anything particular about your species of the Linux animal kingdom.

It'd be interesting if there were an "Ubuntu v2" type of effort, over 20 years later. Before Ubuntu it's not as though desktop Linux was an impossible dream or there was a lack of distros, but Canonical cleaned up a lot of rough edges, to the extent that Ubuntu became a lingua franca. You can rely on Ubuntu appearing in the instructions for Linux software; for example, if there are any differences in required package names, it'll be the Ubuntu names over Debian's.


Yes, exactly. To be fair, projects like GNOME and distros like Ubuntu do publish human interface guidelines, but I don't think there is any enforcement, and so jankiness creeps in. I suppose it's no different from Windows 11 still having programs whose UIs date from Win2K. But at least the icons and colors and window chrome are professional-looking.

I am extremely experienced with Linux. Every single one of my servers is running RHEL/Rocky. I daily drove Linux back in the early 2000s. I have spent more time in sysctl.conf testing tunables than I have spent with my family, so it seems.

1. My capture card doesn't work reliably in any distro. I'm not a gamer so I can't use a cheap and ubiquitous USB V4L card, I capture retro computing screens at weird resolutions and refresh rates so I have to use an enterprise-grade solution that can handle strange things like sync-on-green from 13w3 connectors and extremely rare outputs from UNIX workstations from the 80s and 90s.

2. If someone sends me a link on my phone it is difficult to copy and paste it to a Linux system.

3. Battery life on laptops, despite decades of improvements, is atrocious on Linux. If my laptop gets twelve hours of real-world use under OS A and six hours under OS B, I've got to use OS A.

4. All of my screens are 4K. Today, in 2025, a full decade after 4K became standard, the way various DE/WMs handle scaling is embarrassing.

5. Nvidia. Yeah, it "works" for about 2-3 kernel upgrades, then you're greeted with a blinking cursor on boot because of DKMS, or some random reason like patching the system, not rebooting for a couple of days, and then patching again.

6. There's little consistency across devices. When I log in to system A I want every single icon, file, and application to be the same as on system B. iCloud/OneDrive do this. You can do this on Linux while on a LAN with remote home folders; I don't work exclusively on a LAN. Or I can set up Puppet/Ansible for my non-infrastructure systems, and that makes me throw up in my mouth.

Almost none of that is the fault of the kernel, but that's irrelevant.


Regarding 3 (battery life): I've had a ThinkPad Nano for several years that on Windows 11 would get roughly 4-6 hours of battery, and that was optimized (very few running apps, no junk on startup, power-saving settings on, etc.). I switched it to Ubuntu (I was surprised that everything worked out of the box, all of the hotkeys and everything), and it gets about 8-10 hours doing the same tasks (primarily Chrome). So there is something to be said for Linux in general just being so much more "lightweight," so to speak, vs. Windows, which has become such a bloated mess. But the main issue I had was your point 4: since the ThinkPad's screen is 2K, everything was either too small (with no scaling) or too big (with scaling on).

Fully agree that Desktop Linux isn’t nearly there. If I need a Linux DE for something, I spin up a Debian VM with XFCE, because that seems to suck the least, and I already have prebaked Debian images.

For headless servers, I want nothing else. For a daily driver, as much as it pains me, nothing comes close to the Apple ecosystem. Apple Silicon is years ahead of everyone, and their interop with (admittedly only their own) other hardware is incredible. Universal Clipboard is magic. The fact that I can do nothing more than open an AirPods case and my phone registers it is magic. Finally, the fact that macOS is *nix is absolutely icing on the cake.


To me it's such a crime that, for all the crowing in the world about the need for operational sovereignty, macOS is the only OS that can offer such a high standard of operation. I've seen some countries try their hand at modifying Android to compete, but the lack of a competitive monolith has allowed them to become complacent.

XFCE? How is that better than KDE (which uses Qt as its base GUI library)?

I'll echo archvile here, in that I get excellent battery life running Linux. I've been getting 10-12 hours of battery life from the assortment of Asus and ThinkPad laptops I've had over the past 15 years.

To give a very concrete example, I have two identical Thinkpad T14 at work, one running Linux (Debian Bookworm with KDE) and one running Windows 11. When doing normal office work, the Linux laptop easily lasts a whole workday with >20% battery left at the end. The Windows laptop runs out of battery in less than 2 hours.


> Today, in 2025, a full decade after 4K became standard, the way various DE/WMs handle scaling is embarrassing.

Generally, I agree, but Qt (KDE) is the standout to me, primarily because it is "commercial first, and open source second" in my mind. Do you have HiDPI scaling issues with Qt apps?

Use KDE Connect. It is a universal app and works seamlessly.

Or the Signal app... it works well for this sort of thing!

what?

Regarding 6, "You can do this on Linux while on a LAN...":

Perhaps Syncthing would partially cover this? Not the applications, but the files, at least...


For the applications, you can use NixOS and git

Same here. Linux has been my daily driver for over twenty years now, at home and at work. (Not a gamer though.)

Not OP, but for me it's a solid remote desktop alternative that can compete with Windows' remote desktop experience. There's been some movement there, so perhaps in 5 years' time.

Also, I really dislike how out-of-memory conditions just cause everything to grind to a halt for 5 minutes before something, typically Firefox, crashes. On Windows, at least, just Firefox gets very slow, and usually I can nuke the process that's eating too much memory. Not so on Linux, as the whole desktop becomes unresponsive.

And every now and then I still need to fiddle with some config files or whatnot. Not game breaking but annoying.


Not OP, but my experience with Linux is that seemingly absurd usability issues just keep piling up the more you use it, and at some point you just kind of give up and abandon any expectation of even a decent level of common sense from whoever is developing the system.

I've listed some of the ones I encountered on Mint here: https://www.virtualcuriosities.com/folders/273/usability-iss... Among them: AppImages just don't run unless you know how to make them run - this could be fixed with literally a single dialog box. There is no way to install fonts by default other than knowing where to put them and knowing how to get there. Every app that uses Alt+Click, e.g. for picking a color, won't work, because that's bound by default by the DE.
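(For reference, the "knowing where to put them" dance amounts to roughly this on a fontconfig-based distro - a sketch, with a hypothetical font file:)

    # Copy a font into the conventional per-user fontconfig directory,
    # then rebuild the font cache so applications can see it.
    import shutil, subprocess
    from pathlib import Path

    font_dir = Path.home() / ".local/share/fonts"
    font_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy("MyFont.ttf", font_dir)             # hypothetical font file
    subprocess.run(["fc-cache", "-f"], check=True)  # refresh fontconfig's cache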

These issues may sound small at first, but think of it this way: did nobody making this OS think about how users were going to install fonts? Or ever use an application that used the Alt key? Or did they just assume everyone would know what to do when they download an AppImage, double-click on it, and nothing happens?

And you can just feel that the whole thing is going to be like this. Every single time in the future you want to do something that isn't very extremely obvious, you'll find a hurdle.

I even had issues configuring my clock, because somebody thought it was a good idea to just tell users to use a strftime code to format the taskbar clock. I actually had to type "%Y-%m-%d%n%H:%M" to get it to look the way I want. And this isn't an advanced setting; this is right-clicking on the clock and clicking "Configure." When I realized what to do I actually laughed out loud, because it felt like a joke. Fellas, only programmers know these codes. Make some GUIs for the normal people.
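(For anyone wondering what that code renders as - a quick sketch; %n is strftime's newline escape, written as a literal "\n" here since Python hands %n off to the platform's C library:)

    from datetime import datetime

    # "%Y-%m-%d%n%H:%M": date on the first line, 24-hour time on the second.
    print(datetime(2025, 5, 9, 14, 30).strftime("%Y-%m-%d\n%H:%M"))
    # 2025-05-09
    # 14:30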


Not to argue with you, but is that Linux Mint specifically? I've never used it, and its DE looked very unprofessional for my liking. Personally, I prefer modern Gnome, but I also like KDE. Everything else looks very unfriendly to an average user; I won't ever install it. I'd go Gnome for Mac users and KDE for Windows refugees.

This is why Linux will always be a terrible OS. Every time someone says "Linux is bad because XYZ," someone will tell you, "actually that's your distro; if you used distro ABC you wouldn't have that problem." But ABC has a different set of problems, and if you waste 2 months realizing them and start complaining, someone will just direct you to distro JKL.

The fragmentation of Linux leads to a ping-pong of responsibilities. Linux can never be a bad OS because it isn't an OS.

On Windows, if the file manager is bad, that's Microsoft's fault. Period. Nobody tries to say "actually..." - it's Microsoft's fault, period. The same goes for the taskbar, the control panel, MS Paint, even Microsoft Office. Whether Microsoft will fix it or make it worse depends on them, but nobody denies who is to blame and everyone knows where the blame lies. Meanwhile, I don't even know if the basic utilities that my distro distributes are the responsibility of Mint's team, or if they will just direct me to some random open source project's issue tracker if I start complaining about Celluloid or the "Drawing" app.

You can't talk about Linux thinking only about the good parts, or you aren't inviting people to try Linux, you're inviting them to try your distro. "Linux" means the whole ecosystem, including all of its problems.


Au contraire - I would say that Mint is probably the closest to the stock Win11/macOS experience right now. Gnome, on the other hand, looks utterly alien and non-discoverable.

What do you mean by Win11/macOS? I see them as completely different from each other. Or are there some overlaps?

Personally, I like modern Gnome: https://news.ycombinator.com/item?id=43859753


They have been converging for some time now. The taskbar in Win11 is very much a macOS Dock wannabe, for example.

Personally, I find modern Gnome insufferable because it is non-customizable to an extent that even macOS only dreams of, and it doubles down on the modern trend of hiding important UI behind poorly discoverable gestures (active corners etc). Except their take on it is even worse in general for mouse users because of how much more "legwork" it adds - e.g. in a default Gnome setup on Fedora, you need to move the mouse cursor to the top-left corner for the dock to show up (so that you can switch apps or launch a new one)... but then it shows up at the bottom of the screen, so now you need to move the cursor all the way there across the screen.

But that's all subjective and not really my point. The point, rather, is that Gnome looks and behaves very different from Win11 and macOS both, in ways that don't make it easy for users to migrate (and in fact they specifically state that their UX design does not consider that a goal).


I never thought of this - the excessive mouse movements, like top-left to bottom. What I did think of is that it killed the silly minimise and full-screen options, even though everyone and their granny was trained on those three buttons.

I like that they ditched all the unnecessary things from the settings. I think all the pro-level settings should be dealt with via the terminal. That way, it's the best of both worlds. Me, I don't mind it. But if I manage the computer for someone, I want them to have only the minimum of things, so they won't be overwhelmed. That's very wise, and unfortunately all these Win3.1 geeks are complaining that it's bad. Yeah, okay, keep using your favourite XFCE then, or whatever.

I’d install Gnome for elderly, even if they have some previous Windows experience. Because they can afford to just ignore it. My mum, she has no computer, and last time she used Windows was like, idk, a decade ago. Explaining Gnome to her is easy: here is the Windows (or CMD) button, you press it once, you have this iPad like interface. Here is the Dock, you have all the necessary apps in there. More of them if you press that Windows button one more time. But actually you don’t need it 99% of the time, so you can survive with top left corner pressed once. Two times press is for me. Closing the app is that X button. What else does she need?

Now, try to explain [any other DE, basically] to elders the same way. Consider that most of these people have iPads - and if they don't, well, I don't really get why; they should. My guess is that Gnome's interface appeals to that audience. And to me that's a great thing - that's most non-tech people now.

However, I’m (being an obviously pro user) able to use the default Gnome productively. Almost as productively as I use SwayWM. To me, that’s very impressive.


Microsoft Office


