Hacker News

There is a reason why 1) people whose main environment is Linux feel (correctly) that these problems have been solved a long time ago, and 2) people whose main environment is not Linux but who try Linux occasionally feel (correctly) that these problems still occasionally crop up.

People whose main environment is Linux intentionally buy hardware that works flawlessly with Linux.

People who try Linux occasionally do it on whatever hardware they have, which still almost always works with Linux, but there are occasional issues with sketchy Windows-only hardware or insufficiently tested firmware or flaky wifi cards, and that is enough for there to be valid anecdotes in any given comments section with several people saying they tried it and it isn't perfect. Because "perfect" is a very high bar.



>People whose main environment is Linux intentionally buy hardware that works flawlessly with Linux.

Hm, recently I bought a random "gamer PC" for the beefier GPU (mainly to experiment with local LLMs), installed Linux on it, and everything just worked out of the box. I remember having tons of problems back in 2009 when I first tried Ubuntu, though. I dual boot, and just today I ran a few benchmarks with Qwen3: on Windows, token generation is 15% slower. Whenever I have to boot into Windows (mainly to let the kid play Roblox), everything feels about 30% slower and clunkier.

At work, we use Linux too - Dell laptops. The main irritating problem has been that on Linux, Dell's docking stations are often buggy with dual monitors (when switching, the screen will just freeze). The rest works flawlessly for me. It wasn't that long ago that my Windows machine (before I migrated to Linux) had BSODs every other day...


My random "gamer PC" won't even boot into any Linux live CD, so I can't install it at all.

Anecdotes are like that.


> people whose main environment is Linux feel (correctly) that these problems have been solved a long time ago

There is also the quiet part to this. People who religiously use Linux and think it is the best OS there can ever be don't realize how many little optimizations go into a consumer OS. They use outdated hardware. They use the lower-end models of peripherals (people still recommend 96 DPI screens just for this). They use limited capabilities of that hardware. They don't rely on deeply interactive user interfaces.


I own a 2011 ThinkPad, a 2014 i7 desktop, and a "brand new" 2024 Zen 5 desktop. They all work wonderfully and all the functionality I paid for is working. I haven't had a single problem with the newest machine since I bought it, other than going through the rigmarole to get accelerated video encoding/decoding to work on Fedora. Sucks, but I can't complain.

The older machines I've owned since around 2014, and I remember the hardware support was fairly competent but far from perfect: graphics and multimedia performance was mediocre at best, with ZERO support for accelerated video encode/decode. Fast forward to the last year or two, and Linux on both of these machines is screaming fast (within those machines' capabilities...), graphics and multimedia are as good as you could get on Windows (thanks, Wayland and PipeWire!), and accelerated video decode/encode works great (I still have to do the rigmarole in Fedora, but it's out of the box in Manjaro).

Both the 2014 machine and the 2024 one sport a 4K display @ 120Hz (no frame drops!) with no issues using 200% scaling for hi-DPI usage. Pretty much all of the apps are hi-DPI aware, with the exception of a few running on WINE, which until a few months ago wasn't hi-DPI aware. (That feature is experimental and, among many other improvements in WINE, may take another year to mature and be 100% stable.)


200% is just rendering the same pixels and then drawing each of them 4 times, and driving a single monitor at a single resolution is easy stuff. Would your hi-DPI system work with one monitor at 125%, one at 100%, and another at 150% scaling? This is when the font rendering gets fucked up and your hi-DPI-native toolkits start blurring icons. That's my setup. Windows is perfectly capable of making this work. GTK wasn't able to do fractional scaling until recently, and Qt has hundreds of papercuts.

I got a ThinkPad just to run this setup under Linux in 2020. AMD didn't solve the problem in their driver until 2022, when I was finally able to drive all of them at 60 Hz.


No, 200% is rendering each logical pixel as 4 physical pixels, with "features" 2x larger in each axis. You may get the naive duplication you describe with some legacy apps that give zero fucks about DPI scaling but are still scaled through some mechanism to properly match other apps.

Fractional scaling has been a problem across all platforms, but I agree Linux has taken its time to get it right and still has some gotchas. You should try to avoid it on any platform, honestly; you can sometimes get blurry apps even on Windows. AFAIK KDE is the first to get it right in these complex situations where you mix multiple monitors with different fractional scaling ratios and have legacy apps to boot. GNOME has had experimental fractional scaling for a while, but it's still hidden behind a flag.
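A rough illustration of why integer scaling is easy and fractional scaling is the hard case (this is just arithmetic, not any toolkit's actual API): a logical length multiplied by an integer factor always lands on whole physical pixels, while a fractional factor often does not, so the toolkit has to round or resample, which is where the blur comes from.

```python
def physical_px(logical_px: int, scale: float) -> float:
    """Map a logical pixel length to physical pixels at a given scale factor."""
    return logical_px * scale

def lands_on_pixel_grid(logical_px: int, scale: float) -> bool:
    """True if the scaled length is a whole number of physical pixels."""
    return physical_px(logical_px, scale).is_integer()

if __name__ == "__main__":
    # A 13-px logical width, e.g. an icon edge or a text run.
    for scale in (1.0, 1.25, 1.5, 2.0):
        exact = lands_on_pixel_grid(13, scale)
        print(f"scale {scale:>4}: 13 logical px -> "
              f"{physical_px(13, scale):g} physical px "
              f"({'exact' if exact else 'must round or resample'})")
```

At 100% and 200% every logical size maps exactly (13 → 13 and 26 physical px), but at 125% and 150% it maps to 16.25 and 19.5 px, so something has to give: snap widget edges to the grid (per-widget layout shifts) or render at a higher scale and downsample (blur).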

It also helps not to have Nvidia trash in your old (and sometimes even new) computers if you want longevity. My old machines have Intel and AMD graphics with full support from the current kernel and Mesa.


Linux is basically everyone's go to for older devices. Windows 10 will run like shit on a 10 year old laptop with 4GB RAM but latest Ubuntu is nice and snappy.


I have a 13 year old laptop that runs Windows 10. I cannot run Linux because neither nouveau nor Nvidia drivers support its GPU. It has 8 GiBs of RAM and it works perfectly for light browsing and document editing.


What GPU?



