
End-user Linux has definitely improved in the last few years.

It offers a lot of attractive features for what I imagine to be the typical HN demographic.

That being said, it's still got rough spots that OSX doesn't. It works great when it works, but when it doesn't....

Font rendering, display/compositor fragmentation etc...

Inb4 the anecdotal "well it works for me I just had to download the xf86 font library and compile with a legacy glibc version..." crew comes in with a thousand and one rebuttals. Problems like that are still a suboptimal user experience, no matter how you slice it.

I'd definitely consider a Linux daily driver for some of my work, but there are things that are just going to be less painful on Apple.




Maybe I'm missing something, but font rendering seems fine on Ubuntu without any after install steps.

I've been using Ubuntu exclusively for well over a decade for everyday work, only occasionally going into a Windows VM, and it's really been perfectly fine, great even, as a developer. I bought an X1 Yoga the month after it was released and Ubuntu installed perfectly on it; the only thing that didn't work out of the box was the fingerprint reader.

Until a year ago, battery life wasn't as good as Windows/Mac, but it's very good with the latest versions.

Proprietary software that some people need to run, now that's another issue, but for most development tasks it's fantastically manageable and accessible, a real pushback against closed systems.


I was using Ubuntu exclusively for around a decade too. It's a great development environment. But about a year ago I needed a new laptop, and with the one I purchased, some things didn't just require extra steps to get working; some of them were no-fix issues until some indeterminate date in the future.

You mention the fingerprint reader not working out of the box - with my laptop, Fingerprint GUI was basically waiting on one of its dependencies to somehow figure out how to integrate my fingerprint scanner. Things like my active-stylus capable touchscreen weren't supported, and there were no applications to really utilize it even if it was.

I switched to Windows as my primary OS when I realized Windows PowerShell had basically become on par with Linux in almost every respect and that VS Code was as cushy as I could hope for in a development environment. The only thing I've found that isn't supported out of the box is Redis, but I downloaded a ridiculously lightweight version of Ubuntu from the Microsoft Store (we're talking <1 MB memory footprint) with one click and was then good to go.

The other thing that really impressed me was all the easy to use tuning software. With ThrottleStop I was able to easily under-volt my processor to completely eliminate things like thermal throttling and improve performance all while greatly improving my battery life. Nvidia support is also way better so I can turn off my graphics card for anything but games - and then there's MSI Afterburner to under-volt my GPU when I am using it.

And yeah, not only do my fingerprint scanner and stylus work on Windows, but Windows has Windows Ink built in so I can easily take screenshots of whatever I'm doing with Snip & Sketch and annotate them with a pen in an instant, and I can use Sketchpad like an on the fly whiteboard when I need to do some math.

Plus, it has art programs like Krita that basically turn my laptop into an iPad Pro when I feel like getting artistic.

And with programs like Enpass, I can use my fingerprint with Windows Hello in place of my master password for stuff like logins and credit card information, which is a lot more secure for someone like me that does a lot of my work from coffee shops.

I still love Ubuntu, but all the offerings of Windows 10 have kind of made me a Windows fanboy and even make MacOS seem like a decisive downgrade.


I know security is hard and I like to skirt it sometimes as well, but I still think it should be pointed out that a fingerprint should not be used as a password for security-critical data. It's at best a username, as you can't change it and a motivated person can trivially steal it.

Not trying to discourage you from using it like that. It's perfectly fine as long as you realize that the fingerprint is only secure against random people on the street or just not very competent attackers... which is fine and is probably enough for most scenarios!

But now on the topic itself... the Windows Subsystem for Linux is perfectly fine for a lot of things, though there are quite a few issues. All files accessible from Windows show up with 777 permissions, for example, and a few applications have issues with that. Daemons exiting as soon as the last terminal closes is another thing many people stumble over.
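
For what it's worth, recent WSL builds can tame the 777 thing with DrvFs metadata. Roughly (a sketch; the exact mount options are to taste, and it needs a Windows 10 build new enough to support metadata):

    # run inside WSL, then restart the WSL session
    sudo tee /etc/wsl.conf >/dev/null <<'EOF'
    [automount]
    enabled = true
    # "metadata" lets chmod/chown stick on /mnt/c instead of everything showing as 777
    options = "metadata,umask=22,fmask=11"
    EOF

After that, permissions set from the Linux side actually persist, and new files get defaults derived from the umask/fmask above.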

And YMMV on the issues you mentioned. Everything you listed is completely uninteresting to me, personally. A good window manager such as i3wm offsets pretty much every bit of shininess Windows 10 has for any development purpose.

I do use Windows for everything else, though (and am sadly forced to use it at work as well).


It's sort of an after-install thing that takes a minute, but I have some keyboard shortcuts that allow me to do things like jump into a sketching program with the current clipboard contents. And Krita itself (and a few other good apps) are available on Linux.
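
Something like this is enough, bound to a key in whatever WM you use (a sketch assuming X11 with xclip installed and Krita as the sketching app; any image editor would do):

    #!/bin/sh
    # Dump the current clipboard image to a temp file and open it for annotation.
    tmp=$(mktemp --suffix=.png)
    xclip -selection clipboard -t image/png -o > "$tmp" && krita "$tmp"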

Not sure how ThrottleStop compares to the latest Linux options. I've got many containers going, three instances of VS Code, a zillion tabs, and performance is not an issue.

But, I realize not everyone cares about a free and transparent world, even when it's more or less as good. It must at least encourage companies like Microsoft to keep opening up and getting better.


> and it's really been perfectly fine, great even, as a developer

Here is the point. It absolutely is fine as a developer. But my parents would never be able to get accustomed to Ubuntu or any other distro. It was hard enough to make them use email.

The truth is that the vast majority of people just want things to work. Like turning on a TV without any setup. Hell, people pay electronics stores 100€ to plug in a cable and run the "find channels" function.

OECD studies have shown that more people than one thinks are incapable of using search in email [1].

[1] https://www.nngroup.com/articles/computer-skill-levels/


But, I wasn't talking about your parents.


That's true: some things are easier on Mac. In particular, I'd say that using software that hasn't been packaged for your distribution is much easier on Mac.

However, your example of fonts is definitely not one of those areas, anymore. Font rendering on Linux is as advanced and capable as any other OS including in the areas of kerning and hinting. It should just work without any user intervention and look great.

I agree with you that compositing/window manager fragmentation is a problem. And this article is a perfect example of that. The author may think that they're happy using i3 with Firefox and st at the moment, but the desktop computing environment has gotten so complex and the expectations of users who interact with desktop applications so rich, that a small hobby project DE/WM cannot fully satisfy all of those use cases over the long term. The only two desktop environment projects with enough resources behind them to meet the needs of users are KDE and GNOME. And we shouldn't be telling new Linux users to try anything but those two.

FWIW, I've worked at Google for 7 years where we use Linux on our workstations. Many engineers fiddle with various "hacker" window managers like i3, Sway, Awesome, or fvwm. They almost universally give up and switch to GNOME(/Cinnamon) or KDE: it's just too fiddly/not complete and they'd rather use their brain power for solving real problems. The author's messing around with dmenu because "unix philosophy" is an example of the kind of time sink that people eventually get tired of because they have better things to spend their time on.


Personally, I gave up on Gnome and run LXDE a bit like i3 - autopositioning the windows with Fx keys mapped to a given set of coordinates, so F10 will make a window use a square on the left covering 80% of the screen, F11 will move the window to the remaining 20% on the right, thus hiding conky - etc. The only advantage I find in LXDE is that it has a titlebar I can 'reveal' if I wish to mess around and fiddle with the window position (like when tracking the arping replies in a remote LAN and using a bit of scripting to see the evolution of the metric as I fiddle with things).
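
For anyone curious, that kind of Fx-key positioning needs little more than wmctrl calls bound to keys - a sketch, with the geometry assuming a 1920x1080 screen (wmctrl's -e argument is gravity,x,y,width,height):

    # F10: active window takes the left ~80% of the screen
    wmctrl -r :ACTIVE: -e 0,0,0,1536,1080
    # F11: active window takes the remaining ~20% on the right (covering conky)
    wmctrl -r :ACTIVE: -e 0,1536,0,384,1080

In a default LXDE setup the key bindings themselves go in Openbox's rc.xml.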

I like my desktop lean and mean. I do not want distractions. When I am dealing with a remote system crashing under load, the last thing I want is my desktop or my shortcuts to behave in weird ways. Things must always work, in a consistent way. Funny thing is I can only get that in Linux... and in Windows 10.

Customization is a feature, just not everyone needs that feature.

So I disagree with your assessment, as some users will find Gnome or KDE too distracting.


Some people try that, too (and XFCE). Admittedly, they're better than a tiling window manager, but LXDE's last stable release was in 2016 and it doesn't have a compositor, so all of that graphics hardware in your computer specialized to prevent X11 DAMAGE events from forcing application redraws, and to save power, is going unused.


Does using a compositor really _save_ power in practice? GNOME's certainly doesn't, in my experience. It would spin my fans up after a few moments of dragging a window around. Of course it has no configuration options, so it is not possible to fix it. KWin was almost as bad the last time I checked on my hardware.


I agree; I'm considering moving to Wayland, but for now LXDE does everything I need well enough that I haven't bothered for the last few years.


I'm curious what you find distracting in Gnome? Unless you're running in Classic mode, the newer versions got rid of everything except the top menu bar, and that is less busy than it used to be. The default launcher works very well using the keyboard only; I actually quite like it (though there are definitely other things I don't like about the Gnome defaults).


What I find distracting: the menu bar indeed, the file manager on the desktop, the title bar of the windows, the buttons on the titlebars of the windows

That doesn't leave much of gnome.


> The only advantage I find in LXDE is that it has a titlebar I can 'reveal' if I wish to mess around and fiddle with the window position

Unless I'm misunderstanding, you can use Alt + left click to move any window around, and Alt + right click to resize. No need to reveal the title bar except perhaps to read the occasional title.


Correct, I really like alt-left-click to move windows, but for some reason I prefer resizing with the title bar even if it takes one more event (keypress) to make it show up, and one more to remove it when I'm done.


Off topic, but have you looked at LXQt? It is surprisingly nice, albeit with slightly more memory usage.

I never thought I could move away from LXDE but I think I'm going to soon.


Interesting - I will consider it, thanks a lot for the suggestion! Memory use is a concern, but so are wakeups in a tickless kernel on a laptop.

At the moment I'm considering Sway mostly because of the wide use and community, but it's a "long term" project this month or the next :-)


> I agree with you that compositing/window manager fragmentation is a problem. And this article is a perfect example of that. The author may think that they're happy using i3 with Firefox and st at the moment, but the desktop computing environment has gotten so complex and the expectations of users who interact with desktop applications so rich, that a small hobby project DE/WM cannot fully satisfy all of those use cases over the long term.

Users have different needs and preferences, so I see the fragmentation as a positive thing because it gives people choice.

For me personally, xmonad, one of these hobby projects, has been perfectly sufficient for the last 10 years. I also didn't really touch my configuration much in the last ~8 years. More importantly though, I find it actually reduces my mental workload since I no longer have to handle window placement myself.


Definitely. I have had i3 on every computer I use for a couple of years now and I don't think I will ever go back to a conventional window manager.

No matter the thing I'm doing, every useful or efficient placing of windows is always no more than one or two keystrokes away.


Eh, I've been running xmonad within Gnome since (checks git blame) 2011 and I don't think I'm changing things anytime soon. Well, for the first 2 years I fiddled a lot and made sure I could just configure a computer the way I liked with a script. Occasionally I'll have an issue like Ubuntu switching to Gnome 3 but even that, the biggest disruption in years, just took an evening of fiddling to fix.


> Font rendering on Linux is as advanced and capable as any other OS including in the areas of kerning and hinting. It should just work without any user intervention and look great.

This isn't entirely true. AIUI, Apple enables LCD filtering and subpixel rendering by default, because it knows that you're using an LCD and what the subpixel order is. On Linux, however, these are usually toggleable via the GUI, and even without them it usually still looks fine.
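
On the Linux side those knobs live in fontconfig if your desktop's GUI doesn't expose them; a minimal sketch, assuming a horizontal-RGB panel:

    mkdir -p ~/.config/fontconfig
    cat > ~/.config/fontconfig/fonts.conf <<'EOF'
    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <match target="font">
        <!-- subpixel order: rgb here; bgr/vrgb/vbgr on other panels -->
        <edit name="rgba" mode="assign"><const>rgb</const></edit>
        <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="hinting" mode="assign"><bool>true</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
      </match>
    </fontconfig>
    EOF

Newly started applications pick it up; no need to log out on most setups.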


Since Mojave, macOS no longer does subpixel rendering.

https://mjtsai.com/blog/2018/07/13/macos-10-14-mojave-remove...


On non-retina displays, which... is nowhere near the normal use case for Macs at this point.


On all displays. It matters less on Retina displays.

External displays are nowhere near the normal use case?


No, my point is that this change only affects non-Retina displays. In my experience (worldwide, many companies, etc) anyone who uses a Mac with an external monitor generally doesn't settle for some POS. It's a high-end screen that matches the MacBook, hence why it's not that big of a deal.


The change affects all displays.

The highest supported resolution on a MacBook is still scaled down. Only a couple of expensive LG displays match the actual density as far as I know.

The MacBook Air only got a Retina display a few months ago. The low-end iMac is still 1080p.


People forget that subpixel rendering was a thing on CRTs before LCDs were around. Subpixel order is pretty universally standardized. Is there an EDID data element for pixel order?


I didn't know that! How does sub-pixel rendering work on CRTs, which (to my understanding) don't have a set matrix of pixels and subpixels?


It worked like shit, blurring perfectly fine text. At least the last time I had a CRT, which was the very early '00s.


I agreed with everything you said except this:

> but there are still things that are just going to be less painful on Apple.

Of course OSX is more user friendly now, but the Linux desktop has improved by leaps and bounds. Four years ago you needed to be a developer or extremely savvy to run the average desktop distro; now I would say you just need to be tech savvy. I would argue that for the ho-hum business cases (not extreme use case profiles like design and video production) Ubuntu won't cause any unnavigable issues.

I think in a few years you will see the Linux diaries continue in popularity, especially among developers. Laptops have become commodity items. There just isn’t that much that differentiates (for me at least) a MacBook from a good ThinkPad.


People have been saying that ever since, uhm, the first Mandrake release? "It's so much easier now, soon it will all be so smooth!" Except every year there is something else to rewrite/rejig.

As soon as auto-configuring XFree86 was kinda figured out, out goes XFree86 and in comes X.org. X.org getting to the point where having 3D animations doesn't require kernel-module-config expertise? Out goes X.org, in comes Wayland. GNOME 2 worked out the kinks? Time for Unity! KDE 4 finally getting snappy? Time to break it up! Init systems figured out? systemd! ALSA getting adoption? PulseAudio! PulseAudio finally working? Let's rip it out! And so on and so forth, in an endless churn.

Now, this sort of churn also happens in commercial alternatives; but stuff gets shipped when it’s 99.9% working, left running for years (or decades, if from Microsoft), then maybe gets rewritten with something that must be better (no regressions) or it won’t even ship. In the Linux world, it’s all just thrown over the wall; maybe you’ll be lucky and it will work on your machine, and maybe it won’t. By the time it gets fixed, it will be time to replace it. And so the experience is a perennial struggle against half-finished, unpolished software.


Yes, the churn is an issue, especially in desktop environments, but it's been relatively painless all things considered.

I don't think there has been a single inflection point; for me there has been steady incremental improvement.

If you want to think about how far things have come: I started using Linux in 2001 with Mandrake, around the 2.2 -> 2.4 kernel switch. So much has changed since the bad old days. I don't want to throw out a "back in my day we walked uphill in snow both ways" style rant but...

- All we had was ext2 and we liked it...

- You had to manually configure modelines for your video card; changing display resolution was more or less a crapshoot.

- Apps would exclusively lock the sound card, which typically meant the first thing you opened would be the only thing capable of playing sound. But you could pipe things to /dev/dsp and have the speaker emit random beeps, which was kind of cool.

- The window manager used to crash a lot and you'd lose all the title bars for all the windows; this happened fairly often. Ctrl-Alt-Backspace is still in my muscle memory years later.

- Printers were basically impossible to configure.

People complain about changes like ALSA, PulseAudio, etc., but I think a lot of rose-tinted glasses are being applied to how things were before. Sure, some things aren't perfect, but neither were their predecessors, and on the whole they fixed more things than they broke.


I am not saying things aren't better than they were; but they are still not as good or polished as the commercial counterparts (which obviously didn't stay still), and looking at the overall trend, they will likely never be - because of innate problems with the development model (release early and often, even if it's basically unusable).

So I can agree that “Linux will get smoother”, because progress is more or less inevitable, but “Linux will be as smooth as [Windows|MacOS]”, as upthread implied? Never going to happen.


The reality is, people get paid more to develop for the Mac world and consequently put more effort in, whereas Linux is still mainly volunteer-driven.


> not extreme use case profiles like design and video production

There are a lot of use case profiles that are very ho hum for which Linux is a non-starter. I'm thinking specifically of basically any time you need to use Mac-only software. Likewise, I'm sure there's Linux-only software that would make OSX a non-starter.

IMHO, all of the major modern operating systems are good enough and have been for a while now. Pick the applications you want then find the OS that best supports those applications.

It would be nice if the Purism guys would open some retail stores. Once you can walk into some place and get help, it gets to be a lot easier to recommend those machines to less tech-savvy relatives.


Pretty much this; as a UI-focused developer I abandoned my Linux zealotry and went back to Windows.

Yes, Hollywood is pretty much sold on Maya and Houdini on Linux, but they use their own in-house distributions and have no issues dealing with binary blobs for performance.

Meanwhile my Asus netbook sold with Linux still can't do hardware video decoding or OpenGL 4, in spite of DirectX 11-class hardware, because AMD decided to reboot their driver development.


I see lots of advice online to just use the kernel driver for AMD cards these days, but in my experience it is slow. On some of my machines it benchmarks slower than the Intel graphics. In other cases it is faster, but only by about 50% or so. It is disappointing. I still tend to prefer nVidia graphics when I'm building my own systems because even if the driver is a big binary blob it does work.
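
(Tangential, but when benchmarking like this it's worth double-checking which GPU and driver are actually in use - a quick sketch, assuming mesa-utils and pciutils are installed:)

    # which driver/GPU OpenGL is actually using right now
    glxinfo -B | grep -E 'OpenGL (vendor|renderer|version)'
    # which kernel driver is bound to each GPU
    lspci -k | grep -EA3 'VGA|3D'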


I don't care much about performance because I mostly play strategy games; my issue with the kernel driver is its instability: many games will hard-crash my system.


I would also say that Apple has plenty of rough spots that Linux doesn't as well.

Font rendering the Apple way is a style choice at best.

What does "display/compositor fragmentation" mean to end-user?


> Font rendering the Apple way is a style choice at best.

Given how much of a hacker's usage of a computer is working with text (reading pages, writing code and documentation, taking notes) it's a style choice that actually has a significant impact.

(By the way, macOS Mojave deprecates subpixel antialiasing — a poor decision when there are many non-hidpi displays still being used)


I've used both macbooks and linux and have never noticed a difference, so I have no clue what sort of impact I'm supposed to be noticing.


I remember a decade ago, when there was more discussion about it, the rough situation was: OS X fonts were "blurry" because the font rendering was optimized to stay true to the font shape. Microsoft was "crisp but kerning broken" because the font rendering optimized for pixel alignment, so fewer grey pixels but shapes moved slightly to fit pixel borders. Linux was somewhere in between, however you configured it.

That was the time of sub-100dpi screens though. With 300dpi (Retina) screens these days it does not matter much anymore.


Yeah. I've got a 1440p 27" monitor that I connect via HDMI (USB-C adapter dock thingy) and the difference when I upgraded to Mojave was very stark. I actually prefer Windows rendering now to Mac.


> What does "display/compositor fragmentation" mean to end-user?

My 1Password Firefox extension (Ubuntu 18.04 LTS) doesn't work on X unless I log in through the website, but it works just fine on Wayland.


> My 1Password Firefox extension (Ubuntu 18.04 LTS) doesn't work on X unless I log in through the website, but it works just fine on Wayland.

Hey there! Beyer from 1Password here. It sounds like your issue might be related to my post here: https://goo.gl/cdhFbz

The good news is the underlying bug that was "breaking password fields in Firefox" was recently resolved. You can read about it here: https://goo.gl/uFv5rL

If you are still having an issue using 1Password X after updating gnome-shell, please reach out to us at support+x@1password.com, and we'd be happy to help!

Thanks for using 1Password!


^^ Things exactly like this.

The practical implication for the end user is that companies/entities that write any GUI-enabled software for Linux are forced to make decisions about what (if any) configurations they'll support.

And just like that, we've waded into "cracking open the window manager" just to figure out what's going on.


What is your X session? GNOME/KDE or a plain window manager? It might be that your WM is not launching some service that the Wayland session does and that 1Password needs.


The fact that you're asking this question and suggesting that root cause is itself a significant part of the problem the OP was talking about.


Unfortunately, the openness allows for so much flexibility that many people end up breaking things. Invariably, they will blame the system and not their changes. If they used the defaults, as they do on other systems, it would not happen.


In this case, though, LastPass is broken in Firefox on Ubuntu by default.

Edit: akiselev said 1Password, not LastPass. My mistake. Interesting that I experienced this same issue with LastPass.



This was a clean install, so GNOME Shell - it worked just fine on Ubuntu 17.10 and 18.10 with both X and Wayland.

After I log in through the 1Password website, the extension window opens just fine, so it might be some input security service, but how would a Firefox extension even have access to a system service like that except through Firefox's built-in APIs?


I don't know how 1Password works; I'm using KeePassXC.

With KeePassXC you have a native application and an extension that communicate via a socket. The native application can use whatever native APIs it wants.

However, back to your case. What's weirder: if it is a pure Firefox WebExtension, Firefox (still, by default) launches as an X11 application under Wayland, so the extension should have no way to know the difference.


That being said, it's still got rough spots that OSX doesn't. It works great when it works, but when it doesn't....

Someone really needs to make Linux distros just work when hooking a laptop up to a projector. The fact that there's so much trouble, so publicly and with so much concentrated embarrassment, is a serious ongoing PR impediment to Linux.

Linux distros got printing licked across the board, so it can happen.

How about a Linux version of Airdrop?


Last time I hooked up my Linux laptop to a projector it fired up the screen just fine. I didn't even have to hit some magic key combo to turn it on.


The time before last that I went to a Golang meetup, the presenter's Linux laptop embarrassed him for a few minutes, and lots of devs with Macbooks made some Linux jokes at his expense.

This was the story with printing in Linux as well, back in the day. It would work fine for some, and be a nightmare for others.


This is definitely a case where the details matter. The guy who runs Slackware on some no-name laptop is probably going to struggle more than the guy running Ubuntu on the Dell.


It was Bionic Beaver. He was a well heeled SV manager, and he had a fairly new laptop. Not sure if it was a Dell.

This also happened with printing. People would say it worked fine for them, then point out that scroungers on quirky old laptops were getting what they deserved. Really, the fault wasn't with those quirky old laptops, but rather with fragile and not-so-well-standardized software.


This is rarely (but entertainingly when it happens) a thing - I once waltzed into a job interview and fought xrandr for like 5 minutes to demo my take-home project: turns out mirroring my 1080p built-in panel to a 4k HDMI screen with herbstluftwm/xrandr command line hackery _only_ was... non-trivial. Thus, for the whole 13 months I worked there, I got to share several laughs about how "it's the year of the Linux desktop, yeah?" Embarrassing at first, but life moves on.
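
(For the record, the incantation that kind of mirroring eventually needs looks roughly like this - a sketch; eDP-1/HDMI-1 are placeholders, check the output of xrandr for the real names:)

    # mirror a 1080p panel onto a 4K external display by scaling the 1080p image up
    xrandr --output HDMI-1 --mode 3840x2160 --same-as eDP-1 --scale-from 1920x1080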

(fwiw the sway 1.0 betas, which are more or less i3 ported to Wayland, handle this type of case beautifully, and unless my kernel does Strange Things and panics, I basically plug and play into whatever I want, much like one would expect from the GNOME/KDE experience)


I got to share several laughs about how "it's the year of the Linux desktop, yeah?"

The amount of time that joke's been around should be taken as a sign of a persistent condition.

unless my kernel does Strange Things and panics, I basically plug and play into whatever I want

The kernel doing strange things and panicking during plug and play was never a thing on OS X when I started using it. Seems to me it's not been a thing for Windows since before Windows 7.

It's long been speculated that one problem with Linux is cultural. Are Linux desktops trapped by cultural expectations?


The only piece of this I'm going to engage on is the kernel panic bit, and only as far as a counter-anecdote (so take it as you will), but I had almost as many kernel panics on a 2016 MacBook Pro over a bit more than a year of using it at a past job. In fact, almost exactly the same scenarios that would panic Linux on an XPS 13 would panic the MacBook: plugging in external displays through USB-C/TB3 ports. Probably double or triple the frequency if those displays were connected to a full docking station - probably something odd in PCIe code, potentially even at the firmware level.

The only OS I haven't had kernel panics on in recent memory is, indeed, Windows starting with 7, but I have a long list of other reasons I can't/don't/won't use it as my daily driver.


I had almost as many kernel panics on a 2016 MacBook Pro over a bit more than a year of using it

I'm still rocking my 2012 Macbook Pro. It's been solid.

The only OS I haven't had kernel panics on in recent memory is, indeed, Windows starting with 7, but I have a long list of other reasons I can't/don't/won't use it as my daily driver.

I can only stand to do my development on a Windows machine by using a Linux VM.


This hasn't been an area where Linux has had problems for more than 5 years.


This hasn't been an area where Linux has had problems for more than 5 years.

Really? Because I definitely got the impression it was a widespread joke in that circle of devs. Also, when I saw it happen to the last poor sap, it happened on Bionic Beaver. Is this more of the Thermocline of Truth?


I think dual-headed (integrated and discrete GPU) laptops using Wayland are still a problem for some people. I don't know why. I gave up on dual-headed laptops a while ago and haven't had a problem with multiple displays using i915 graphics and Wayland.


In Linux's defense, even the corporate-issued Dell laptops at my work running Windows 10 struggle with projectors. I see it daily in meetings: people have to randomly mash the function key until the display gets mirrored to the projector properly, or plug and unplug the projector repeatedly until it works. People hit fullscreen in PowerPoint, the projector display just disappears, and they have to go through the plug/unplug dance again.

From everything I've seen a projector "just working" seems to be the exception rather than the rule.


From everything I've seen a projector "just working" seems to be the exception rather than the rule.

My experience with Macbooks has been excellent. Even Airplay over an Apple TV just does the right thing with presenter view in Keynote just working. The one time I had a rotten experience on a Macbook was when I had to use Zoom.


I honestly feel like I've seen this same comment on every Linux workstation post for years now.

And you know what? It's mostly true [1], but it's also true for me in reverse.

Mac OS has definitely improved in the last few years... that being said, it's still got rough spots that Fedora 29 doesn't. For me, this would be just as true of a statement.

[1] Maybe not the part about "had to download[...]" which felt a bit biased.


>Font rendering, display/compositor fragmentation etc...

Everyone has their own view of rough spots but those aren't mine.

The rough spots I'm experiencing with Linux are very specific. Namely, a lack of support for some proprietary VPN solutions, like Junos Pulse.

Also a lack of native clients for software like Webex Teams, forcing you to use their web apps, which use up so many resources that I'm convinced they've caused my laptop to stall a couple of times.

And of course, perhaps related to the issue above, anything relating to graphics does need work.

The major positive thing I can say about using Linux daily in work and personal life is that it works so well that when it fails you get very annoyed. That's a good sign. It means that it's rare enough to annoy me. If it was too common I wouldn't be surprised when it fails.

I also switched back from Mac to Linux two years ago.

But I use vanilla Gnome 3 on Fedora. Before Mac I used tiling window managers, but now I don't see the point. It's just so much configuration to handle, which Gnome does without a single line of config or shell code.


Ubuntu and openSUSE are my main installs, and they usually feel fine; my only issue is always drivers, whether Wi-Fi or graphics (the worse of the two, since for Wi-Fi I at least have ways to work around it - buy a compatible tiny USB wireless adapter, for example, or connect a router to the network and connect my device over LAN). If somebody could either make a properly open GPU or get AMD or Nvidia to release proper drivers that don't break between versions, I'd be happy. Nvidia's drivers used to work on my one laptop; now they've broken something in the latest incarnation of the driver, that or it's just not compatible with Ubuntu's new X.org alternative. Whatever the case, it seems too fragile. If I can't install a distro due to GPU issues, I usually just move on and look for a compatible one.

As for the font thing, maybe I'm not a fontphile or something, but they're usually fine for me.


Linux is just consistent. When it works, it works. When it doesn't - god help you.

There's no rhyme or reason.

Sometimes, my wifi is broken. On a different distro, it's not. Right now, under Ubuntu 18.10, my resume from suspend is broken - it wasn't on 18.04.

I have the official Linux laptop, basically - a Dell XPS 9350, all 100% Intel hardware, no binary blobs.


The 9370 is an even more fun can of worms - resuming from suspend will almost always work (I remember the 9350 having massive problems with this, it's not just you - I ended up selling that rig to a friend), but there's a decent chance I'll kernel panic upon plugging in my Type-C dock after said resume.

The Killer Wi-Fi card has none of the connectivity problems I read about almost everywhere for this model.... but has all the throughput problems, running at about 15% of the throughput/bandwidth of my Chromebook (an Intel AC card) sitting right next to it as of the last time I tested.

I'd definitely not call Linux consistent, but it's better than it was when I started using it back in 2006-2007ish, and I wouldn't trade it for any other setup (and it's not for a lack of trying: my primary work machine for 2.5ish of the last 6 years was a Macbook Pro of some sort for one reason or another)


I've also found that on my XPS 13 with 18.10. Was fine previously on 16.04. Now I have to open and shut the lid a few times to get it to stop flickering. It's annoying and would otherwise be perfect.


Ah yes, that. That's an (old) bug in how logind interacts with GDM. A quick workaround is Ctrl-Alt-Fn-F1 to force the screen to lock itself; makes the flicker go away.


That trick worked perfectly. Thanks very much.


For me, Linux has improved a lot because Windows has gotten a lot worse, but the Linux distributions I've used have not improved that much (mainly Fedora and Ubuntu). I've been using Linux exclusively for many years and will probably continue to do so though...


I fell "in love" with the Budgie desktop experience, which is a project of Solus Linux. However, I didn't like the Solus Linux package system; I wanted the Debian plethora of packages. So I "found" Budgie Ubuntu. Best of both worlds, IMHO. It's the closest I've gotten to a Mac on Linux. It still has some rough edges, but nothing that really sticks out for day-to-day work and play.


If anything, font rendering in Linux is much better than in OSX. It does way more advanced auto-hinting and LCD-subpixel rendering, supports the newest font standards for things like advanced layouts, colored fonts and emojis, etc. etc. There's also very little display fragmentation other than the choice between Xorg and Wayland, which are largely complementary so far. But you can already be on Wayland-only in many cases.


>Inb4 the anecdotal "well it works for me I just had to download the xf86 font library and compile with a legacy glibc version..." crew comes in with a thousand and one rebuttals. Problems like that are still a suboptimal user experience, no matter how you slice it.

No, my experience is much more along the lines of, "I download and install Ubuntu or Linux Mint, and then it works without further issues."


I put Ubuntu MATE on an old 2009 MacBook because it couldn't function with macOS anymore. A Wi-Fi driver was the only thing it needed, which ended up being trivial to install. Everything else worked out of the box. Two-finger gestures on the trackpad, audio, everything.

My current Dell Inspiron 7000 didn't need a single driver installed manually (with Ubuntu). Pretty impressive.


Font rendering is a funny one. Some distros were limited by patent issues, but Ubuntu has looked great for a while. Now Apple removed subpixel anti-aliasing in 10.14. This may look better on hidpi, but lots of us still have non-retina Macs


Ubuntu and its derivatives have pretty good out-of-the-box font rendering, IMO.


> It works great when it works, but when it doesn't....

Yeah, that goes for MacOS and virtually every other piece of technology.


> Font rendering, display/compositor fragmentation etc...

So take a user from Windows 10 or OSX whatever, and sit them in front of your favorite stable Linux distro running a terminal emulator and a browser of your choice.

Are the fonts going to be rendered in a way that is unobtrusive for those users, or will things look ugly and difficult to read?

Because I'm running a chroot of Debian Buster on my Chromebook and boy, that terminal sure does look blurry. (And is this a bug that will be fixed when Buster stabilizes, or am I expected to go read some font wiki on a different machine to "guess-and-check" it back to sanity?)

I don't want to switch to OSX. But I also understand why someone wouldn't trust the UX of a system that ships in "headache-mode" by default.

Edit: clarification



