Interestingly, I've been using a Windows, a Linux, and a macOS machine for many months, swapping between them often. Windows and Linux run on the same Ryzen 2700X with a GeForce 1080 Ti; the Mac is a 2018 Mini with 32 GB RAM and a 6-core i5. Each drives a 32" 4K display.
What surprised me:
- Windows UI is WAY the fastest. Linux is the slowest, and with fractional scaling turned off it's hardly tolerable
- the font rendering on Windows is perfect, while macOS is a bit blurry. This really surprised me given all the hype around macOS's scaling. Windows HiDPI fonts are just perfectly sharp
- macOS is absolutely consistent when it comes to rendering and scaling; the others aren't
It's blurry because macOS uses greyscale anti-aliasing and makes no attempt to snap glyph outlines to the pixel grid. Windows does grid-fit (hint) them, which is why its text looks sharper.
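To make the grid-fitting point concrete, here's a rough sketch using FreeType (the rasterizer Linux uses; CoreText on macOS is a different engine, but the idea is the same). It loads the same glyph once with hinting and once with FT_LOAD_NO_HINTING; the font path is just a placeholder:

```cpp
// Rough sketch: compare a hinted (grid-fitted) and an unhinted glyph load.
// Link against FreeType; "DejaVuSans.ttf" is only an example path.
#include <ft2build.h>
#include FT_FREETYPE_H
#include <cstdio>

int main() {
    FT_Library lib;
    FT_Face face;
    if (FT_Init_FreeType(&lib) || FT_New_Face(lib, "DejaVuSans.ttf", 0, &face))
        return 1;
    FT_Set_Pixel_Sizes(face, 0, 16);  // 16 px nominal size

    // Hinted load: outlines (and usually advances) get snapped to whole pixels.
    FT_Load_Char(face, 'e', FT_LOAD_RENDER);
    std::printf("hinted   advance: %ld (1/64 px units)\n", face->glyph->advance.x);

    // Unhinted load: fractional positions are kept, roughly what macOS does.
    FT_Load_Char(face, 'e', FT_LOAD_RENDER | FT_LOAD_NO_HINTING);
    std::printf("unhinted advance: %ld (1/64 px units)\n", face->glyph->advance.x);

    FT_Done_Face(face);
    FT_Done_FreeType(lib);
    return 0;
}
```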
Personally, I've turned anti-aliasing off on my PC (on a high-resolution display it's a pretty useless feature anyway). What surprises me is that as we've moved to higher-resolution displays, the OS makers have been making it HARDER to turn off text anti-aliasing. That includes newer versions of Qt, which apparently enable it themselves, so Qt programs now pick up anti-aliasing even if your PC has it turned off (see the sketch below for how an individual app can opt out).
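For what it's worth, a Qt application can still ask for non-antialiased text itself via QFont::NoAntialias. Rough sketch assuming Qt Widgets (Qt 5 or 6); whether the platform's font backend actually honours the flag is another matter:

```cpp
// Rough sketch: force non-antialiased text as the default font for a whole
// Qt Widgets application.
#include <QApplication>
#include <QFont>
#include <QLabel>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    QFont font = app.font();
    font.setStyleStrategy(QFont::NoAntialias);  // request aliased (sharp) glyphs
    app.setFont(font);                          // apply as the application default

    QLabel label("Hopefully crisp, non-antialiased text");
    label.show();
    return app.exec();
}
```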
People go on about how they love their blurry fonts. I don't get it. I like crisp and sharp.
PS. The font you use, and even the version of that font, also makes a difference. A number of fonts were made in an era when anti-aliasing wasn't as common, so they're hinted to look good without AA. I've also run into cases where an older version of a font works great with AA off while the newer version doesn't, because something got broken or the hinting was removed in the update. So if you have a really good font, back it up.
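If you want to check whether a newer release of a font dropped its hinting, here's a rough sketch using FreeType's FT_Load_Sfnt_Table to look for the TrueType 'fpgm'/'prep' bytecode tables (most hinted TrueType fonts carry at least one of them); the font path comes from the command line:

```cpp
// Rough sketch: report whether a TrueType font file still ships hinting bytecode.
#include <ft2build.h>
#include FT_FREETYPE_H
#include FT_TRUETYPE_TABLES_H
#include <cstdio>

static bool has_table(FT_Face face, FT_ULong tag) {
    FT_ULong len = 0;
    // With a NULL buffer, FT_Load_Sfnt_Table only reports the table length.
    return FT_Load_Sfnt_Table(face, tag, 0, nullptr, &len) == 0 && len > 0;
}

int main(int argc, char *argv[]) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: %s font.ttf\n", argv[0]);
        return 1;
    }
    FT_Library lib;
    FT_Face face;
    if (FT_Init_FreeType(&lib) || FT_New_Face(lib, argv[1], 0, &face))
        return 1;

    bool hinted = has_table(face, FT_MAKE_TAG('f', 'p', 'g', 'm')) ||
                  has_table(face, FT_MAKE_TAG('p', 'r', 'e', 'p'));
    std::printf("%s: %s\n", argv[1],
                hinted ? "has hinting bytecode" : "no hinting bytecode");

    FT_Done_Face(face);
    FT_Done_FreeType(lib);
    return 0;
}
```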
I have a 4K display on a Mac and I decided to go with 2x scaling ("looks like 1920x1080", four physical pixels per logical pixel) so that macOS renders at an exact Retina ratio. Much sharper. My colleagues go with a scaled resolution around 2500 pixels wide, where macOS has to render at double that and downsample to the panel's native 3840x2160, and I don't understand how they can stand it.