> Or try out Thinkpads or XPSes with some flavour of Linux but I don't have high hopes for their touchpads and screens / HiDPI software support.

Touchpads generally work perfectly out of the box with any Linux distribution that uses libinput instead of one of the older drivers. HiDPI screens also tend to work well out of the box, though you may wish to tweak them to your taste. (For instance, I don't want to treat a 1440p screen as 2x 720p; I want to treat it as 1440p with only slightly enlarged fonts.)
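As a quick sanity check that libinput is actually driving the touchpad (a sketch; assumes your distribution ships the libinput CLI tools, and the grep is just to narrow the output):

    # List input devices and confirm the touchpad is handled by libinput
    # (reading the event devices usually requires root)
    sudo libinput list-devices | grep -i -A 5 touchpad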



I'm curious why Apple's approach of drawing things at 2x or 3x and then downscaling the whole screen hasn't caught on in the Linux world. It seems like scaling the result (a single buffer) should be extremely easy? Laptops and screens targeted at Windows 10 mostly have resolutions that are awkward at both 1x and 2x.


Linux desktop environments do use the 2x/3x scale approach for high-DPI by default. If GNOME detects a high-DPI display, it'll render everything scaled larger by an integer factor.

1440p is the one case that integer scaling doesn't handle well, because you don't typically want it to scale 2x (showing the same amount of information as a 720p screen); you want it to scale roughly 1.33x (showing the same amount of information as a 1080p screen). So, on a 1440p screen, I disable the automatic high-DPI scaling (which does make the display readable, it just shows less information than I'd like) and use other mechanisms to scale by roughly 1.3x.
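On GNOME, for instance, that combination looks roughly like this (a sketch; the keys are from GNOME's org.gnome.desktop.interface schema, and 1.3 is just my preferred factor):

    # Keep UI rendering at 1x instead of the auto-detected 2x
    gsettings set org.gnome.desktop.interface scaling-factor 1
    # Enlarge fonts by ~1.3x so text stays readable at native 1440p
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.3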


Linux desktops do the hard part out of the box: scaling everything by an integer factor. But then there is no built-in way to downscale the result to a usable size, which should be relatively easy?

For example, if you need more space on a 5K iMac than the default 1440p@2x, you can have macOS render everything at 1800p@2x off-screen (a 6400x3600 buffer), then downscale it to fill the 5120x2880 screen.

The UI looks like this: https://support.apple.com/en-us/HT202471

I think it can be done with xrandr, but IMHO it should be a built-in option in GTK-based desktops too.
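Something in that spirit is possible today on X11 (a sketch; eDP-1 is a placeholder output name, and the factors depend on your panel): with the desktop already rendering at 2x, xrandr can render an oversized framebuffer and downscale it onto the panel:

    # Panel is 2560x1440; with the UI rendered at 2x, this draws a
    # 3840x2160 desktop and downscales it to the panel, for an
    # effective UI scale of 2 / 1.5 = ~1.33x (macOS-style)
    xrandr --output eDP-1 --scale 1.5x1.5

The tradeoff is the same as on macOS: everything, fonts included, goes through a final resampling pass.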


Ah, I see.

I don't think that would work well with Linux expectations for rendering, such as crisp edges for fonts; if you like Apple-style rendering of fonts and other UI elements, it may work better.

I'd rather see everything rendered for the display resolution I have, and just scaled at a non-integer factor.
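For what it's worth, GNOME on Wayland has an experimental toggle in that direction (a sketch; the flag is experimental and its behavior has changed across releases):

    # Expose fractional scale factors (125%, 150%, ...) in
    # Settings -> Displays under GNOME on Wayland
    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"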



