
Linux desktop environments do use the 2x/3x scale approach for high-DPI by default. If GNOME detects a high-DPI display, it'll render everything scaled larger by an integer factor.

1440p represents the one case that integer scaling doesn't quite handle right, because you don't typically want it to scale 2x (showing the same amount of information as a 720p screen), you want it to scale roughly 1.33x (showing the same amount of information as a 1080p screen). So, on a 1440p screen, I disable the automatic high-DPI scaling (which does make the display readable, it just shows less information than I'd like), and use other mechanisms to scale by 1.3x.
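For concreteness, here's the arithmetic behind that "roughly 1.33x" figure, using the resolutions mentioned above (2560x1440 panel, 1920x1080 worth of content):

```shell
# A 2560x1440 panel showing the same amount of content as a
# 1920x1080 one needs a fractional scale of 2560/1920:
awk 'BEGIN { printf "%.3f\n", 2560/1920 }'   # → 1.333
```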



Linux desktops do the hard part out of the box: scaling everything by an integer factor. But there's no built-in way to downscale the result to a usable size, even though that should be the relatively easy part.

For example, if you need more space on a 5K iMac than the default 1440p@2x, you can have macOS render everything at 1800p@2x off-screen, then downscale it to fill the screen.
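To spell out the numbers in that example (a 5K iMac panel is 5120x2880; "looks like 2880x1800" renders at 2x off-screen, then downscales to the panel):

```shell
# Off-screen buffer for 1800p@2x:
awk 'BEGIN { printf "%dx%d\n", 2880*2, 1800*2 }'   # → 5760x3600
# Downscale factor needed to fit the 5120-wide panel:
awk 'BEGIN { printf "%.3f\n", 5760/5120 }'          # → 1.125
```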

The UI looks like this: https://support.apple.com/en-us/HT202471

I think it can be done with xrandr, but IMHO it should be a built-in option in GTK-based desktops too.
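A sketch of the xrandr approach, assuming an X11 session and an output named eDP-1 (run plain `xrandr` to find your actual output name):

```shell
# Render the desktop ~1.33x larger than the panel, then let X map
# the result onto the physical resolution. Effectively gives a
# 1440p screen the usable area of ~1920x1080 scaled content:
xrandr --output eDP-1 --scale 1.33x1.33
```

Note this scales after rendering, so output can be slightly soft, which is exactly the trade-off discussed in the reply below.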


Ah, I see.

I don't think that would work well with Linux expectations for rendering, such as crisp edges for fonts; if you like Apple-style rendering of fonts and other UI elements, it may work better.

I'd rather see everything rendered for the display resolution I have, and just scaled at a non-integer factor.



