
Not the original commenter, but it... has a certain look to it. It's not something that can be totally solved by the OS or any scaling algorithm, because fundamentally you have to render fractional portions of a pixel to a physical screen where there is no such thing as a fractional pixel. It just always looks different from, and IMO worse than, integer scaling.

You have to trade off distortion against blurriness, and you can't avoid both of those artifacts at the same time. The higher your display's PPI, though, the less noticeable any scaling artifacts will be.
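A toy sketch of that tradeoff (my own hypothetical numbers, not anything specific to a real toolkit): at a 1.5x scale you can either snap element edges to whole physical pixels, which makes nominally equal elements come out unequal, or keep the fractional edges and resample, which smears a crisp boundary across two pixels.

```python
# Illustrative only: three UI elements, each 21 logical px wide, at 1.5x scale.
SCALE = 1.5
edges = [0, 21, 42, 63]  # logical x-coordinates of element boundaries

# Option A: snap each edge to the nearest physical pixel ("distortion").
snapped = [round(x * SCALE) for x in edges]
widths = [b - a for a, b in zip(snapped, snapped[1:])]
print(widths)  # [32, 31, 31] -- equal logical widths become unequal physical widths

# Option B: keep fractional edges and resample ("blur").
# An edge at physical x = 31.5 covers half of pixel 31 and half of pixel 32,
# so both pixels get an intermediate value instead of a sharp boundary.
edge = 21 * SCALE            # 31.5
left_cover = 1 - (edge % 1)  # 0.5 of pixel 31
right_bleed = edge % 1       # 0.5 bleeds into pixel 32
print(edge, left_cover, right_bleed)
```

Either way you pay: option A shifts and resizes things by up to a pixel, option B softens every edge that doesn't land exactly on a physical pixel.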



I don't understand why 'distortion' (things being rendered ± 1 pixel over on some screens vs. on others) is a problem unless your GUI frameworks and/or apps count on pixel-exact layouts for some UI elements. But why would they? Isn't the entire web built on a reflowable format that works pretty well? Shouldn't those tiny 1-pixel differences be like the easiest possible variation for a GUI system's layout engine to cope with?

Do we have lots of scalable UI elements that expect to line up with raster images a certain way on most operating systems?


For me, a 1 pixel asymmetry in a button at medium PPI is noticeable and mildly distracting. I don't mind it much for UI, but I understand why others would want to avoid it. I tend to set my UI and toolbars to auto-hide. Personally, the main reasons I avoid it are text rendering and gaming.
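To show how that kind of asymmetry arises (again, hypothetical numbers purely for illustration): a button with 1-logical-px borders drawn at 1.5x can end up with one border covering two physical pixels and the other covering one, because the two edges land on different fractional positions before snapping.

```python
# Hypothetical: a button with 1-logical-px borders, rendered at 1.5x scale.
SCALE = 1.5

def physical_thickness(logical_start, logical_size=1):
    """Physical pixels a 1-logical-px border covers after edge snapping."""
    start = round(logical_start * SCALE)
    end = round((logical_start + logical_size) * SCALE)
    return end - start

# Button spanning logical x = 0..22, with a 1 px border on each side.
print(physical_thickness(0))   # left border  -> 2 physical px
print(physical_thickness(21))  # right border -> 1 physical px
# The same nominal 1 px border renders thicker on one side than the other,
# which is exactly the kind of asymmetry that's visible at medium PPI.
```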


This post has more information, as well as tons of other related info: https://tonsky.me/blog/monitors/


This was really helpful, thanks!

It's a shame (for me, who can't afford HiDPI displays to replace his current ones, and would have difficulty pushing all those pixels even if he could) that Apple removed subpixel anti-aliasing. :(



