
There are generally TWO things going on, but we use one term for both. I think that’s because displays that do one generally do the other, so it’s just easier.

One is that HDR displays can usually get MUCH brighter. An iPhone 5 could do 500 nits. An iPhone 14 Pro can do 1,000 - 2,000 nits.

That’s what’s used here. You get all the same colors as before, but max brightness is way brighter.

The other thing is color depth, which usually comes paired with a wider color gamut. The standard we had for a long time was 24-bit color, i.e. 8 bits per channel, so ~16.7 million colors. Screens may now have 10 or 12 bits per channel instead. I can’t find exactly how many a modern iPhone has.

This means there are more shades between black and 100% bright red, and more variations between blueish green and greenish blue. Gradients can be smoother. Objects that are mostly one color (yellow corn, a red apple, etc.) now have more options they can use to provide definition and detail.
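
To put rough numbers on that (plain back-of-the-envelope arithmetic in Python, nothing display-specific):

    # Shades per channel and total color count at each bit depth.
    for bits in (8, 10, 12):
        per_channel = 2 ** bits       # steps from black to full intensity
        total = per_channel ** 3      # R x G x B combinations
        print(f"{bits}-bit: {per_channel:,} shades/channel, {total:,} colors")

Going from 8 to 10 bits quadruples the number of steps in every gradient (256 -> 1,024 per channel), which is where the smoother gradients come from.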

In a very dark scene, there are now more dark shades to work with. Bright clouds don’t have to be rendered as flat, washed-out white.

And combined with the increased brightness, a scene can show definition in both dark and bright areas without washing everything out.




To clarify slightly - the max brightness is the real feature, and everything else, including more bits per channel, is a bonus that lets dimmer or brighter colours be described in more detail than was possible before. More bits per channel means greater precision, but the real game changer is the expanded range: more detail in dim scenes, more detail in bright scenes, and, very key for the HDR effect, very bright and very dim content displayed next to each other.

As a side effect, much like a movie theatre saving the seat-rattling bass for the moments that call for it, there’s a visible brightness difference between “normal” sRGB content and brighter (or dimmer) than usual P3 content. This QR code trick exploits that difference.
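
For a concrete sense of how that expanded range gets signaled, here’s a minimal sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 uses to map absolute brightness to a signal value; the constants are from the standard, the nit values are just illustrative:

    # PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> 0..1 signal.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits: float) -> float:
        y = min(nits / 10000.0, 1.0)   # PQ is defined up to 10,000 nits
        p = y ** m1
        return ((c1 + c2 * p) / (1 + c3 * p)) ** m2

    for nits in (100, 203, 1000, 2000):   # ~100-203 nits is typical SDR white
        print(f"{nits:5d} nits -> signal {pq_encode(nits):.3f}")

SDR reference white lands only around the middle of the signal range; everything above it is headroom for highlights, and that headroom is the brightness difference the QR trick plays with.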


BTW there are, or at least were, wide-gamut-only displays.

I have a Dell from a few years ago. It supports the additional colors of P3 but isn’t any brighter than other quality normal displays of its time.


Technically, then, that display is like OLED TVs pre-2018, which could only cover about 70-85% of the P3 gamut because they couldn’t go bright enough across the entire panel. They were amazingly bright if only a small part of the panel needed to be lit, though, with excellent contrast ratios.

By comparison, most LED panels that are wide gamut but don’t have dimming zones can only brighten or dim the whole screen uniformly, and often struggle to make colours pop against darker backdrops the way they do on MacBook screens (with dimming zones) or iPhones (with OLED screens).

Point is, just because something is wide gamut doesn’t mean it can display the entire gamut, nor that it can display the entire gamut at the same time. Quality varies with the lighting technology (the quality and brightness of colours depend on their light source), with the content being displayed, and with the lighting of the room (your eyes have to perceive the screen against whatever backdrop and lighting the room provides).
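
To make the dimming-zone point concrete, here’s a toy model (my own simplification, not how any real panel is driven): assume each backlight zone has to follow its brightest pixel, and that a “black” LCD cell still leaks a fraction of whatever backlight is behind it:

    # Toy LCD model: pixel output = zone backlight * LC transmittance,
    # where an LC cell can't block light perfectly, so "black" still
    # leaks backlight / native_contrast.
    def render(image, zones, native_contrast=1000):
        size = len(image) // zones
        out = []
        for z in range(zones):
            chunk = image[z * size:(z + 1) * size]
            backlight = max(chunk)   # each zone tracks its brightest pixel
            out += [backlight * max(p, 1 / native_contrast) for p in chunk]
        return out

    image = [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0]  # one highlight on black
    print(render(image, zones=1))  # global backlight: every black pixel leaks
    print(render(image, zones=4))  # zoned: only the highlight's zone leaks

With a single global zone, one highlight lifts every black pixel off true black; with zones, only the highlight’s neighbours get the glow. That’s the mechanism behind colours popping against dark backdrops on zoned or OLED displays.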


That makes sense, thanks!



