There's an HDR war brewing on TikTok and other social apps. A fraction of posts that use HDR are just massively brighter than the rest; the whole video shines like a flashlight. The apps are eventually going to have to detect HDR abuse.
I know how bad the support for HDR is on computers (particularly Windows and cheap monitors), so I avoid consuming HDR content on them.
But I just purchased a new iPhone 17 Pro, and I was very surprised at how these HDR videos on social media still look like shit on apps like Instagram.
And even worse, the HDR video I shoot with my iPhone looks like shit even when playing it back on the same phone! After a few trials I had to just turn it off in the Camera app.
I wonder if it fundamentally only makes sense for film, video games, etc., where a person will actually tune the range per scene. And only when played back on half-decent monitors that don't just squash BT.2020 so they can say HDR on the brochure.
The HDR implementation in Windows 11 is fine. And it's not even that bad in terms of titles and content officially supporting HDR. Most of the idea that it's "bad" comes from the "cheap monitor" part, not Windows.
I have zero issues and only an exceptional image on W11 with a PG32UQX.
Also, if you get flashbanged by SDR content on Windows 11, there is a slider in the HDR settings that lets you turn down the brightness of SDR content. I didn't know about this at first and had HDR disabled for a long time because of it.
The only time I shoot HDR on anything is because I plan on crushing the shadows/raising the highlights after the fact. S curves all the way. Get all the dynamic range you can and then dial in the look. Otherwise it just looks like a flat, washed-out mess most of the time.
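For anyone curious, the S curve here is just a remap of normalized pixel values that steepens the midtones. A minimal sketch in Python/numpy, assuming values in [0, 1] (the smoothstep blend and the strength knob are illustrative, not any particular grading tool's math):

    import numpy as np

    def s_curve(x, strength=0.6):
        # x: pixel values normalized to [0, 1]
        # strength: 0 leaves the image flat, 1 is the full S
        smooth = x * x * (3.0 - 2.0 * x)          # classic smoothstep
        return (1.0 - strength) * x + strength * smooth

    # flat log footage clusters around mid-grey; the curve crushes
    # shadows, lifts highlights, and adds midtone contrast
    frame = np.clip(np.random.normal(0.5, 0.15, (1080, 1920)), 0.0, 1.0)
    graded = s_curve(frame)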
You would think, but not in a way that matters. Everyone still compresses their mixes. People try to get around normalization algorithms by clever hacks. The dynamics still suffer, and bad mixes still clip. So no, I don’t think streaming services fixed the loudness wars.
What's the history on the end to the loudness war? Do streaming services renormalize super compressed music to be quieter than the peaks of higher dynamic range music?
Yes. Basically the streaming services started using a decent model of perceived loudness, and normalise tracks to roughly the same perceived level. I seem to remember that Apple (the computer company, not the music company) was involved as well, but I need to re-read the history here. Their music service and mp3 players were popular back in the day.
So all music producers got out of compressing their music was clipping, and not extra loudness when played back.
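As a toy calculation of why the loudness race stopped paying off (Spotify's published default target is -14 LUFS; the track numbers are made up):

    TARGET_LUFS = -14.0  # Spotify's default playback target

    def playback_gain_db(track_lufs):
        # the normalizer turns loud masters DOWN to the target
        return TARGET_LUFS - track_lufs

    print(playback_gain_db(-8.0))   # brickwalled master: played at -6 dB
    print(playback_gain_db(-14.0))  # dynamic master: played at 0 dB

Both masters end up at the same perceived loudness; the crushed one just arrives with its transients already clipped.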
It hasn't really changed much in the mastering process; they are still doing the same old compression. Maybe not to the same extremes, but dynamic range is still usually terrible. They master at a higher LUFS target than the streaming platforms normalize to, because each streaming platform has a different limit and could change it at any time, so better to be on the safe side. Also, the majority of music listening doesn't happen on good speakers or in a good environment.
> Also, the majority of music listening doesn't happen on good speakers or in a good environment.
Exactly this. I usually do not want high-dynamic-range audio, because that means it's either too quiet at some times, or loud enough to annoy the neighbors at other times, or both.
I hope they end up removing HDR from videos with HDR text.
Recording video in sunlight etc. is OK; it can be sort of "normalized brightness" or something. But HDR text on top is always terrible.
What if they did HDR for audio? So an audio file can tell your speakers to output at 300% of the normal max volume, even more than what compression can do.
HDR audio already exists in the form of 24-bit and 32-bit floating point audio (vs. the previous 16-bit CD standard). Volumes are still mapped to the same levels because anything else doesn't make sense, just as SDR content can be mapped to HDR levels to achieve the same levels of brightness (but not the same dynamic range, as with audio).
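Back-of-the-envelope numbers, in case anyone wants them (the ~6.02 dB/bit figure is the standard quantization result):

    import math

    def dynamic_range_db(bits):
        # ratio between full scale and one quantization step
        return 20 * math.log10(2 ** bits)

    print(dynamic_range_db(16))  # ~96 dB, the CD standard
    print(dynamic_range_db(24))  # ~144 dB, beyond the ear's useful range

32-bit float adds a moving exponent on top of a 24-bit mantissa, so headroom above 0 dBFS is effectively unlimited during production, but playback still maps everything back to the same physical levels.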
Isn't that just by having generally low volume levels? I'm being pedantic, but audio already supports a kind of HDR like that. That said, I wonder if the "volume normalisation" tech that Spotify definitely has, and presumably other media apps/players have too, can be abused into thinking a song is really quiet.
This is one of the reasons I don't like HDR support "by default".
HDR is meant to be so much more intense, it should really be limited to things like immersive full-screen long-form-ish content. It's for movies, TV shows, etc.
It's not what I want for non-immersive videos you scroll through, ads, etc. I'd be happy if it were disabled by the OS whenever not in full screen mode. Unless you're building a video editor or something.
I would love to know who the hell thought adding "brighter than white" range to HDR was a good idea. Or, even worse, who the hell at Apple thought implementing that should happen by way of locking UI to the standard range. Even if you have a properly mastered HDR video (or image), and you've got your brightness set to where it doesn't hurt to look at, it still makes all the UI surrounding that image look grey. If I'm only supposed to watch HDR in fullscreen, where there's no surrounding UI, then maybe you should tone-map to SDR until I fullscreen the damn video?
Yup, totally agreed. I said the same thing in another comment -- HDR should be reserved only for full-screen stuff where you want to be immersed in it, like movies and TV shows.
Unless you're using a video editor or something, everything should just be SDR when it's within a user interface.
Sounds like they need something akin to audio volume normalization but for video. You can go bright, but only in moderation, otherwise your whole video gets dimmed down until the average is reasonable.
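A minimal sketch of that idea, assuming decoded frames as linear-light luminance arrays where 1.0 = SDR reference white (the 1.1 average cap is an arbitrary illustrative threshold):

    import numpy as np

    MAX_AVG = 1.1  # arbitrary cap on a video's mean luminance

    def normalize_video(frames):
        # dim the whole video if its average luminance abuses HDR headroom
        avg = np.mean([f.mean() for f in frames])
        if avg <= MAX_AVG:
            return frames                 # bright moments in moderation: fine
        gain = MAX_AVG / avg              # wall-to-wall brightness: dim it all
        return [f * gain for f in frames]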
Every phone has it, it’s called “power save mode” on most devices and provides additional advantages like preventing apps from doing too much stuff in the background. =)
That's true on the web, as well; HDR images on web pages have this problem.
It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR". But you could probably catch the most egregious cases, like "every single pixel in the video has brightness above 80%".
Funnily enough, HDR already has to deal with this problem, because most HDR monitors literally do not have the power circuitry or cooling to deliver a completely white screen at maximum brightness.
My idea is: for each frame, grayscale the image, then count what percentage of the screen is above the standard white level. If more than 20% of the image is >SDR white level, then tone-map the whole video to the SDR white point.
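In sketch form, assuming frames decoded to linear-light RGB where 1.0 = SDR white (the Rec. 709 luma weights are standard; the 20% threshold is from above):

    import numpy as np

    SDR_WHITE = 1.0   # linear luminance, 1.0 = SDR reference white
    THRESHOLD = 0.20  # fraction of pixels allowed above SDR white

    def frame_abuses_hdr(rgb):
        # Rec. 709 weights approximate the "grayscale" step
        luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
        return np.mean(luma > SDR_WHITE) > THRESHOLD

    def should_tonemap_to_sdr(frames):
        # one offending frame is enough to tone-map the whole video
        return any(frame_abuses_hdr(f) for f in frames)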
HDR has a slight purpose, but the way it was rolled out was so disrespectful that I just want it permanently gone everywhere. Even the rare times it's used in a non-abusive way, it can hurt your eyes or make things display weirdly.
I agree that HDR has been mostly misused, but on the other hand the difference between the sRGB color space and the wider-gamut rendering enabled by a movie's Rec. 2020 encoding is extremely obvious to me. sRGB has a very bad red primary, which forces the desaturation of colors in the yellow-orange-red-purple sector, where the human eye is most sensitive to hues and where many objects with saturated colors sit (e.g. flowers, fruits, clothes), so their colors are distorted by sRGB.
Because I want the Rec. 2020 and 10-bit color encoding, I must also choose HDR, as these features are usually only available together, even though I get no serious advantage from HDR. HDR-encoded movies can usually only be viewed well in a room with no light or dim light; otherwise most of them are too dark.
> Wouldn't that just hurt the person posting it since I'd skip over a bright video?
Sure, in the same way that advertising should never work since people would just skip over a banner ad. In an ideal world, everyone would uniformly go "nope"; in our world, it's very much analogous to the https://en.wikipedia.org/wiki/Loudness_war .
sounds like every fad that came before it, where it was overused by all of the people copying it with no understanding of what it is or why. remember all of the HDR still images that pushed everything to look post-apocalyptic? remember all of the people pushing washed-out videos because they didn't know how to grade images recorded in log, and it became a "thing"?
eventually, it'll wear itself out just like every other overuse of the new
HDR videos on social media look terrible because the UI isn't in HDR while the video is. So you have this insanely bright video that more or less ignores your brightness settings, and then dim icons on top of it that almost look incomplete or fuzzy because of their surroundings. It looks bizarre and terrible.
Imo the real solution is for luminance to scale appropriately even in the HDR range, kinda like how gain-map HDR images can, scaled both with regard to the display's capabilities and the user's/app's intent.
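For reference, a gain map stores a per-pixel log-space boost on top of an SDR base image, and the renderer applies only as much of it as the display can honor. A deliberately simplified sketch of the idea, not the exact math of any spec (the real Ultra HDR / ISO 21496-1 formulas add offsets and per-channel handling):

    import numpy as np

    def apply_gain_map(sdr, gain_map, content_headroom, display_headroom):
        # sdr:              linear-light base image, 1.0 = SDR white
        # gain_map:         per-pixel boost in [0, 1]
        # content_headroom: stops of brightness the content was mastered for
        # display_headroom: stops the current panel can actually show
        usable = min(content_headroom, display_headroom)  # respect the panel
        return sdr * np.exp2(gain_map * usable)

The same mechanism could honor a user or app intent by shrinking the display headroom further, which is exactly the knob scrolling feeds and thumbnails would want.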
It's good if you have black text on a white background, since your app can have good contrast without searing your eyes. People started switching to dark themes to avoid having their eyeballs seared by monitors with the brightness turned up high.
For things filmed with HDR in mind it's a benefit. Bummer things always get taken to the extreme.
I only use light themes for the most part, and HDR videos look insane and out of place. If you scroll past an HDR video on Instagram, you have an eyeball-searing section of your screen, because your eyes aren't adjusted to looking at that brightness; then once you scroll it off the screen and there's no HDR content left, everything looks dim and muted because you just got flashbanged.
That does not sound enjoyable and seems like HDR abuse.
The "normal" video should aim to be moderately bright on average, the extra peak brightness is good for contrast in dark scenes.
Other comments comparing it to the loudness war are apt. Some music streaming services are enforcing loudness normalization to solve this: any brickwalled song gets played a bit quieter when the app is acting as a radio.
Instagram could enforce this too, but it seems unlikely unless it actually affects engagement.
Not sure how it works on Android, but it's such amateur UX on Apple's part.
99.9% of people expect HDR content to get capped / tone-mapped to their display's brightness setting.
That way, HDR content is just magically better. I think this is already how HDR works on non-HDR displays?
For the 0.1% of people who want something different, it should be a toggle.
Unfortunately I think this is either (A) amateur enshittification like with their keyboards 10 years ago, or (B) Apple specifically likes how it works since it forces you to see their "XDR tech" even though it's a horrible experience day to day.
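What "capped / tone-mapped to the display's brightness setting" might look like, as a crude soft-knee sketch (real pipelines use fancier curves like BT.2390; the knee position here is arbitrary):

    def tonemap_to_display(nits, display_max=600.0, knee=0.75):
        # pass content through below the knee, roll off peaks above it
        k = knee * display_max
        if nits <= k:
            return nits                 # normal content untouched
        excess = nits - k
        room = display_max - k
        return k + room * excess / (excess + room)  # approaches display_max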
99% of people have no clue what “HDR” and “tone-mapping” mean, but yes, they are probably weirded out by some videos being randomly way brighter than everything else.