Dirty secret of video: The resolution matters a lot less than people tend to claim. Pixels are not created equal. I can easily give you a "24K" video stream with the same bitrate as your current 720p stream. It may choke your video card dealing with it and you may have nothing that can display it, but I can give it to you, easily. It just won't look any better than the 720p stream and will probably look worse (due to the compression algorithm blocks now being too small to work properly).
I've seen "1080p" streams that were of lower quality than a DVD release, to say nothing of the BluRay you mention, which are better than DVDs both in encoding quality and bitrate, making them actually quite a bit better than DVD.
Bitrate matters a lot more. Only once your bitrate has saturated a given resolution is it really worth moving up to the next resolution.
I actually like "4K" streams because to even pretend to be 4K, they have to bump the bitrate up. Prior to me getting a 4K TV, I played some 4K streams to my 1080P display, and they were of a visibly higher quality than the 1080P streams. You're not "supposed" to do that test; it really breaks the 4K illusion. (You can do this with youtube-dl by forcing it to grab the 4K stream and then playing it on something with enough hardware acceleration to still play it at 1080P. You can't just go to YouTube with a 1080P display because it'll feed you the 1080P version of the stream even if the video name has "4K" in it.)
(Another place you can see this is cameras, especially cell phone cameras. Take your cell phone outside into broad daylight and take a well-focused photo of pretty much anything. Pull the photo onto your computer and zoom in until every pixel of the photo covers three or four pixels on your monitor. Can you find sharp color changes between those pixels, or does it look like your camera snuck an impressionist Instagram filter into your picture when you weren't looking? Some high-end phones may pass this test, though as megapixel counts keep jumping even that is dubious; mid-range phones generally can't. That said, they may still be taking very, very good photos, especially considering all the headwinds a phone camera faces optically. It's just that they're very good 4MP photos rather than 12MP photos.)
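If fiddling with an image viewer's zoom is annoying, here's a quick sketch of the same pixel-peep with Pillow; the file name and crop box are placeholders, so pick any well-lit, in-focus patch:

    # Blow each camera pixel up into a 4x4 block of screen pixels using
    # nearest-neighbour resampling, so per-pixel sharpness (or mush) is obvious.
    from PIL import Image

    img = Image.open("photo.jpg")
    patch = img.crop((1000, 1000, 1200, 1200))
    peep = patch.resize((patch.width * 4, patch.height * 4), Image.NEAREST)
    peep.save("pixel_peep.png")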
(I'd love some sort of JPEG processor that analyzes a photo for its real information/high-frequency content, and then downsizes the JPEG to accurately reflect that content. You can shrink both digital camera photos and videos this way by integer factors with no visible difference when examined side by side, but it's quite tedious by hand. Perhaps the promise of BIG SAVINGS would encourage someone at Dropbox or some photo startup to have a look at this?)
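A crude sketch of what I mean, in case anyone wants to run with it: keep downscaling and round-tripping back up until the round trip stops being visually lossless. The 0.9 step and the error threshold are numbers I made up, and mean absolute error is a stand-in for a real perceptual metric:

    # Estimate how much of an image's resolution is "real": shrink it until a
    # downscale/upscale round trip no longer reproduces the original pixels.
    import numpy as np
    from PIL import Image

    def effective_scale(path, max_err=2.0):
        orig = Image.open(path).convert("L")
        base = np.asarray(orig, dtype=np.float32)
        scale, best = 1.0, 1.0
        while scale > 0.1:
            scale *= 0.9
            w = max(1, round(orig.width * scale))
            h = max(1, round(orig.height * scale))
            small = orig.resize((w, h), Image.LANCZOS)
            back = np.asarray(small.resize(orig.size, Image.LANCZOS),
                              dtype=np.float32)
            if np.abs(back - base).mean() > max_err:
                break
            best = scale
        return best  # fraction of the linear resolution that carries detail

A result of 0.5 would mean you could halve both dimensions (a quarter of the pixels) with no visible loss.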
> Dirty secret of video: The resolution matters a lot less than people tend to claim.
Precisely. Video compression is about reducing redundancies. A perfect lossless video compressor would essentially take a bitmap format and turn it into a vector format. Vector formats have no inherent resolution.
Now real-life compressors are neither lossless nor perfect, but it's easy to see that the better they become, the more resolution-independent they become, in a way.
Compression performance depends on the ratio between the information entropy of the input and output signals, which only loosely correlates with resolution.
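A quick, hand-wavy way to see this: losslessly compress an image at its native size and again after a 2x upscale. The file name is a placeholder and the exact ratio depends on the image, but the compressed size usually grows far less than the 4x pixel count, because the interpolated pixels add no information:

    # PNG is lossless, so its size roughly tracks information content.
    import io
    from PIL import Image

    def png_size(img):
        buf = io.BytesIO()
        img.save(buf, format="PNG")
        return len(buf.getvalue())

    img = Image.open("photo.jpg").convert("RGB")
    big = img.resize((img.width * 2, img.height * 2), Image.BILINEAR)
    print(png_size(img), png_size(big))  # 4x the pixels, well under 4x the bytes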
> Another place you can see this is cameras, especially cell phone cameras.
Yeah, you can also see this with medium-format camera backs. They will produce better images at a given target resolution than a full-frame sensor at the same resolution. Most people don't expect an old medium-format back that "only" has 30Mpix to be any better than a current full-frame sensor of the same resolution, but blow the images up to 100% and the medium format simply has more detail.
I want to know what sort of cameras can consistently capture the kinds of photos I see online where everything is just refreshingly sharp and properly lit. I know user skill is a big part of that, but for a while I've wondered what sort of hardware to optimize for to get shots that aren't necessarily amazing, just consistently well focused.
I also want to do a similar thing to your "JPEG irreducible complexity" test: given a series of copies of the same photo, find the one with the fewest artifacts. This is mostly for JPEGs as well, but for ranking stuff that's gone through the Twitter/etc. can opener (and been decoded and re-encoded umpteen times in the process), so purely codec-level analysis may or may not rank them accurately.
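One crude way to attempt that ranking, assuming all the copies are the same image at the same size: score each one by how much stronger the luminance jumps are along the 8x8 JPEG block grid than elsewhere. This is a back-of-the-envelope metric of my own, not a standard one, and anything that cropped or resized the image (shifting the grid) will fool it:

    # Crude "how mangled is this JPEG" score: ratio of pixel differences that
    # land on the 8x8 block grid to differences everywhere else. Repeated
    # recompression tends to push this up.
    import numpy as np
    from PIL import Image

    def blockiness(path):
        y = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
        dh = np.abs(np.diff(y, axis=1))  # horizontal neighbour differences
        dv = np.abs(np.diff(y, axis=0))  # vertical neighbour differences
        on_grid = dh[:, 7::8].mean() + dv[7::8, :].mean()
        off_grid = (np.delete(dh, np.s_[7::8], axis=1).mean()
                    + np.delete(dv, np.s_[7::8], axis=0).mean())
        return on_grid / (off_grid + 1e-6)

    # Lower is (probably) cleaner:
    # ranked = sorted(paths, key=blockiness)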
I've seen "1080p" streams that were of lower quality than a DVD release, to say nothing of the BluRay you mention, which are better than DVDs both in encoding quality and bitrate, making them actually quite a bit better than DVD.
Bitrate matters a lot more. Only once your bitrate has saturated a given resolution is it really worth moving up to the next resolution.
I actually like "4K" streams because to even pretend to be 4K, they have to bump the bitrate up. Prior to me getting a 4K TV, I played some 4K streams to my 1080P display, and they were of a visibly higher quality than the 1080P streams. You're not "supposed" to do that test; it really breaks the 4K illusion. (You can do this with youtube-dl by forcing it to grab the 4K stream and then playing it on something with enough hardware acceleration to still play it at 1080P. You can't just go to YouTube with a 1080P display because it'll feed you the 1080P version of the stream even if the video name has "4K" in it.)
(Another place you can see this is cameras, especially cell phone cameras. Take your cell phone outside into broad daylight and take a well-focused photo of pretty much anything. Take the photo in to your computer and zoom in until every pixel of the photo is three or four pixels on your monitor. Can you find sharp color changes between those pixels, or does it look like your camera snuck an impressionist Instragram filter into your picture when you weren't looking? A lot of us have high end phones that may be able to pass this test, though as the megapixel count keeps jumping even that may be dubious, but even mid-range phones can't. That said, they may still be taking very, very good photos, especially considering all the headwinds a phone camera faces optically. It's just that they're very good 4MP photos rather than 12MP photos.)
(I'd love some sort of JPEG processor that analyzes a photo for its real information/high-frequency content, and then downsizes the JPG to accurately reflect its real information content. You can get big savings on both digital camera photos and videos by doing this, integer multiples of the file size with no visible differences when examined side-by-side, but it's quite tedious by hand. Perhaps the promise of BIG SAVINGS would encourage a DropBox or some photo startup employee to have a look at this?)