To paraphrase Bill Gates (who never actually said the original, but anyway): 4K should be enough for everybody.
Having seen 1080p stretch and play nicely on a 30-foot cinema screen, and not look much worse than regular Hollywood titles even from the front seats, I don't see the allure of 8K even for "future-proofing".
Sure, monitors and TVs might improve their resolution in the future. But I don't see human eyes improving much (in angular resolution at a given distance), or houses getting any bigger to fit a 30-foot TV.
4K is good for reframing (cropping) and higher detail, but after some point enough is enough.
I spoke to a director friend of mine a few years back about shooting in 4K. He said that whenever the budget allows, he always shoots in 5K, not because he wants the extra resolution for the full frame but because he wants the ability to crop the shot down without losing resolution. Some shots would be scaled down from 5K to 4K, but others would be cropped to 'zoom in', or to allow for minor panning that wasn't present in the original camera work.
8K presumably provides the same benefits but at a greater scale; you can scale it down to 4K, you can 'zoom in' on parts of the shot (a subtle detail, say a pickpocket stealing a wallet) without having to use a second camera or record the shot twice, and so on.
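To put rough numbers on that, here's a back-of-the-envelope sketch in Python; the widths are nominal and real camera formats vary:

    OUT_W = 3840                      # UHD 4K delivery width

    for name, src_w in [("5K", 5120), ("8K", 7680)]:
        headroom = src_w / OUT_W      # how far you can punch in before upscaling
        print(f"{name}: up to {headroom:.2f}x zoom, {src_w - OUT_W} px of pan room")

    # 5K: up to 1.33x zoom, 1280 px of pan room
    # 8K: up to 2.00x zoom, 3840 px of pan room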
My phone (a Google Pixel) does something like this for its image stabilization.
When you film in 1080p, the picture is actually a bit more zoomed in than you'd expect, because the phone really uses the whole 4K+ sensor and applies some fancy processing to pan and zoom a smaller window around within that space to provide some stability.
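Roughly speaking it works out like this (just an illustrative sketch, not how the Pixel actually implements it; the shake offsets here are invented, a real phone would get them from the gyroscope):

    import numpy as np

    SRC_W, SRC_H = 3840, 2160   # full sensor readout
    OUT_W, OUT_H = 1920, 1080   # the 1080p stream the user records

    def stabilised_crop(frame, shake_dx, shake_dy):
        """Cut a 1080p window whose position cancels the measured shake."""
        x = (SRC_W - OUT_W) // 2 - shake_dx   # start centred, shift opposite to motion
        y = (SRC_H - OUT_H) // 2 - shake_dy
        x = max(0, min(SRC_W - OUT_W, x))     # keep the window on the sensor
        y = max(0, min(SRC_H - OUT_H, y))
        return frame[y:y + OUT_H, x:x + OUT_W]

    frames = [np.zeros((SRC_H, SRC_W, 3), dtype=np.uint8) for _ in range(3)]
    shakes = [(0, 0), (25, -12), (-40, 8)]    # per-frame jitter in pixels (made up)
    clip = [stabilised_crop(f, dx, dy) for f, (dx, dy) in zip(frames, shakes)]
    print(clip[0].shape)                      # (1080, 1920, 3)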
Seriously, sure, 4K is enough for output but who says it's enough for input? As long as sensors keep getting better, the industry will keep finding ways to take advantage of it until it's essentially required.
Imagine a future where you can zoom in on any detail as well as you could with a high-res sensor at capture time.
It's not necessary for today's viewing experiences, but we know little enough about what is going to become popular that I wouldn't put ANY bets on "4K" being enough forever.
I sometimes help film live events for a friend when he needs extra hands. Getting the framing perfect is really hard unless you're very good with the camera. Panning to follow a moving subject while simultaneously making sure everything else in the shot is what you want is... really hard. Imagine being able to simply film a "general area" in 8K and have the post-production team handle all the framing and panning while still getting 1080p output. That would be an amazing benefit of filming in higher resolution. So the higher the better... let's get there, and make everyone's life easier.
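As a toy sketch of what that post-production reframing could look like (all resolutions and keyframe positions here are made up for illustration):

    import numpy as np

    SRC_W, SRC_H = 7680, 4320   # static 8K wide shot
    OUT_W, OUT_H = 1920, 1080   # delivered 1080p

    def virtual_pan(frames, start_xy, end_xy):
        """Yield 1080p crops that pan linearly from start_xy to end_xy."""
        (x0, y0), (x1, y1) = start_xy, end_xy
        n = len(frames)
        for i, frame in enumerate(frames):
            t = i / max(n - 1, 1)
            x = int(round(x0 + t * (x1 - x0)))
            y = int(round(y0 + t * (y1 - y0)))
            yield frame[y:y + OUT_H, x:x + OUT_W]

    clip = [np.zeros((SRC_H, SRC_W, 3), dtype=np.uint8) for _ in range(48)]
    # follow a subject moving left to right across the wide shot
    panned = list(virtual_pan(clip, start_xy=(0, 1600), end_xy=(5760, 1600)))
    print(panned[-1].shape)   # (1080, 1920, 3)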
>So the higher the better... let's get there, and make everyone's life easier.
Not really. For one, 4K already makes life more difficult: huge transcoded files, slow renders, proxy workflows, etc. That's on a purely technical level.
On an aesthetic level, it's a bad habit too, and cropping in post is a lazy cop-out or an emergency kludge. Setting your frame is about more than just "getting what you want in the final output"; it affects parallax, compression, etc. Cropping from 2x the frame is not the same optically. And deciding in post means a less well-thought-out frame in the first place. So it's nice as an emergency kludge to save one's ass, for documentary, ad work or news, but not so good for movies.
I use my 40" 1080p tv as a second monitor from time to time, and it's way too ugly to be deemed as 'good enough' IMO.
Perhaps action movies won't be much improved with better resolution, but I'll want screencasts, browser windows, nature documentaries or really any detailed image to be shown in 4K at least.
Actually I'd want screens to stop being "TVs" or "computer monitors"; they should just be high-def big rectangles that can be used to display anything. For that we'd need to get at least to retina-level text rendering from any comfortable viewing distance. That would mean 8K as standard def in my book.
>Actually I'd want screens to stop being "TVs" or "computer monitors"; they should just be high-def big rectangles that can be used to display anything.
The problem with this is that the things you want in a TV are different from the things you want in a monitor. It's very rare that you need a 120+ Hz refresh rate, 1 ms response time television, but you do often want that in a computer monitor, and response time especially is very, very expensive to reduce (modern monitors often have <5 ms response times; televisions will often have 10x that).
> Actually I'd want screens to stop being "TVs" or "computer monitors"; they should just be high-def big rectangles that can be used to display anything.
I would love to see this as well.
My ideal "television" has at least 8 HDMI inputs plus a couple of composite or component video inputs. There would be no cable/antenna/tuner input. There would be buttons on the remote for power, volume, input, plus menu and enter to access the settings. There would be a handful of audio outputs, and a switch to disable the internal speakers, which would allow it to feed into an audio system for those that like to have that separate. It would come in a variety of sizes, but only in 4K or 8K resolution.
This doesn't have a tuner or any TV-specific features, nor does it have any of the Smart TV stuff. People who need those are not the target market for this TV.
Amen. My current TV has the Android TV stuff so tightly integrated that when you make the CPU work hard, there are huge delays switching inputs or settings.
Unfortunately, buying a TV without the smarts also means sacrificing the DVB-T tuner, as well as the price being significantly higher. It looks like the commercial version of a ~2012 panel (http://pro.sony.com.au/pro/product/professional-displays-bra...) retains at least some of the smarts. I like the idea of driving a TV via RS232, but the replacement for the tuner - an HDHomeRun - is sold at ridiculous prices in Australia.
I bought my 4K TV at $800 - at an RRP of $1800 I would have been really angry, but I can buy a 4K HDMI switch and get over it.
Actually, a lot of the lower-end Vizios act like this now. They are essentially a monitor with Chromecast. And people seem to rip them in reviews for that...
For traditional 2D screens, I tend to agree with you - that 4K is quite enough. But for Virtual Reality, these higher definitions are a must. 8K per eye will be much better than 4K per eye - at least that's the thought at the moment.
Mostly agree - but even with a screen displaying 2D, how is that different? I get the whole "your eyes can only see n pixels" argument, but what difference does that make? You can't see the individual bits of matter that photons are reflected off of, only the collection of matter that builds up that "image" to your eye.
Sure, the increase in resolution is "imperceptible" in terms of the individual elements building up a particular frame, but to assume the visual cortex can't discern increased detail is just arrogant and borderline laughable.
If resolution were the only metric, that would make sense.
But reality is that we're seeing higher color depth ("HDR"), higher framerates, higher quality sound streams (e.g. Dolby Atmos and beyond), and even technologies we aren't envisioning today because the hardware isn't "there" yet.
If 8K is only a resolution bump, then it's boring, but it may be much more than that, just as 4K is.
Shooting at lower resolution gives you better color depth/accuracy AND higher framerates to choose from.
It gives better color because (all other things being equal) signal and color accuracy degrade at higher resolution -- fewer photons "hit" each photodiode on the sensor.
And there's literally no digital cinema or amateur camera that doesn't allow for much higher framerates at lower resolutions than it does at higher resolutions. That's because the higher the resolution, the more taxing it is for the sensor and processor in the camera to keep up with high frame rates. 4K at N fps means four times the data volume of 1080p at N fps.
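A quick sanity check on that data-volume claim (uncompressed rates, with an assumed 60 fps and 24 bits per pixel purely for illustration):

    def raw_rate_gbps(width, height, fps, bits_per_pixel=24):
        return width * height * fps * bits_per_pixel / 1e9

    for name, w, h in [("1080p", 1920, 1080), ("UHD 4K", 3840, 2160), ("8K", 7680, 4320)]:
        print(f"{name}: {raw_rate_gbps(w, h, fps=60):.2f} Gbit/s uncompressed")

    # 1080p: 2.99 Gbit/s uncompressed
    # UHD 4K: 11.94 Gbit/s uncompressed
    # 8K: 47.78 Gbit/s uncompressed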
Lastly, sound is totally orthogonal, because that's recorded off camera with special recorders.
So we might be getting better color depth and sound quality lately, but that's not because we moved to higher resolution.
4K and 8K are going to be more useful with large and extremely large screen sizes. HD is perfectly suited to traditional TV use cases - a screen on a table in a living room.
4K and 8K imply a completely different usage model, like a giant wall-sized screen in the same living room. It's the difference between a movie theatre and an IMAX movie theatre.
When that happens, cinematography can be expected to change, sort of like how IMAX movies have different cinematography. Sometimes they even composite multiple HD films together on screen without losing resolution. This would actually be extremely useful for sports, where the background shows the entire field at 8K, with windows showing stats or replays, etc.
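For what it's worth, the tiling works out exactly (nominal UHD figures):

    SRC_W, SRC_H = 7680, 4320     # 8K canvas
    TILE_W, TILE_H = 1920, 1080   # full-resolution 1080p window
    print((SRC_W // TILE_W) * (SRC_H // TILE_H))   # 16 windows in a 4x4 grid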