Hacker News

8k@120Hz is going to need one heck of a video card
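For scale, a back-of-the-envelope pixel-rate calculation (a rough sketch, assuming uncompressed 10-bit RGB and ignoring blanking intervals):

```python
# Rough uncompressed signal bandwidth for 8K@120Hz
# (assumes 10-bit-per-channel RGB; blanking intervals ignored)
width, height, fps = 7680, 4320, 120
bits_per_pixel = 3 * 10  # three channels, 10 bits each

gbit_per_s = width * height * fps * bits_per_pixel / 1e9
print(f"{gbit_per_s:.0f} Gbit/s uncompressed")  # ~119 Gbit/s
```

That is well past HDMI 2.1's 48 Gbit/s link rate, which is why 8K@120Hz over a single cable relies on Display Stream Compression.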


It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.


> It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.

Depends on your monitor size.

Might be a waste at 27", but if you want to use a 48" display, I can assure you that you'd notice the move from 4k -> 8k.
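A quick pixel-density sketch for the sizes being discussed (assuming standard 16:9 panels):

```python
import math

def ppi(diagonal_in, horiz_px, aspect=(16, 9)):
    """Pixels per inch for a panel of the given diagonal and aspect ratio."""
    w, h = aspect
    width_in = diagonal_in * w / math.sqrt(w * w + h * h)
    return horiz_px / width_in

# 4K (3840 px wide) vs 8K (7680 px wide) at 27in and 48in
for size in (27, 48):
    print(f"{size}in: 4K ~{ppi(size, 3840):.0f} PPI, 8K ~{ppi(size, 7680):.0f} PPI")
```

At 48in, 4K works out to roughly 92 PPI, well below typical desktop densities, while 8K brings it back to about 184 PPI - which is why the jump is visible on large displays at normal viewing distances.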


Yeah, even driving the display at 4K, you start to notice the higher pixel fill factor of 8K displays above 48in. I love my 65in 8K Q900 - even though it mostly lives at 4K (120Hz!).


Can you explain this a bit more? I tried googling, but I can't quite understand what you mean here by pixel fill factor and how it would differ between the resolutions.


The gaps between the pixels tend to be smaller the higher the resolution - so even if you drive the display at a lower resolution, it can look better than a native display of that same lower resolution, which has larger spacing between each subpixel.


Interesting, does this give you a noticeable benefit for text, or does this mostly apply to images or video?


User interfaces shouldn't load a GPU very much even at that resolution.

8K in a game will use a tremendous amount of power, but it's not pointless - it's like having antialiasing turned on. And high frame rates are important for motion, because normal rendering only samples a single point in time.
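The motion point can be made concrete with a little arithmetic: discretely sampled content jumps a fixed number of pixels between frames, and the jump shrinks as the refresh rate rises (the scroll speed here is an assumed example value):

```python
# Per-frame displacement of content moving at a fixed on-screen speed.
# Bigger jumps between discrete frames read as stutter or smearing.
speed_px_per_s = 2000  # assumed: a fairly brisk scroll or pan

for hz in (60, 120, 240):
    print(f"{hz}Hz: ~{speed_px_per_s / hz:.0f} px jump per frame")
```

At 60Hz the content leaps ~33 px every frame; at 240Hz the same motion moves only ~8 px per frame, which is much closer to continuous.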


I’d argue that AI-driven super resolution like DLSS should be more than sufficient to upscale 4K to 8K with minimal performance loss and acceptable image quality, even for gaming.


Not for tmux + firefox


Unless people are far more sensitive than I am, I don't see how >60Hz is needed for a desktop workstation environment. High frame rate is really only noticeable for very fast reaction time gaming.

4K 120Hz may be noticeable if editing ultra high frame rate video on a video editing workstation, but if you are a video production crew with a camera capable of recording at that framerate, you probably already know that.


I can immediately tell just from moving the mouse a little. I wouldn't say it's needed either though.


When I throw my iPhone into low power mode and it drops to a 60FPS cap, it is immediately noticeable.


Many people are more sensitive than you are. I can easily tell the difference between 60 and 120 on both my phone and my desktop.

Though response times also matter a lot for ghosting and such.
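The difference is easy to quantify as frame time - the interval between screen updates, and the floor on how stale anything on screen can be:

```python
# Frame interval at common refresh rates
for hz in (60, 120, 144):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms per frame")
```

Going from 60Hz to 120Hz halves the interval from ~16.7 ms to ~8.3 ms, which is within the range many people perceive when dragging windows or moving the cursor.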


Yeah, I'm a gamer and I definitely notice the difference between 60 hz and 120+ hz.

But for desktop productivity? I don't feel I gain anything from it. 60 hz is fine.


60Hz is really noticeable when you’ve been using 120Hz, even for a few minutes. 120Hz feels a lot less tiring, and work in a terminal, editor, and websites is just a lot smoother.


It's immediately noticeable when scrolling in a browser, dragging stuff around, or just moving the mouse. If you haven't seen it in person, go to an Apple store and do a quick comparison between the MacBook Pro (120Hz) and the Air (60Hz), or iPad vs iPad Pro. They're always next to each other.


Not in text mode. Hercules FTW


Not for productivity.


Not if you ignore the (imo) overhyped RTX graphics and go back to stylized gaming worlds, instead of smoke-and-neon-lights-in-mirrors pseudorealism.


I'd likely not be gaming at 8K for a while. But for productivity tools it'd be amazing.


ARM Macs can probably handle that: the M2 can do 10 8K video streams at once (22 simultaneously on the Ultra), and people are running 4K 120Hz on the M1 with a couple of hacks.



