
Anecdotally, my house draws 0.4 kW when idle and 0.6-0.7 kW when both my 8K screen and my computer are on. Since my computer draws 0.1-0.2 kW, I surmise that the QN800A doesn't draw 300-400 W in total; it's more like 100-200 W.
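For anyone who wants to check the arithmetic, here's a rough sketch in Python. The values are just the approximate whole-house readings quoted above, not precise measurements:

    # Back-of-the-envelope bound on the screen's draw (all values in watts;
    # ranges are the rough figures quoted above, not exact measurements).
    house_idle = 400                      # baseline with screen and computer off
    house_on_lo, house_on_hi = 600, 700   # with 8K screen + computer on
    computer_lo, computer_hi = 100, 200   # computer alone

    combo_lo = house_on_lo - house_idle   # 200 W: screen + computer, low end
    combo_hi = house_on_hi - house_idle   # 300 W: screen + computer, high end

    tv_lo = combo_lo - computer_hi        # 0 W, if the computer is at its worst
    tv_hi = combo_hi - computer_lo        # 200 W
    print(f"Estimated screen draw: {tv_lo}-{tv_hi} W")

So the screen is bounded above by roughly 200 W on these numbers, which is why 300-400 W looks implausible to me.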

I run my screen at a brightness setting of 21 (out of 50), which is still quite legible during the day next to a window.

Also, I have solar panels, which is why I'm able to see the total power usage of my house.



RTINGS measured the Samsung QN800A at 139W typical power consumption, with a 429W maximum: https://www.rtings.com/tv/reviews/samsung/qn800a-8k-qled#tes...

The parent comment is completely wrong on nearly every point it makes. I don't know why it's so upvoted right now.

It doesn't even pass the common sense test. Does anyone really think TVs have 200W CPUs inside just to move pixels around? That's in the territory of a high-end GPU or server CPU. You don't need that much power to move some pixels to a display.


Surely they'd be using image-processing ASICs instead of CPUs anyway, which is why they don't draw that kind of power!


I didn't smell anything. A 200W PSU isn't terribly expensive, and being cheaper than more efficient processors seems reasonable. I also only run a single 4K monitor, so I haven't thought about driving 4x the pixels recently.


> I didn't smell anything. A 200W PSU isn't terribly expensive, and being cheaper than more efficient processors seems reasonable

200W is the realm of powerful GPUs and enthusiast/server CPUs.

Common sense would rule out an 8K TV requiring as much power as an entire gaming GPU just to move pixels from the HDMI port to the panel.


And it would certainly require active cooling. A TV with audible fans could be a hard sell!


Maybe these TVs are using salvaged Pentium 4 CPUs...



