The best 4K monitor doesn’t exist yet (thewirecutter.com)
79 points by smacktoward on June 17, 2014 | 86 comments



  ...though you’d be foolish to buy a 24-inch 4K display, we can 
  only hope that Intel and Samsung’s ambitions can push down 
  prices on larger displays.
Unless you're a Mac user, in which case you'd be foolish to buy anything else.

Reposting from my comment last year:

At 28", a 3840 x 2160 panel has a PPI of 157, which sits right between Retina and non-Retina densities. This means that on a Mac, you’ll have to use it one of two ways: Either at 1x, where the higher PPI means everything will be much smaller than it is on a normal monitor, or at 2x, where the lower PPI means everything will be much bigger than normal.

...

The best 4K monitor for Macs will be a 24”, which will have a PPI of 184, just about right for something sitting a bit further back from the viewer than a 220 PPI Retina Macbook Pro display.
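
The arithmetic behind those densities is just the diagonal in pixels divided by the diagonal in inches; a quick back-of-the-envelope sketch (plain Python, figures rounded):

  import math

  def ppi(width_px, height_px, diagonal_in):
      """Pixels per inch from the pixel dimensions and the diagonal size."""
      return math.hypot(width_px, height_px) / diagonal_in

  print(round(ppi(3840, 2160, 28)))    # ~157, the 28" panel above
  print(round(ppi(3840, 2160, 24)))    # ~184, the hypothetical 24"
  print(round(ppi(2880, 1800, 15.4)))  # ~220, the 15" Retina MacBook Pro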


I run a Dell 31.5" 4k and find it a perfect fit with the rMBP running at the (effective) 1920x1200 resolution. The Dell runs around 140ppi and the effective ppi of the rMBP is around 140 also. I find it to be an excellent combination. The quality of the Dell is outstanding.

A co-worker got two of the Dell 28" 4Ks and they were too small for 1:1 and just a waste for 1920x1080. Anything in between was jagged. The 30Hz was pretty terrible also.

Maybe two 24" displays would be ok, but I'd rather have the extra pixel real-estate in a single display instead of smoother font and graphic rendering.


Can you please mention the full model number of the 31.5" dell that you are talking about? Or a link?


What I'm still waiting for is a 1x (~110-120 ppi) 3k (2880 to 3k horizontal pixels) monitor so I can do true dual windowing. I'm currently using a 2560x1440, and horizontal space is a little tight. I don't want to use a 4k monitor as a retina display, since I want to just see more pixels without multiple monitors, and a 4k monitor without retina scaling (38"+) is too big for normal viewing. So the ideal size is 2880 to 3k horizontal pixels at 27" to 30".

The only options seem to be the Auria eq278c [1] and eq308c [2], both 2880x1620. But since they came out only a month or two ago, there aren't any reviews yet. Auria's cheaper $400 eq276wn claims to be 2880x1620, but reviews say it's actually 2560x1600, so I'm not going to plunge $750-$1k until there's a good, reliable review. The other alternative is the new LG at 3440x1440 [3], but it's not as ideal as 2880x1620: it's much wider, so it requires a bit more head tilting, and editing in two tall windows is harder with only 1440 vertical pixels.

1. http://www.amazon.com/dp/B00KG5LWM4 - eq278c, 27"@2880x1620 = 122ppi, $750

2. http://www.amazon.com/AURIA-EQ308C-30-2880-1620/dp/B00KG5LSB... - eq308c, 30"@2880x1620 = 110ppi, $1055

3. http://www.amazon.com/gp/product/B00JR6GCZA - LG 34UM95, 34"@3440x1440 = 110ppi, $999


Can't you run a 28" 4K panel at 2560x1440 using a HiDPI mode?

I run my 13" rMBP at 1440X900, which still looks pretty good. It actually looks pretty good at 1680x1050, except that everything is too small for comfortable work. Sure, it wouldn't be "full retina" resolution, but it would look a lot sharper than a regular 27" 1440p display. The only real constraint could be the video chip's ability to render at 2X that resolution and then downscale.


As a diehard Mac user, I still wouldn't want a 24" retina monitor because it has a lot less space than a 34" superwide non-retina.


Second this. I picked up a Dell 2414Q (on sale in Canada for $750 CAD) a few weeks ago and it's perfect as a 2x retina 24" external Macbook Pro monitor.

Support on Windows and in games is spotty though. In particular, Windows doesn't seem to have a way to do exact pixel doubling.


Windows basically moves this responsibility to the developer. Apple's approach has proven much more pragmatic: at worst you'll see a few pixelated assets, but at the correct size, and to ensure Retina compatibility developers only need to supply each asset twice.
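
To illustrate the "supply each asset twice" convention, here's a hypothetical sketch (not Apple's actual API; real toolkits do this lookup for you):

  import os

  def pick_asset(path, scale_factor):
      """Prefer the @2x variant on a 2x display, fall back to the 1x asset."""
      if scale_factor >= 2:
          root, ext = os.path.splitext(path)
          two_x = root + "@2x" + ext
          if os.path.exists(two_x):
              return two_x
      return path  # worst case: the 1x asset, pixelated but correctly sized

  print(pick_asset("icons/save.png", scale_factor=2))  # "icons/save@2x.png" if it exists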


24" is the perfect size for me. It's a drop-in replacement for a 1080p display, so literally a 'retina' upgrade.

Now to wait until all my core windows apps are either 4k native, or scaling compatible.


Retina means one thing when using a display 12" from your face and another thing entirely when using a display 24" from your face.


I guess as a KDE user I was foolish to buy 3x30" :-). Setting the font size works just fine...


Any trouble with window border sizes? I guess you can probably find a setting that helps with that...

Are there any themes that help with that?


Most themes let you set the window border size.


What bothers me about 4k is the switching of terms.

"4k" makes it sound like it has 4 times the vertical resolution of 1080p, but for pure marketing reasons the number now refers to horizontal resolution instead of vertical, as "1080" did, so it's really only twice the 1920 horizontal pixels that 1080p had. Overall that's 4 times the pixels, since both dimensions double, but still!


We really ought to insist on using more correct terms like UHD (Ultra High Definition = 3840×2160 pixels) and only referring to "true" 4K cinema (4096 × 2160 pixels) content/equipment as 4K.
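
For reference, the raw pixel counts behind the naming debate (simple arithmetic):

  formats = {
      "1080p (Full HD)": (1920, 1080),
      "UHD": (3840, 2160),
      "DCI 4K (cinema)": (4096, 2160),
  }

  base = 1920 * 1080
  for name, (w, h) in formats.items():
      px = w * h
      print(f"{name}: {w}x{h} = {px:,} pixels ({px / base:.2f}x 1080p)")

  # 1080p (Full HD): 1920x1080 = 2,073,600 pixels (1.00x 1080p)
  # UHD: 3840x2160 = 8,294,400 pixels (4.00x 1080p)
  # DCI 4K (cinema): 4096x2160 = 8,847,360 pixels (4.27x 1080p)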


Like "HD" (1080) vs "HD Ready" (720) I think we've already lost that battle, and yet again it's "buyer beware" where people will get sold inferior goods as if they're the good stuff by having the same or similar terms for different quality. (Although in this case it's not as large a difference as HD vs HD ready was)


I've never seen "HD Ready" refer to 720. As I remember it, HD Ready meant that the TV was HD (720 or 1080) but didn't include an ATSC or QAM tuner. It was more like a monitor than a true TV.

The distinction I've seen with 1080p is calling it True or Full HD.


Maybe this is specific per country, but in the Netherlands 'HD Ready' always referred to 720 instead of 1080.


Actually, it referred to 768, as that was the number of lines that the cheap panels had. Which meant that even to display 720p content you had to scale it.


There really needs to be an international standard for these terms. What's wrong with listing a display by its capabilities? 1920x1080 60hz 24bpp. That way we can get rid of all this HD, HD Ready, UHD, 4K nonsense. It's marketing speak only meant to confuse you into buying something you wouldn't have if you were better informed.


Listing by megapixels would make me happy. The number of diagonal inches has been the least helpful indicator of screen quality over the last decade, and the WUXGA nonsense is too hard to keep track of as well. Adopting a MP number would do more to get us high res screens than any other change.


We can do better, how about having marketing be based on

a) Number of pixels in scientific notation

b) Alternatively log base 10 of number of pixels


They do that for digital camera displays and it’s super confusing.


That was meant to be in jest - though I would probably chuckle if I saw a marketing banner that said "All new Monitors now with 5.02 x 10^7 pixels"


Eh, you got me. Digital camera display resolution is reported like that; worse, they don’t count pixels but “dots” (i.e. subpixels).

Example: the Olympus E-M10 has a 1,037,000 dots display, which means 345,600 pixels, which means 720x480. But all the marketing material (and review websites) reports “1 million dots display”.
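
The conversion is just a factor of three, one "dot" per red/green/blue subpixel:

  dots = 1_037_000           # the E-M10's advertised rear-display figure
  pixels = dots / 3          # one "dot" per R/G/B subpixel
  print(round(pixels))       # ~345,667 -- roughly 720 x 480 = 345,600 pixels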


If you used megapixels it wouldn't be any worse than camera sensors. I'm honestly surprised this hasn't been done as an alternative to VGA, SVGA, DVGA, WSVGA, XGA, WXGA, SXGA, WSXGA, UXGA, WUXGA, WQSXGA, WXUSXGA, qHD, HD, FHD, QHD, UHD, FUHD, etc. (several resolutions omitted)

A number of megapixels and an aspect ratio would give all the information you need, while boiling it down to easily comparable numbers.


General consumers often care more about physical size than actual resolution. Why else would 1920x1080 pixel computer monitors be sold in sizes up to 27 inches? What explains the proliferation of 15 inch, 1366x768 laptops, or even 17 inch 'laptops' of any resolution?


Definitely, I was just talking in terms of resolution. Megapixels and aspect ratio gives you the same information as width and height (in pixels).

1.05 megapixels vs 2.07 megapixels is a much more obvious comparison to most consumers than 1366x768 vs 1920x1080. One has twice as many pixels as the other.

Switching to that system could even make a few more people realize that a small number of pixels with a huge screen size isn't a great idea. IMO most consumers just see "1366x768" and think "Oh those are big numbers it must be fine."
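
A sketch of what that labelling scheme might look like (just to show the arithmetic, not any real standard):

  from math import gcd

  def label(w, h):
      """Resolution as megapixels plus aspect ratio."""
      mp = w * h / 1_000_000
      d = gcd(w, h)
      return f"{mp:.2f} MP, {w // d}:{h // d}"

  print(label(1366, 768))   # "1.05 MP, 683:384" (not exactly 16:9, hence the ugly ratio)
  print(label(1920, 1080))  # "2.07 MP, 16:9"
  print(label(3840, 2160))  # "8.29 MP, 16:9"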


It's not that the consumers cared more about physical size, it's that they literally didn't know any better.


Further aggravating: "4k" isn't "four thousand (plus 96 in binary pocket change)" pixels wide; it's marketing by association. "True 4k" at 4096 is deemed "close enough", since 3840 is what you get when you take a 4096x2160 display, feed it 16:9 content, and trim the resulting side black bars.

More detail at http://en.wikipedia.org/wiki/4K_resolution


2160p just doesn't roll off the tongue! People will get used to it - they've managed to deal with the nebulous ideas of "SD" and "HD" and "Full HD," which I would guess are fairly well used if not the dominant terms.

I'm more worried that there's going to be a 4K/"full 4K" thing because of the aspect ratios!

But mainly I'm just pumped to have a 4K monitor. As soon as they're as good as the mid-range Dell Ultrasharps in terms of color and options I'll be making the switch.


The term 4k has been around much longer than 1080p, etc. It is a term from computer graphics and the movie industry which means 4096x2160 (17:9 ratio) or 4096x3072 (4:3 ratio). I have helped build, test, and used "4k" tiled wall displays for scientific visualization. It is cool that you can now just buy a single monitor for the same purpose.


Considering that most "HD" refers to 1080i or 720p (even more so here in Australia), I don't mind this so much. Remember that broadcast or cable TV, pretty much anywhere, is hardly ever 1080p.

With this step hopefully killing off interlacing, I'd consider even a somewhat dishonest marketing exercise justified.


4K really isn't going to matter much for a while, simply because the average computer doesn't have good enough graphics to make good use of it.

The resolution jump is huge and so you need a lot more GPU memory and processing power to do the same things at a higher resolution. Games are now really taking good advantage of 1080p, so we are another 5-10 years away from 4k being a real thing in terms of mainstream gaming.
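
Rough numbers on the GPU side, counting nothing but raw pixel throughput:

  def pixels_per_second(w, h, fps):
      return w * h * fps

  p1080 = pixels_per_second(1920, 1080, 60)
  p4k = pixels_per_second(3840, 2160, 60)
  print(f"1080p60: {p1080 / 1e6:.0f} Mpx/s")  # ~124 Mpx/s
  print(f"4K60: {p4k / 1e6:.0f} Mpx/s")       # ~498 Mpx/s: 4x the fill rate, 4x the framebuffer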

Also, the average computer is not going to fare well in a 4k world. Sure, you might have videos on Netflix or some Blu-ray successor look nice in 4k, but good luck getting that to be a decent experience on the sub $500 laptop your parents just bought 2 years from now with a 15.6" 4k display with terrible viewing angles, a Celeron processor, integrated gfx, and a 5400 RPM 2TB Hard Drive...

High resolution is great, but as an industry we don't seem to be willing to upgrade the related components - memory, GPU, SSD - unless it's on a mobile phone.


The killer app for hi-res displays is text, not video or games, in my opinion. Text looks terrific at 4K and text is what most working users look at all day. I would certainly trade some frame rate for super-crisp text in my editor. If others agree with me, we'll see high-res-but-slow-rate catching on in business before the home gamers are ready to go 4K.


I think this makes it a perfect opportunity for Apple to swoop in with Retina-based iMacs. It plays right into their style of curating all the hardware components and supporting developers via their SDKs to create a beautiful 4K experience for end users.


A 27" 4k touch-screen iMac would be perfect.


With g-sync and similar we will probably see smooth gameplay at lower refresh rates, meaning 4k gaming can be a little closer than it would be if we wanted 4k @ 60Hz.

The real elephant in the room is the poor scaling of applications. On my 4K laptop screen running Windows 8.1, I tried making it work for a few weeks using Windows' scaling settings, but caved and just set it to 1/4 resolution and haven't looked back. On a small screen the benefit of more pixels just doesn't outweigh the downside of many apps being unusable.


It's a chicken and egg problem. Until enough people stop giving up on high resolutions, the application makers won't do adequate testing of those resolutions, ensuring that they'll continue to be unusable.


The ASUS PB287Q, AOC U2868PQU and Samsung U28D590D all support UHD resolutions at 60Hz over DisplayPort 1.2 SST. That means they drive the screen as a single "tile" and don't have the problems of the previous dual-tile solutions.

I don't really see the problem with these new monitors; they're good enough for most people. The price is good and the TN panels are quite decent, as confirmed by reviews at both The Tech Report (http://techreport.com/review/26510/4k-for-649-asus-pb287q-mo...) and KitGuru (http://www.kitguru.net/peripherals/monitors/zardon/aoc-u2868...).

Yes, you do need the best GPUs to play games at 4K. And yes, OS support is still incomplete. But those trade-offs aren't that important if you're not a gamer and can take advantage of all that desktop area.

There are always good reasons for waiting, but they can lead to never taking advantage of improvements: new ones are already on the horizon by the time the ones you waited for finally arrive. And the story repeats itself...


I don't mind playing games in a lower resolution and letting it scale up to the monitor's res - I consider the stretching to 4K a form of "natural" (if sloppy) anti-aliasing.


Scaling is a problem on a lot of 4K monitors right now. IIRC the Dells will display everything at their native resolution - so a 1080p signal will sit as a box in the center of your screen surrounded by black.

There's a reason why the current batch of 4K monitors are considered immature. The lack of "real" (single-tile) input, the inability to handle 60Hz inputs, the inability to scale, etc etc. Looks like the latest round of model updates solve some of these, but IMO it's not quite time yet.


I have the Asus PB287Q and it scales perfectly to 1080p (and does the 60Hz input, single-tile input), etc, etc. The only downside is the TN panel but it's much better than any TN panel I've ever used.


Please tell us more.


GPU scaling has been a standard driver feature, on Windows at least, with both NVidia and AMD video cards for a while now. It doesn't really matter what the monitor does.


Oh yes definitely. Don't worry, I'm not interested in jumping the gun.


On the one hand, I can't wait for 4K and high-PPI mode adoption. Retina MacBooks already show just how pretty it is.

But the whole thing doesn't seem ready yet outside the Mac ecosystem. I see lots of issues with 4k:

- Going from 120 hz back to 60 hz, or even less. Seeing how they are struggling to get a 60hz monitor to the market, it's probably going to be a long time until we see 120 hz monitors.

- Adapting software to high-PPI modes is going to take time.

- Games at 4k at 120 fps are going to be a problem for the GPU.

- Movies are going to be the biggest bandwidth hog ever, or ridden with compression artifacts.

So I'm not terribly excited yet. I'll probably still end up being an early adopter and eating through all these issues, but I think I'll wait at least one more year.


The only reason makers are having trouble hitting 60Hz is that people are still stupidly sticking with HDMI, and the monitor makers keep catering to them.
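
For what it's worth, the back-of-the-envelope bandwidth math; the link rates are from memory of the specs, so treat them as approximate:

  # Uncompressed 4K @ 60 Hz, 24 bits per pixel, ignoring blanking intervals:
  needed_gbps = 3840 * 2160 * 60 * 24 / 1e9
  print(f"{needed_gbps:.1f} Gbit/s of video data")  # ~11.9 Gbit/s (more with blanking)

  # Approximate effective video bandwidth of the common links in 2014:
  hdmi_1_4 = 8.16   # Gbit/s -- hence 4K only at 30 Hz over HDMI 1.4
  dp_1_2 = 17.28    # Gbit/s -- enough for single-tile 4K @ 60 Hz
  print(needed_gbps <= hdmi_1_4, needed_gbps <= dp_1_2)  # False True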


I absolutely love The Wirecutter. If they're not already making ends meet, I'd happily donate for this kind of honest, high-quality reviewing.


For those not familiar with The Wirecutter, is their concept "the best value-conscious buy"?

I was looking for a review for a MicroSD card to use with a Surface Pro convertible tablet I'm going to buy, and they didn't review 128GB or 64GB models, just 32GB and 16GB ones "because they're very expensive and most people don't need that much space on a microSD card". http://thewirecutter.com/reviews/best-microsd-card/


The idea is "best X for most people". They will occasionally recommend some higher-market stuff (IE $1600 for their recommended standing desk) but only when they feel the low-end stuff is junk. I've found them to be fairly reliable. When I've bought or already owned stuff they've recommended it generally has ranged from "decent value" to "great choice" Since they review a range of products for each thing and give tradeoffs its generally a great one-stop place to make decisions on things you don't want to spend a long time researching


More or less. In their own (maybe slightly breathless) words:

http://thewirecutter.com/about/



Thanks for that!


I imagine they make out pretty good given that everything has an Amazon referral link.


I didn't have trouble rationalizing my pre-order of this LG Ultra QHD (3440x1440) for $999. It's not 4K, but it's very competitive. http://www.amazon.com/gp/product/B00JR6GCZA/ref=oh_details_o...


FYI, as of this morning it's available at a couple Frys Electronics locations (including Sunnyvale) http://www.frys.com/product/8069024


I gave in and bought one of the Seiki 4K 39" displays for $330 over the weekend -- it's far from ideal, but it's $330. I'm sure I can resell it for >$200 when I get something better in a few months or a year, but it's an entirely serviceable monitor.

(The $330 deal was a $399 TigerDirect with a $70 MIR, free shipping, no CA tax. Ended Monday, but I'm sure it will return.)


I have one of those too, and for the price I've been very happy with it. My only real objection is the amount of lag, but it's only really noticeable when web browsing, and even then it's just a little annoying at first.

In case you haven't already, make sure to turn the sharpness and noise cancellation down. It'll make it look a lot better. Also, if you look online, there are other firmware versions you can flash to speed it up.


I have one too. The problem I have with it is that it's just too much screen real estate, horizontally and vertically. I would have preferred one of those ultra-wide monitors instead.


I'd like to know if anyone is using a 4K monitor, and what their experience with it has been. I don't have a 4K display; I have an Apple Thunderbolt Display and I'm OK with it.

I use it with a 15" MacBook Pro (Retina display), and even though I don't get the same resolution, it doesn't look flaky at all and is decent enough for reading. Moreover, it connects to the Pro over Thunderbolt and thus doesn't require multiple connections for speakers, camera, mic and display.


I have the Sharp 4K monitor that Apple sells in its stores and it's incredible. I run it at retina resolution at 60hz (now possible on Mac OS 10.9.3+) with a Mac Pro and it looks amazing.

Very pricey, but in my experience, worth every dollar.


Is that an Apple monitor or someone else's? Sorry, I've never seen Apple sell a monitor apart from the Thunderbolt Display. Is that the one?


Sharp is a pretty big Japanese company. They rarely make own-brand consumer stuff for markets outside Japan though, which is why you haven't heard of them. http://en.wikipedia.org/wiki/Sharp_Corporation


It is made by Sharp, not Apple. Apple only pairs it with the Mac Pro; otherwise they only sell their own monitors.


I have the Dell UP2414Q, driving it with Debian Linux. Software support is still not quite there (e.g. in Chrome on Linux), but the display quality is awesome.

I’d buy this monitor again in an instant. Reading/writing/programming/just using your computer does feel a lot better than on “non-retina” panels.


What software problems are you having?


Here's an old thread that somewhat addresses your question (4K monitor + rMBP) -- https://news.ycombinator.com/item?id=7022050

I tried this configuration out a few months ago (new rMBP, Sharp 4K, OS 10.9.1) at an Apple store and it was not satisfactory; there were color fringes around some fonts. But this may have changed.

Of course, 4K + Mac Pro is a supported configuration, and in my experience, it looks great.


I have the 39" Seiki, and I'm very happy with it, especially for the price, though it required some fiddling to make the colors look decent. I just like being able to have a lot on the screen at once, and I use it mainly for programming, so the 30hz issue doesn't bother me much.


Has anyone tried the 39" Seiki on a Mac? At $389 seems like an easier purchase. Plus it doubles as a TV.

http://www.newegg.com/Product/Product.aspx?Item=N82E16889522...


Two people at work have them. I found it just slightly too small when watching them use a 4x 1920x1080 setup with DisplayFusion.

I decided to try a 50" TCL 4k display. I find I don't mind the 30Hz refresh rate as I don't game on it. However I do find it just a tad too big.

I believe the sweet spot is somewhere around 44" or 45" for 4k-tv-as-a-monitor but absolutely no one is making one in that range. And if I could just get that curved, I think it would be perfect.


I have it but on Windows with no scaling. I paid around $500. 39" is a bit large for a single monitor, in my opinion, since looking towards the corners or edges becomes such a long look. 30hz for the 4K is annoying but livable.


Coincidentally, I was researching this model last night, and the general commentary on this device as a computer monitor is that it's bad (due to the lower frame rate over HDMI at the native resolution).


We have ~20 of them. They're fine. The 30Hz is a little bit annoying if you drag windows around, but it's not really a problem for scrolling.


By all means wait, but as a programmer, I've found the Samsung 28" UHD monitor terrific. I can get a huge amount of crisp text on the screen at once, and the color accuracy is Good Enough For Me. I have not seen any issues driving the display over DisplayPort 1.2 @ 60Hz - BIOS etc. all works fine.


Has anyone tried any of the 4k (or Ultra High Definition) displays with Ubuntu? I'd be interested to know what the results are..


I hate staring at a blurry/pixelated monitor more than I dislike the viewing angle of this TN panel. The white balance and viewing angle are the problem here - the colours are pretty good, almost as rich as this Retina Macbook. I have the Samsung UD590.

For coding it's perfect really and I've had no problems using display port at 60hz.


I agree with their comments on the Dell U2713HM; I've had one for about a year now and it's very good.


Indeed, it has just the right pixel density. A 24" 4K display needs DPI scaling to be usable, reducing the screen real estate advantage, and >30" 4K displays aren't yet available with all the necessary features.


144hz or bust. No way I'm going back to 60hz.

Quad link DVI is the future.


Is there a noticeable difference between 60hz and 144hz for things like reading text and programming? How much?


No; Yes; Not much.


"30 fps flicker" - that's not a CRT, it doesn't flicker, because unlike CRTs, a flat display doesn't have a ray refreshing the display top to bottom.

And 30 fps on a CRT wouldn't be enough to produce a stable image on the retina at all, which is why CRT TVs refreshed at a minimum of 50Hz - and that still flickers (it stops "flickering" at around 75Hz).

With flat panels, refresh rate is about display responsiveness and animation detail.

So I wonder where the claims about eye strain are coming from. Maybe people just want better display responsiveness and like to borrow from the problems that low-refresh CRTs used to cause, because it sounds scientific and legit as a problem.


Persistence of vision depends on the ambient brightness. Movies are 24FPS but they're shown in a dark room so the jitter isn't as bad.

A 30 fps screen in similar conditions wouldn't be as bad, but most people use computer screens in well-lit rooms.

The flickering on an LCD is different; it presents itself as serious stuttering when you're moving things around.


Film is projected with a shutter operating at twice the frame rate or better, to minimize flicker. It would be unwatchable otherwise. See http://en.wikipedia.org/wiki/Movie_projector#Shutter


Well, ambient brightness matters, but analog cinema projectors actually flash at 48 Hz (they "flick" every film frame twice), so it's still way above 30 fps.

Plus, with CRTs the problem is exacerbated because they scan top to bottom and need a higher refresh rate to hide that effect.



