
The computer monitor market is at the mercy of gamers who would rather have 1080p-1440p if it means getting 600 fps in a game instead of 150 fps at a higher resolution. That's why since 2015 there has been an arms race in refresh rates rather than in resolution. We already had 4-5K panels at retina density in 2015; market conditions could have given us much higher quality panels by now if the whole segment weren't catered to young, cash-strapped gamers whose opportunity cost when putting money toward a monitor is putting that money toward a GPU.

Thankfully, as more and more public figures begin using OLED TVs instead of computer monitors, and as the monitor market takes after the larger consumer TV market, we will start to see better technologies, like OLED and FALD/mini-LED, compete with each other.

It's a horrible time to buy a monitor right now; if you can wait even a year, you should.

But keep in mind that "retina" is relative: it depends just as much on viewing distance as it does on PPI. There are several handy charts online that relate a display's PPI to the minimum viewing distance at which individual pixels stop being noticeable.
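For reference, the usual math behind those charts is simple: a display is "retina" once one pixel subtends less than about one arcminute of visual angle (the standard 20/20 acuity figure; the exact threshold varies by chart). A minimal sketch in Python:

    import math

    def retina_distance_inches(ppi: float, arcmin: float = 1.0) -> float:
        # Distance beyond which one pixel (1/ppi inches wide) subtends
        # less than `arcmin` arcminutes of visual angle.
        return 1.0 / (ppi * math.tan(math.radians(arcmin / 60.0)))

    print(retina_distance_inches(92))  # 24" 1080p (~92 PPI): ~37 in, i.e. ~3 ft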




Whilst true, it's more accurate to say the monitor market is hostage to the economics of LCD panel manufacturing.

Panels are normally produced as very large sheets of glass, on production lines where every sheet is made to a given PPI (and an upper bound on the refresh rate).

Of course the technology mix is the technology mix, but the industry is built around reasonably large investments in these production lines, amortized across the entire screen industry.

A 27-inch 1080p monitor and a 24-inch 1080p monitor are produced on distinctly different production lines, as they have different PPIs. But a tiny screen with the same PPI as that 27-inch 1080p panel likely came from the same factory in Korea.
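To make the PPI difference concrete, here's the standard diagonal-PPI calculation (nothing manufacturer-specific about it):

    import math

    def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
        # Pixels per inch from resolution and diagonal size.
        return math.hypot(h_px, v_px) / diagonal_in

    print(ppi(1920, 1080, 27))  # ~81.6 PPI
    print(ppi(1920, 1080, 24))  # ~91.8 PPI -> same resolution, different line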

Apple's panels are actually produced under contract by LG on Apple-specific production lines, which is how they can obtain exotic PPIs: Apple promises to use the line's capacity for multiple years.


As some of them have found (e.g. Linus's window-snapping burn-in in under six months), OLEDs have real drawbacks as standard desktop displays. It's why even Apple doesn't use OLEDs in its Macs yet, and you would expect them to if it were just a matter of "more money = more premium" with no deference to the gaming use case.


Agreed re: the medium-term issues of OLED.

But I can get a $1000 4K 120 Hz OLED screen. Sure, it might be 42-48", but I could just as easily buy a deeper desk and wall-mount it 4-5 feet away for the same effect. I'm currently sitting 3.5 feet from a 24" monitor. At that size, the TV becomes "retina" past about 3 feet.

https://www.designcompaniesranked.com/resources/is-this-reti...
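For what it's worth, the numbers check out under the usual one-arcminute acuity assumption (a sketch; the linked calculator may use a slightly different threshold):

    import math

    def retina_ft(h_px, v_px, diag_in):
        # "Retina" distance in feet for a given resolution and diagonal.
        ppi = math.hypot(h_px, v_px) / diag_in
        return 1.0 / (ppi * math.tan(math.radians(1 / 60))) / 12.0

    print(retina_ft(3840, 2160, 42))  # ~2.7 ft
    print(retina_ft(3840, 2160, 48))  # ~3.1 ft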

Let's assume that, used as a monitor without excessive amounts of care, irreparable burn-in takes 2 years. I could buy two of these TVs and wait out the whole nonsense the market is going through right now. Re: Linus, keep in mind that the window-snap burn-in was fixed by the pixel-refresh function. And the conclusion he and Wendell came to in that video was that it isn't perfect, and it's often wasteful, but you can just get another monitor.


For many, many people, 2 years is not a reasonable life expectancy for a $1000 monitor. My current home monitors are 1 year old (€400 27" 1440p/165 Hz IPS), 4 years old (€500 27" 1440p/144 Hz IPS), and 7 years old (€250 21" 1080p/60 Hz IPS).


Ditto this. My home monitors are:

27" Ultrafine 5K - bought in 2018, so 3+ years old, $1000+. 27" LG 4K - bought in 2019, so 2+ years old, $600+. 27" Dell Ultrasharp 1440p - bought in 2016, $800+. 24" Dell Ultrasharp 1920x1200 - bought in 2013, $400+.

All of these are still in active use - even the 8-year-old 24" is still a solid AH-IPS monitor with decent wide-gamut colour support and LED backlighting.

I'd be very disappointed in a $1k display that was showing irreparable issues after 2 years. I'd be positively angry at a $1k display with issues after 6 months.


Although you make an OK point, you're greatly exaggerating the FPS involved. I'm on a beautiful 1440p 144 Hz LG and I really only need 150 fps. Even that often requires less-than-ultra video settings with a 3080 Ti. There is a very small niche of competitive gamers who want 240 Hz at 1080p and the FPS to justify it.



BlurBusters is an amazing resource for investigating HFR.


There are 360 Hz monitors now, and they are likely to sell well as soon as the silicon shortage eases, if they aren't already.

The real problem is that a 1080p monitor at 360 Hz requires six times the bandwidth of a standard 1080p60 monitor, while a 120 Hz 4K UHD monitor requires eight times the bandwidth of a 1080p60 monitor.
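A back-of-the-envelope check of those multiples, counting raw pixel rates only (real links add blanking overhead and sometimes DSC compression, which shift the absolute figures):

    def pixel_rate(w, h, hz):
        # Raw pixels per second, ignoring blanking and compression.
        return w * h * hz

    base = pixel_rate(1920, 1080, 60)
    print(pixel_rate(1920, 1080, 360) / base)  # 6.0
    print(pixel_rate(3840, 2160, 120) / base)  # 8.0

    # At 24 bits/pixel, the raw data rates are roughly:
    print(pixel_rate(1920, 1080, 360) * 24 / 1e9)  # ~17.9 Gbit/s
    print(pixel_rate(3840, 2160, 120) * 24 / 1e9)  # ~23.9 Gbit/s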

It's easier to reach FPS targets incrementally than to reach resolution targets, which come in larger jumps.


Isn't this the entire esports community though?

edit: This was intended to be a serious question. I thought the esports folks were mostly interested in minimizing input latency and maximizing framerate, to the exclusion of most other concerns. ESPN covers esports tournaments now, sometimes on the front page of espn.com, so I thought it was more popular than perhaps it is.


Being a small niche really isn't mutually exclusive with the eSports community, unless you extend the eSports community to everyone who has ever played LoL or CS:GO. I think more people still max out settings in those games than minimise them for the extra FPS.


The driver is really what manufacturers want to sell. We saw this with those awful 13xx-by-768 screens no one wanted: they made loads of them and then had to sell them.

What I suspect is really going on is that it's easier to overclock panels than it is to increase yields at higher resolutions.





