
>I think the core misunderstanding is what is being discussed.

I see your point, and it's certainly valid and the prevailing one, but I think it's worth thinking about the absolute values from time to time, even in this context.

In other contexts it's more obvious; let's use video. Going from Full HD to 4K roughly doubles the resolution in each direction. That means going from about 2 megapixels to 8, a difference of about 6 megapixels.

The next step up is 8K, which goes from 8 megapixels to about 33, a difference of 25.
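
Just to make the arithmetic concrete, here's a rough Python sketch (assuming the standard 16:9 consumer resolutions: 1920x1080, 3840x2160, 7680x4320) that prints the megapixel counts and the absolute jump at each step:

    # Standard 16:9 resolutions for Full HD, 4K UHD and 8K UHD (assumed here).
    resolutions = [
        ("Full HD", 1920, 1080),
        ("4K UHD", 3840, 2160),
        ("8K UHD", 7680, 4320),
    ]

    prev_mp = None
    for name, w, h in resolutions:
        mp = w * h / 1e6  # megapixels
        jump = "" if prev_mp is None else f", +{mp - prev_mp:.1f} MP over the previous step"
        print(f"{name}: {mp:.1f} MP{jump}")
        prev_mp = mp

The ratio at each step is the same (4x the pixels), but the absolute jump grows from about 6 MP to about 25 MP.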

This is a huge jump, and it crosses a relevant threshold: many cameras can't even capture 33 megapixels, and many projectors can't display that many.

If you want to capture 8K video, it doesn't matter whether your camera has twice the megapixel count of the previous one; it needs 33 megapixels. 16 instead of 8 doesn't help (and 100 instead of 40 doesn't really matter either).

On the other hand, if you want to capture 4K, you only need 8 megapixels; whether you have 12, 16, 24 or 40 doesn't really matter.
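
Put differently, capability here is a threshold check against an absolute number, not a ratio. A small illustrative sketch (the helper and the sensor figures are hypothetical, the targets are the standard UHD pixel counts):

    # Whether a sensor can capture a given format is a yes/no check
    # against an absolute pixel count, not a multiple of what you had before.
    TARGET_MP = {"4K": 3840 * 2160 / 1e6, "8K": 7680 * 4320 / 1e6}

    def can_capture(sensor_mp, fmt):
        return sensor_mp >= TARGET_MP[fmt]

    print(can_capture(16, "8K"))  # False: double of 8 MP, still short of ~33 MP
    print(can_capture(40, "8K"))  # True: and 100 MP would be no different
    print(can_capture(12, "4K"))  # True: and so are 16, 24 or 40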

If we return to CPUs, absolute performance matters there too: if you want to play back video without dropped frames, you need a certain absolute level of performance. If your computer gets twice as fast and still can't do it, the doubling didn't matter. Similarly, if your current computer can already do it, doubling its speed won't be noticeable.



