
> That was my point though, to highlight how relative differences in percentages represent vastly different actual performance jumps. It quickly becomes meaningless since it's not the percentages that matter, it is the actual number of transistors.

But it is the percentage that matters? If I have 10B transistors and I add 1B to it, the speedup I can expect is 10%. If I have 1B transistors and add 1B to it, the speedup I can expect is 100%. Tremendous difference. Why would I ever care about the absolute number of transistors added?
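The arithmetic above can be sketched in a few lines; this is a toy model assuming, as the comment implicitly does, that performance scales linearly with transistor count:

```python
def relative_speedup(base_transistors, added_transistors):
    """Fractional speedup under a simple linear performance model (an assumption)."""
    return added_transistors / base_transistors

# Same 1B transistors added, very different relative gains:
print(relative_speedup(10e9, 1e9))  # 0.1 -> 10% faster
print(relative_speedup(1e9, 1e9))   # 1.0 -> 100% faster
```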



>But it is the percentage that matters? If I have 10B transistors and I add 1B to it, the speedup I can expect is 10%. If I have 1B transistors and add 1B to it, the speedup I can expect is 100%. Tremendous difference. Why would I ever care about the absolute number of transistors added?

Because it's the transistors that actually matter for performance.

If a unit of work (number of files compiled, number of triangles generated, or whatever else) takes 1 billion transistors to complete in 1 second, then adding 10% to a 10-billion-transistor chip gains you exactly as much work per second as adding 100% to a 1-billion-transistor chip: 1 billion transistors' worth either way.
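As a sketch, again under the (assumed) linear model where 1e9 transistors buy one unit of work per second:

```python
def work_per_second(transistors):
    # Assumed linear model: 1e9 transistors = 1 unit of work per second.
    return transistors / 1e9

# Absolute gain from adding 1e9 transistors is identical in both cases:
gain_small_base = work_per_second(2e9) - work_per_second(1e9)    # the +100% case
gain_large_base = work_per_second(11e9) - work_per_second(10e9)  # the +10% case
print(gain_small_base, gain_large_base)  # 1.0 1.0
```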

How much performance you need to feel a difference in a given workload is a separate point, and note that usually the workload changes with the hardware. Compiling the linux kernel in 2025 is a different workload than it was in 2005, for example, and running quake 3 is a different workload than cyberpunk 2077.

If you play a modern game, you don't notice a difference between a 10 year old GPU and a 12 year old GPU - even though one might be twice as fast, they might both be in the single digits of FPS, which feels equally useless.
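A rough illustration with made-up numbers (the FPS figures and the 30 FPS playability threshold are assumptions, not from the thread):

```python
PLAYABLE_FPS = 30  # a commonly cited rough threshold, assumed here

older_gpu_fps = 4  # hypothetical 12-year-old card
newer_gpu_fps = 8  # hypothetical 10-year-old card

print(newer_gpu_fps / older_gpu_fps)  # 2.0 -- twice as fast on paper
# ...but neither clears the bar, so the doubling is imperceptible in practice:
print(newer_gpu_fps >= PLAYABLE_FPS, older_gpu_fps >= PLAYABLE_FPS)  # False False
```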

So we gain more from the hardware than we used to, but since the software is doing more work we're not noticing it as much.


I legitimately just do not see the utility of framing the topic this way.

Benchmarking efforts usually involve taking the same amount and type of work, then comparing runtime durations or throughputs across different hardware.

Rasterization performance benchmarks for the 5090 revealed exactly the same +20% difference we see in transistor count. This is why I do not see the utility in remarking that in absolute terms we're adding more transistors than ever, because this is basically never what matters in practice. I have a set workload and I want it to go some amount faster.

Software sprawl is no doubt an issue, but that on its own is a separate discussion. It bears little relation to the absolute vs. relative differences discussion we're having here.

> How much performance you need to feel a difference in a given workload is a separate point

It was exactly the point I said at the start should be the point of focus. Maybe we're talking past one another, I don't know.


>It was exactly the point I said at the start should be the point of focus. Maybe we're talking past one another, I don't know.

I think we might be - we do agree that the symptom is that we don't experience the same gains now as we used to, and that this is a problem.

My issue is the notion that this is caused by a slowdown in performance gains from the hardware side, when this is clearly not the case. A common complaint is along the lines of "we only got 30% when last time we got 50%", which completely ignores that the recent 30% is way more actual new performance than the earlier 50%.
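That complaint can be checked with arithmetic; the generation sizes here are made up for illustration:

```python
# Hypothetical baselines in arbitrary performance units (assumptions, not data):
earlier_base, earlier_gain_pct = 100, 50   # the "+50%" generation
later_base, later_gain_pct = 1000, 30      # the "+30%" generation

earlier_absolute = earlier_base * earlier_gain_pct / 100  # 50 units added
later_absolute = later_base * later_gain_pct / 100        # 300 units added

# The smaller percentage on the larger base is the bigger absolute jump:
print(later_absolute > earlier_absolute)  # True
```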

>I legitimately just do not see the utility of framing the topic this way.

IMO it's always useful to identify the actual reason for a problem and think about the fundamentals.

If the problem is that we're not experiencing the performance gains, we should be asking ourselves "Why does software feel slower today despite the hardware having 10x more performance?"

Instead we complain about the hardware for not managing to add the equivalent of all previous performance gains every 2 years, because Moore's law observed that it did so in the beginning of the chessboard (so to speak).

Instead of wondering whether Moore's law is dying or not, we should question why Wirth's law seems to be immortal! ;)



