
You seem to be completely confused about why absolute vs relative matters. Moore's law literally states: "[Moore's law] is the observation that the number of transistors in an integrated circuit (IC) doubles about every two years".

This is literally a relative measurement. You cannot reason about Moore's Law in absolute changes.

The other poster has laid it out for you in the simplest terms: a 1 billion transistor increase could mean anything. It could be a 1000% improvement - which is absolutely humongous - or a 10% improvement, which is basically irrelevant. If you want to measure how impactful an increase is, you have to look at relative change. 1 billion transistors on its own means nothing. It is only interesting with respect to the number of transistors in the previous generation - which is a relative-change measurement.

Say we are at Generation 1 with 100 billion transistors. By your reasoning, if we add 1 billion more transistors, that's big, because 1 billion transistors is a lot. But this is absolutely incorrect! Because we started out with 100 billion transistors, that's only a 1% change, which is practically irrelevant.
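
To put rough numbers on it (hypothetical baselines, purely for illustration), here's a tiny sketch of why the same absolute increase reads completely differently depending on the starting point:

    # Hypothetical baselines, chosen only to illustrate the point.
    def relative_change(old, new):
        # Percentage change from old to new.
        return (new - old) / old * 100

    # The same absolute increase of 1 billion transistors...
    print(relative_change(0.1e9, 1.1e9))   # +1B on a 0.1B chip -> 1000.0 (%)
    print(relative_change(100e9, 101e9))   # +1B on a 100B chip -> 1.0 (%)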



>You seem to be completely confused about why absolute vs relative matters.

[..]

>This is literally a relative measurement. You cannot reason about Moore's Law in absolute changes.

That is exactly my point. I said the same thing as you: "I think Moore’s law should be avoided altogether when discussing progress in this area"!

Performance is an absolute thing, not a relative thing. Amount of work per second, not percentage-over-previous!

Doubling means that each step delivers more than the sum of all previous steps combined. Every step adds more performance than any preceding step did.

If a generation gives 60% more performance and all previous generations gave 100%, the latest jump will still be the one that added the most performance.
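
To make that concrete with made-up numbers (arbitrary units, purely illustrative):

    # Made-up performance figures in arbitrary units, purely illustrative.
    perf = [100, 200, 400, 800]     # every generation so far doubled (+100%)
    perf.append(perf[-1] * 1.6)     # the newest generation adds "only" 60%

    gains = [b - a for a, b in zip(perf, perf[1:])]
    print(gains)  # [100, 200, 400, 480.0] -> the 60% step is still the largest absolute gain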

I think this is often overlooked and worth mentioning. People are complaining about performance jumps that are, in absolute terms, the biggest ever seen, because they are thinking in percentages. It makes no sense.


I disagree on this: "Performance is an absolute thing"

I think the core misunderstanding is what is being discussed. You say "Performance is an absolute thing, not a relative thing. Amount of work per second, not percentage-over-previous!"

But nobody here is concerned with the absolute performance number. Absolute performance is only about what we can do right now with what we have. The discussion is about the performance improvements and how they evolve over time.

Performance improvements are inherently a relative metric. The entire industry, society and economy - everything is built on geometric growth expectations. You cannot then go back to the raw metric and say "ah, but what really matters is that we can now do X TFLOPS more"; when that number is a mere 10% improvement, nobody will care.

Built into everything around computing is the potential for geometric growth. Startups, AI, video games - it's all predicated on the expectation that our capabilities will grow in geometric fashion. The only way to benchmark geometric growth is with % improvements. Again, it is wholly irrelevant to everyone that the 5090 can do 20 TFLOPS more than the 4090. 20 TFLOPS is some number, but whether it is a big improvement or not - i.e. whether it is expanding our capabilities at the expected geometric rate - is not possible to divine from that number alone.

And when we look it up, it turns out that we went from roughly 80 TFLOPS to 100 TFLOPS, which is a paltry 25% increase - well short of the expected performance growth from Nvidia, which is in the 50% region.

20 TFLOPS might or might not be a lot. The only way to judge is to look at how much of an improvement it is relative to what we had before.
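
A quick sketch with those rounded figures (80 and 100 TFLOPS are approximate, and the ~50% expectation is my characterization, not an official target):

    # Rounded, approximate figures from the discussion above; not official specs.
    prev_tflops, new_tflops = 80, 100
    absolute_gain = new_tflops - prev_tflops            # 20 TFLOPS
    relative_gain = absolute_gain / prev_tflops * 100   # 25.0 (%)

    expected = 50  # rough expected generational gain, in percent
    print(f"+{absolute_gain} TFLOPS is a {relative_gain:.0f}% gain, vs ~{expected}% expected")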


>I think the core misunderstanding is what is being discussed.

I see your point and it's valid for sure, and certainly the prevailing one, but I think it's worth thinking about the absolute values from time to time even in this context.

In other contexts it's more obvious; let's use video. Going from Full HD to 4K is roughly a doubling of resolution in each direction. It means going from about 2 megapixels to 8, a difference of 6.

The next step up is 8K, which means going from 8 megapixels to 33. A difference of 25.

This is a huge jump, and it crosses a relevant threshold - many cameras can't even capture 33 megapixels, and many projectors can't display that many.

If you want to capture 8K video, it doesn't matter whether your camera has twice the megapixel count of the previous one - it needs to have 33 megapixels. If it has 16 instead of 8, it doesn't matter (and it doesn't really matter if it has 100 instead of 40).

On the other hand, if you want to capture 4K, you only need 8 megapixels. Whether you have 12, 16, 24 or 40 doesn't really matter.
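
For reference, the rough arithmetic (standard UHD resolutions; the megapixel counts above are rounded):

    # Standard consumer resolutions; megapixel counts in the text are rounded.
    resolutions = {
        "Full HD": (1920, 1080),   # ~2.1 MP
        "4K UHD":  (3840, 2160),   # ~8.3 MP
        "8K UHD":  (7680, 4320),   # ~33.2 MP
    }
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} MP")
    # Absolute steps: ~6 MP from Full HD to 4K, ~25 MP from 4K to 8K.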

If we return to CPUs, absolute performance matters there too - if you want to play back video without frame drops, you need a certain absolute performance. If your computer is twice as fast and still can't do it, the doubling didn't matter. Similarly if your current computer can do it, doubling it again won't be noticeable.
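
Put differently, with made-up numbers (the point is the threshold, not the figures): a relative speedup only matters if it pushes you across an absolute requirement.

    # Hypothetical workload: some absolute performance level is required
    # for dropless playback; relative gains only matter if they cross it.
    required = 100
    for current in (30, 60, 120):
        doubled = current * 2
        crossed = current < required <= doubled
        print(f"{current} -> {doubled}: doubling {'mattered' if crossed else 'did not matter'}")
    # 30 -> 60: still below the requirement
    # 60 -> 120: crosses it, so this doubling is the one you notice
    # 120 -> 240: already above it, the extra headroom isn't noticeable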



