It means a lot to me, because cutting power consumption in half for millions of devices means we can turn off power plants (in aggregate). It’s the same as lightbulbs; I’ll never understand why people bragged about how much power they were wasting with incandescents.
>cutting power consumption in half for millions of devices means we can turn off power plants
It is well known that software inefficiency doubles every couple of years; that is, the same scenario comes to require 2x as much compute once you consider the entire software stack (not a disembodied algorithm, which will indeed be faster).
The extra compute gets spent on a more abstract UI stack or on new features, unless physical constraints force otherwise (e.g. the inefficient batteries of early smartphones), which is not the case at present.
That's weird - if software gets 2x worse every time hardware gets 2x better, why did my laptop in 2010 last 2 hours on battery while the current one lasts 16 doing much more complex tasks for me?
Elsewhere in the comments, it is noted that Apple's own battery-life estimates are identical despite allegedly 2x better hardware.
As an aside, 2 hours is very low even for 2010. There's a strong usability advantage in going to 16, but going from 16 to 128 won't add as much. The natural course of things is to converge on a good-enough number and 'spend' the rest on more complex software, a lighter laptop, etc.
I have dimmable LED strips around my rooms, hidden by cove molding, reflecting off the whole ceiling, which becomes a super diffuse, super bright “light”.
I don’t boast about power use, but they are certainly hungry.
For that I get softly diffuse lighting with a max brightness comparable to outdoor clear-sky daylight. Working from home, this is so nice for my brain and my depression.
First, only CPU power consumption is reduced, not that of other components; second, I doubt tablets contribute significantly to global power consumption, so I don't think any power plants will be turned off.
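For scale, here is a minimal back-of-envelope sketch in Python. Every figure in it is an assumed placeholder rather than a measurement, and it halves total device draw instead of only the CPU's share, so it overstates the savings:

    # Rough estimate of aggregate savings from halving per-device power.
    # All numbers below are assumptions for illustration, not measurements.
    devices = 100e6          # assumed number of active tablets
    avg_draw_w = 5.0         # assumed average draw per device while in use (W)
    hours_per_day = 2.0      # assumed hours of use per day
    plant_mw = 500.0         # assumed output of a mid-size power plant (MW)

    # Average continuous load of the whole fleet, converted to MW
    fleet_mw = devices * avg_draw_w * (hours_per_day / 24) / 1e6
    savings_mw = fleet_mw / 2  # halving device power

    print(f"fleet average load: {fleet_mw:.0f} MW")
    print(f"savings from halving: {savings_mw:.0f} MW "
          f"(~{savings_mw / plant_mw:.2f} of a {plant_mw:.0f} MW plant)")

With these assumed inputs the savings come out to a few tens of MW, a small fraction of one plant; whether that generalizes to "turning off power plants" depends on how many device classes see the same improvement.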