I am wondering why no one at Intel/AMD/Asus/Dell, Microsoft, etc. has realized that there is low-hanging fruit to be had in the battery-life department on current x86 hardware.
The reviewer, with far more limited resources than the PC industry giants, still manages to discover that Windows boosts a single core to its max TDP for single-core browsing workloads, resulting in very high power consumption. Basically, a scheduling strategy from desktops and servers used as-is on laptops! And easily fixable in software!
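As a rough illustration of the kind of software-side knob involved (not necessarily the exact fix the video has in mind): Windows exposes processor boost behavior through `powercfg`, so a hedged sketch of reining in boost-to-TDP while on battery might look like this. The aliases below (`SUB_PROCESSOR`, `PERFBOOSTMODE`, `PROCTHROTTLEMAX`) are standard power-setting aliases; the specific values are illustrative.

```shell
# Sketch: tame single-core turbo boost while on battery (run as admin).
# PERFBOOSTMODE 0 = boost disabled on DC (battery) power.
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PROCESSOR PERFBOOSTMODE 0
# Optionally cap the maximum processor state on battery (e.g. 80%).
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 80
# Re-apply the active scheme so the new values take effect.
powercfg /setactive SCHEME_CURRENT
```

Whether this recovers the same savings the reviewer measured is an open question, but it shows the lever already exists in software.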
SoC power consumption is the remaining bottleneck - that's not as easy, but if CPU/SoC/BIOS vendors and OEMs came together it would not be that much harder to fix.
These two fixes would put x86 into a very competitive place in the laptop battery-life department - it's not ARM vs x86, it's the integration pieces. Mind-boggling that nothing is being done by PC makers.
It's not that surprising. Have you ever been in a meeting with members from multiple companies? You can hear a pin drop, they are all afraid of spilling secrets or just deferring to management (which is usually also in the room since those tend to be high profile meetings). The only way to get this done is to do it in-house like Apple does. I think MS has sincerely pushed on this with the Surfaces but I don't think they control enough of the vertical below chipset level to match Apple.
Basically, Intel and MS would have to merge, and on a deep cultural level. That's what Apple has.
The "we own the hardware and software" is a HUGE HUGE HUGE advantage for Apple. I don't know how you get to the battery life and basic stability levels of a Mac without that integration.
It's a similar situation to Android vs. iPhones in terms of device integration.
Although iPhones don't really get much better battery life than the best Android devices around, if you compare battery life and performance against battery size, the amount of runtime they extract out of comparably small batteries is extraordinary. The biggest/best Samsung phone is 5000mAh, whereas the biggest/best iPhone is 4300mAh.
You just throw away efficiency at a system level when you design a phone based on another company's design (i.e., Qualcomm, usually, for Samsung). Qualcomm doesn't have appropriate incentives to make a device as integrated/efficient as Samsung/Apple do.
> it's not ARM vs x86, it's the integration pieces. Mind boggling that nothing is being done by PC makers.
Intel spent over a decade and billions of dollars to make a mobile x86 chip that could compete with ARM on battery life and they failed. And now they're getting taken to the cleaners by an ARM chip for laptops. I don't think it's the integration that's the problem, x86 is just too inefficient relative to ARM.
When it's more efficient to emulate x86 on an ARM chip than to use a native x86 processor you have to ask some questions about the ISA and design.
Like, in what way, and how much less efficient, in 2022? Would it still be that way when Intel finally gets to 5nm/3nm? Is the video reviewer lying or mistaken when he says the two biggest differences that make up most of the power draw between x86 and ARM are not in the ISA?
> Is the video reviewer lying or mistaken when he says the two biggest differences that make up most of the power draw between x86 and ARM are not in the ISA?
Naive is a better word. The question is really, why don't designers (who have a vested interest in making a feature like battery life better to improve sales) take these "obvious" actions to improve their products? The answer is usually because they're not sound optimizations, or at least come with some heavy tradeoffs.
That's not a logical/technical argument at all - it's just further lazy speculation to claim that the only reason designers don't take these obvious actions to improve their product is that "the x86 ISA sucks". As pointed out, it is hard enough to get obvious things done when there is one company and three org units involved - let alone a CPU vendor, an OEM, a BIOS vendor, and an SoC with 10 other chips from different OEMs. Oh, and after that, the OS vendor.
None of those are ISA (https://en.wikipedia.org/wiki/Instruction_set_architecture) issues - they are firmware/OS/SoC issues. The other issue is the manufacturing process - AMD is only recently getting to 5nm, and the resulting drop in power consumption is already telling.
IOW ARM doesn't have some magical ISA level stuff to do more stuff with less power or avoid doing stuff entirely that x86 couldn't replicate.
Straight-line performance per watt is not the problem IMO. The problem is something along the lines of sleep/wake states and how MS + Intel can’t seem to get it together to handle the times when the CPU doesn’t need to be working.
I think the root issue on sleep is Microsoft/vendor-induced - this only really began when Microsoft introduced connected/modern standby and started pushing OEMs to remove BIOS/UEFI support for S3 sleep.
It seems unfathomable - maybe there is a valid reason for modern standby, but I've yet to find it. S3 resume is quick on a good laptop with good firmware, while modern standby seems to drain the laptop battery quickly for some reason - and this has been going on for years now.
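For anyone who wants to check which side of this their own laptop is on: Windows ships diagnostics for exactly this, so a quick (hedged) way to see whether a machine uses S3 or Modern Standby, and what drained the battery overnight, is:

```shell
# List the sleep states this system supports:
# classic "Standby (S3)" vs "Standby (S0 Low Power Idle)" (Modern Standby).
powercfg /a
# Generate an HTML report of recent standby sessions and their drain (admin).
powercfg /sleepstudy
# A broader energy/battery report can also help spot misbehaving components.
powercfg /energy
```

The sleepstudy report in particular attributes standby drain to specific devices and software, which is the data OEMs would need to actually fix this.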
Maybe standby drain isn't part of the Windows logo requirements and ought to be?
From what I've heard this is mostly on board designers and the complexity of how x86 handles its multiple levels of sleep/power states. The conservative approach is taken to avoid bricking devices on wake.
I wouldn't blame MS for this. Linux is in the same, if not worse, state.
> it's not ARM vs x86, it's the integration pieces
Eh, I'm not sure how true that is. There are no doubt improvements that could be made, but, well, compare battery life on an ARM Mac to a related x86 Mac; it is night and day.
I am not sure if you watched the video - but the improvements in the two areas he points out (stop boosting a single core to TDP, and power down more unused SoC I/O components at idle) already get x86 very close to M1 power consumption.
> but, well, compare battery life on an ARM Mac to a related x86 Mac; it is night and day.
If I turn the screen all the way down so nothing can be seen, I can peg the processor constantly at 100% of my Intel Mac and still get 10 hours of battery life. The only caveat is that it's a 2010 Core 2 Duo, and what it does in 10 hours an M1 can probably do in 10 minutes.
Mac was already beating Windows machines on x86 when it came to battery life (especially when you consider battery size too - the actual efficiency was much higher). I don't know by how much they beat them on claimed battery life (because IME Apple is way, way more honest than PC vendors about how long their laptops run on battery), but real-world it was like 1.5-2x for a same-sized battery on similarly powerful hardware.
Their iOS devices were also wildly more efficient than Android devices for years (may still be—I haven't had much exposure to Android devices in three or four years) despite both being on the same processor architecture, so it wasn't the architecture per se making them more efficient.
Apple just seems to be the only major vendor that cares much about power efficiency at a software level. There's absolutely a ton of room for improvement on x86—Apple proved it when they were still on that architecture.
What the OS needs is a way to tell CPU-hungry applications no. One misbehaving app (looking at you, Teams) is enough to pull the CPU to max power consumption and wreck the battery life.
Just keeping the clocks low and letting banner ads rotate a little more slowly would make for much happier users. Even if I set the priority of an app to minimum, the OS still does its best to respond to its CPU demands.
On macOS, apps can specify a desired Quality of Service per thread, so background threads get scheduled onto the power-efficient E-cores. I don't know if Windows has any similar mechanism, and POSIX threads certainly don't.
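The same QoS machinery is even reachable from the macOS shell: `taskpolicy(8)` can launch or re-tag a process at background QoS, which on Apple Silicon tends to keep it off the P-cores. A hedged sketch (the exact core placement is ultimately up to the kernel's scheduler, and the PID below is a placeholder):

```shell
# Run a long build at background QoS so it yields the performance cores.
taskpolicy -b make -j8
# Re-tag an already-running process by PID (1234 is a placeholder).
taskpolicy -b -p 1234
```

This is roughly the "tell CPU-hungry apps no" control the parent comment is asking for, just opt-in rather than enforced by the OS.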
> Mind boggling that nothing is being done by PC makers.
Correction: nothing is being done by Microsoft. They own Windows, they own the software, yet they decided to go full bloatware. To Microsoft, the people who complain are "trolls".
And this video is the result: mediocrity.
For some reason, Microsoft always dodges criticism and responsibility - that's what's mind-boggling.
They only dodge it with those who still loyally use Windows. Those of us who 'escaped' that hellscape are fully aware of what an utter pile of bloated crap it's become. No idea why anyone tries to defend it at this point; it's a shockingly poor operating system and stands out as easily the worst product (software or hardware) that Microsoft currently offers.
This is a weird take IMO. I use PC and Mac daily (primarily Mac). Windows works fine in my view. There is really nothing that impacts my day to day work other than some great third-party software that is missing on the PC side.