Wow, how crazy is it that every scientific study by Microsoft happens to prove the superiority of their products! What a lucky coincidence, since we all know they would never fudge the results.
To be fair, the study's authors make it clear in the PDF that it was commissioned at Microsoft's request, and they document the other requests MS made and how the test beds were configured. It looks fairly neutral and can be replicated.
They do not make recommendations or draw conclusions for MS; it is MS that interprets the study's results. If you look at the power draw, the difference between browsers is small, though IE does have a slight advantage. Also, the study was done by Kurt Roth of the MIT-Fraunhofer Center for Sustainable Energy, and looking at the PDF, it shows.
I have no doubt the study is legit and nothing is 'fudged' beyond their presentation of the graphs and the other things their marketing department has done to promote the study.
The obvious truth is that Microsoft will only commission a study on something they've already proven to themselves internally. As will every other company on the planet.
According to top, Chrome is behaving very well here. CPU load is below 10% as I type and memory footprint, although not negligible, is well within what I would consider reasonable (under 1 GB for 4 Gmail tabs and a couple less clever sites). Pegging two cores seems excessive. What are you running?
Interesting... maybe it's something specific to Chrome on Windows or, possibly, a plugin. I'm running it both at home and at work and it stays happy even after a week on lesser machines.
That's actually a great analogy, because Microsoft and an onion share a lot of similarities: layers and layers of bureaucracy, products that taste sour and marketing that makes your eyes sting.
There is at least a slight chance that the non-zero baseline represents power consumption at the desktop with no browser running, so that the heights on the chart depict just the power draw attributable to the browser (it may also be a baseline relative to showing about:blank). 15W+ even for simple stuff like Wikipedia seems pretty high to have the base load already subtracted out, unless this is just a measurement of the peak power draw during rendering (in which case we'd need to integrate over time to get Joules instead of Watts).
EDIT: reading the PDF, this seems to be the case: they claim an at-desktop baseline of 14.7W average for the six notebooks they tested. Actual baselines for the notebooks tested varied from 9.76W to 22.65W, and the most strenuous test adds 3.11W-6.92W depending on the machine (and oddly enough, it wasn't always YouTube that had the highest power draw - on one machine, it was Bing, if the data is to be believed).
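To put those numbers in perspective with some back-of-the-envelope math of my own (the battery size is assumed, not from the study): with a 60 Wh battery, a 14.7 W idle baseline gives you roughly 4 hours, while adding 5 W of browser load brings you to about 19.7 W, or roughly 3 hours. So under constant load, the browser's share of the draw is worth something like an hour of runtime.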
Hah!
Great find. When I took a technical communication course, we explicitly talked about the ethical implications of misleading the audience by manipulating information and the perception of information.
Needless to say, the authors did a poor job of following standard ethical practices when presenting information to support their claim.
How is that a misleading Y-axis? The intervals are the same; the only thing that differs is the resolution, and zooming in to that finer resolution amplifies the difference. I wouldn't call it misleading in the slightest.
The Y-axis should've started at 0 watts, not at (a rather arbitrary) 18 watts.
Look at the "Flash Video" and "HTML5 Video" bars in the original chart. The blue IE bars are roughly half the size of the green Chrome bars, which might lead you to believe that IE only consumes roughly half of the energy that Chrome consumes in those tests. But as the numbers show, the difference isn't nearly that great.
Starting the Y-axis at 18 watts only serves to make IE's advantage over the other browsers look greater than it actually is.
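To make that concrete with made-up numbers: if the chart floor is 18 W and two browsers actually draw 20 W and 22 W, the bars get drawn 2 units and 4 units tall, so a real difference of about 10% looks like a 2x difference.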
The y-axis should have kept the baselines that the study authors used in their graphs (14.7W for laptops, 37.8W for desktops), since those starting points are what gives you the power draw due to the browser. What the Microsoft blog did by moving the baselines up from there was unethical, but having the chart show power consumption relative to idle instead of powered down is perfectly reasonable.
Yep. IE may be more efficient than using a different browser on Windows, but it can't hold a candle to most Android and iOS devices, and there aren't many Windows laptops that can beat Apple's in tests like this.
However, this is mostly caused by poor drivers and firmware, and both Microsoft and Intel have lately been pressuring the rest of the PC ecosystem to clean up their act so that devices like ultrabooks can be viable. The resulting improvements will help other browsers just as much as IE.
Oh really? And what OS is IE mostly used on? Let me answer that for you: Windows. I think there is lower-hanging fruit than browser power optimization that could save battery life on a Windows-powered machine.
Anyone got independent verification? What about across a representative sample of sites and apps? As it stands, this seems too easily cooked to be accepted at face value.
And admittedly, it is documented in detail, including the sites used and the test bed.
The difference in power draw when running IE, Mozilla, and Chrome is small, but it is there, in both the Flash and bare-bones tests. I don't know how big a difference it will make on portable computers, though.
In other news, the material used to cover seats in a BMW has the highest¹ thread-count of all luxury car models.
____
1. Totally made this up. The point being that nobody chooses a car based on the thread-count of the material in the seats. And nobody chooses a browser based on the power draw.
I agree that this is a metric that will hardly matter to anybody IRL. Maybe when it comes to mobile devices (Surface?) and limited battery life, but I'm not sure how relevant that even is (I wouldn't think that surfing the web draws most power on a Surface).
It's not that hard to optimize for power draw in software if you're willing to do the work to measure and tune. Lots of simple optimizations can produce huge gains, whether it's using the correct timer APIs to allow the OS to coalesce timer events (Raymond Chen wrote a blog post about how this allows the CPU to stay in sleep mode longer on Windows by serving all timers on say, 1000ms intervals instead of 999ms/1000ms/1001ms, etc), intentionally clamping your rendering rate to 30hz instead of 60hz, using hardware vsync instead of trying to do vsync in software (hardware vsync is often much cheaper because it can rely on the GPU clock/interrupts instead of busy-waiting or CPU timers), or simply finding ways to do less unnecessary work.
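To make the timer-coalescing point concrete, here's a minimal Win32 sketch of my own (not taken from that blog post) using the tolerable-delay parameter of SetWaitableTimerEx, which is one way to let the kernel batch timer expirations; the 1000 ms period and 50 ms tolerance are just illustrative:

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        // Auto-reset waitable timer (second argument FALSE).
        HANDLE timer = CreateWaitableTimer(nullptr, FALSE, nullptr);
        if (!timer) return 1;

        LARGE_INTEGER due;
        due.QuadPart = -10000000LL;  // first expiration in 1 s (100 ns units, negative = relative)

        // Period of 1000 ms with a tolerable delay of 50 ms (Windows 7+): the kernel
        // is allowed to fire this timer up to 50 ms late so it can coalesce it with
        // other timers and keep the CPU in a low-power state longer between wakeups.
        if (!SetWaitableTimerEx(timer, &due, 1000, nullptr, nullptr, nullptr, 50))
            return 1;

        for (int i = 0; i < 5; ++i) {
            WaitForSingleObject(timer, INFINITE);
            std::printf("tick %d\n", i);
        }

        CloseHandle(timer);
        return 0;
    }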
The interesting thing though is that power draw can run counter to other goals. Getting the best possible framerate can often mean higher power draw, even though a higher framerate usually means your code is running 'faster' - if you go from running on one core to four cores, you are probably increasing the CPU's power draw (and heat dissipation too, which means more power spent running the fans). But maybe if running on four cores lets you finish early enough, you can put the whole app to sleep while you wait for the next vsync, saving power. Ultimately, you have to test this stuff, then tune, then test again.
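Here's a minimal sketch of that "finish early, then sleep until the next vsync" idea, assuming a Windows/DXGI setup (error handling omitted, and the per-frame work is just a placeholder):

    #include <dxgi.h>
    #pragma comment(lib, "dxgi.lib")

    int main()
    {
        IDXGIFactory* factory = nullptr;
        IDXGIAdapter* adapter = nullptr;
        IDXGIOutput*  output  = nullptr;

        CreateDXGIFactory(__uuidof(IDXGIFactory), reinterpret_cast<void**>(&factory));
        factory->EnumAdapters(0, &adapter);   // first GPU
        adapter->EnumOutputs(0, &output);     // first attached display

        for (int frame = 0; frame < 300; ++frame) {
            // ... do all of the frame's work here, as fast as possible
            //     (possibly across several cores) ...

            // Then block on the hardware vertical blank. The thread sleeps, so the
            // CPU can drop into a low-power state instead of busy-waiting or
            // polling a software timer until the next frame is due.
            output->WaitForVBlank();
        }

        output->Release();
        adapter->Release();
        factory->Release();
        return 0;
    }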
I expect IE's power savings are mostly due to its architecture. IIRC they do a lot of their isolation via threads instead of processes, which means they can use much cheaper communication mechanisms between their isolation zones, and that's going to add up. I expect their rendering architecture for plugins is also significantly more efficient than NPAPI (though I wonder if it can beat PPAPI), since they have the ability to more or less dictate how things should be done, and ActiveX plugins have been using GDI/DirectX for ages, automatically tapping into hardware surfaces and acceleration. IE9/IE10 also led the way in terms of fast, pervasive hardware acceleration for browser rendering - while Firefox and Chrome had HW accel first in certain scenarios, IE was early to the game when it comes to using hardware to render entire pages.
P.S. People who say power draw doesn't matter in a browser have obviously never used a laptop or a cell phone before. Do you think Apple and Google don't try to optimize power draw on iOS and Android devices?
"In addition, at the request of Microsoft we set the JavaScript timer frequency to “conserve power” in
the Windows power options. We found, however, that the default Javascript time frequency for all
computers tested was set to “maximum performance.” We did not investigate the impact of this setting
upon browser power draw."
So yeah, IE got some optimization applied, and unfairly so, since the other browsers were run with default settings.
No, the default power settings on a laptop do that if it is put in power-saver or Balanced mode. It is not an IE-specific option, but a Windows OS option.
"IE9/IE10 also led the way in terms of fast, pervasive hardware acceleration for browser rendering - while Firefox and Chrome had HW accel first in certain scenarios, IE was early to the game when it comes to using hardware to render entire pages."
That would explain why the power draw for the other browsers fluctuated depending on which websites were surfed, while IE seemed fairly consistent.
This is a very important benchmark, provided framerates and rendering quality are held mostly the same. It could easily grow to include cases like having 100 tabs open, which currently don't really appear in any benchmarks.
For almost all my devices, this is the one I care about most.
Amdahl's law. You can get the browser itself down to nothing and it's totally futile because as soon as you open a page Flash Player will still peg the CPU.
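To put illustrative numbers on it: if Flash accounts for, say, 80% of a page's power-hungry work and the browser engine for the other 20%, then even cutting the browser's share to zero saves at most 20% - that's the ceiling Amdahl's law puts on optimizing only one component.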
They did test playing videos through both Flash and HTML5, and found that power draw still varied significantly depending on browser, with IE drawing 30-40% less power than Chrome (after subtracting out at-desktop idle power). So there's definitely still low-hanging fruit for the browser itself.
Playing video isn't what causes Flash Player to be a CPU suck. Video playback is the responsibility of the video codec, which is probably using the GPU anyway. You can play HD video on a Pentium 4, and if you look at the tests, the consumption for Flash Player playing video wasn't preposterously higher than it was for HTML5, which is obviously uncharacteristic of Flash Player's normal behavior.
The problem with Flash Player is that people create busy loops in ActionScript, especially in ads. You open a dozen tabs, one of them has one of these ads, your CPU is pegged. If you're worried about "browser power consumption" and your first step is not to shoot Flash Player in the head, you're doing it wrong.
If you're running Windows on a laptop, the argument here is that using IE as the browser will make your laptop last longer on a charge than other competing browsers.
The graph linked[1] does not show that much of a difference between the browsers - the difference appears exaggerated because the scale on the Y-axis does not start at zero. Thus, it is very likely that whatever power savings you derive will be negligible and won't make a difference in the grand scheme of things.
It looks like it was with 3 tabs open. It obviously needs more testing, since the difference between the three browsers is small. The study can be easily replicated, as the procedure is described in detail in the study's PDF.
I doubt that text-based browsers have been very good about incorporating the latest advancements in JavaScript engines, so they could easily lose there.
Yes, but the article specifically states that the efficiency is gained by better utilization of PC hardware. I suppose if you're using your laptop exclusively for browsing it might make a difference, but switching to a tablet would give you even more battery efficiency.
Momentum? In an organization as large as MS, a project can go on indefinitely because there is a ton of money to back it up. Cancelling something as big as IE would probably take the CEO.
I think it would be more likely they would use one of the OSS engines out there and rebuild IE on top of a fork.