Since spotting a major Linux kernel power regression last year and subsequently tracking down the cause of the problem that affected a large number of mobile Linux users, plus other regressions, it's been fun to look at the Linux power performance situation. How, though, is the latest Ubuntu Linux code performing when it comes to power efficiency? Here are some early tests of Ubuntu 12.10.
I'm not sure if I'm right, but wouldn't it be interesting to run the consumption benchmark from a different point of view?
Here, you are measuring the average power draw during the test, not the total energy consumed.
Wouldn't it be interesting to measure, for example, the total energy needed to:
- Encode one MP3 file
- Compile something
- Play the Xonotic demo scene
- Play the OpenArena demo scene
and so forth.
What I understand is that you are actually measuring the power draw at a given time t, not the total energy consumed. I don't care that my desktop draws more power if it finishes the task quicker and therefore consumes less energy in the end.
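The point above comes down to energy versus power: a task's energy cost is average power multiplied by run time, so a machine drawing more watts can still win on total joules if it finishes sooner. A minimal sketch, with made-up figures purely for illustration:

```python
# Energy (joules) = average power (watts) x run time (seconds).

def energy_joules(avg_power_w, duration_s):
    """Total energy consumed to complete one task."""
    return avg_power_w * duration_s

# Hypothetical MP3-encode workload on two hypothetical machines:
fast_desktop = energy_joules(avg_power_w=80.0, duration_s=30.0)  # 2400.0 J
slow_laptop = energy_joules(avg_power_w=35.0, duration_s=90.0)   # 3150.0 J

# The desktop draws more power at any instant, yet consumes less
# energy overall because it finishes the encode three times faster.
print(fast_desktop, slow_laptop)  # 2400.0 3150.0
```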
Power consumption is such a multi-horned beast. It's good that you are able to maintain the same test system throughout, but pinning down the wasteful components can be a challenge. I have a ProBook with Sandy Bridge graphics, and I was able to lower consumption with some additional tweaking. By adding some extra commands at boot and then running PowerTop, I was able to drop consumption by a few watts, which is a rather sizable improvement. Interestingly enough, I am also able to run Snow Leopard on it, which yields 3 hours of battery life, while stock Ubuntu 12.04 gets about 4 hours and a "tweaked" 12.04 gets 4:30. I don't know what Windows 7 would do, as I have never tried it.
Everything I read about power optimization tweaking in Ubuntu always warns of potential issues, so perhaps these additional gains are left disabled for the sake of stability?
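For what it's worth, the "extra commands at boot" mentioned above were, in that era, typically kernel parameters like the ones below (shown as a GRUB config snippet). Whether they help or cause glitches depends on the exact hardware, which is presumably why they were not enabled by default:

```shell
# /etc/default/grub -- commonly-cited Sandy Bridge power tweaks circa 2012.
#   i915.i915_enable_rc6=1  : enable the GPU's RC6 deep-sleep state
#   i915.i915_enable_fbc=1  : enable framebuffer compression
#   i915.lvds_downclock=1   : downclock the laptop panel when possible
#   pcie_aspm=force         : force PCI Express Active State Power Management
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash i915.i915_enable_rc6=1 i915.i915_enable_fbc=1 i915.lvds_downclock=1 pcie_aspm=force"

# After editing, regenerate the GRUB config and reboot:
#   sudo update-grub
```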
I just realized this, but I think watts-per-FPS would be more useful data than FPS-per-watt. The reason is that with watts-per-FPS, there's an absolute minimum value (0), which would represent perfect efficiency. The lower the number, the more efficient the test is. With FPS-per-watt, there is (realistically) no maximum or minimum value, so while the numbers are perfectly fine for comparisons, they feel arbitrary because they can go infinitely high or infinitely low.
I guess in the end it doesn't really matter; I just personally prefer numbers that have a definite beginning and/or end (whether it's impossible to reach or not).
Now, the reason these benchmarks are pointless, in addition to what was stated above, is that when you're stressing the machine, ALL POWER MANAGEMENT GOES OUT THE WINDOW.
You aren't testing power management. You're testing the system's MAXIMUM POWER CONSUMPTION (hardware).
That's a good point. A good mobile benchmark is a light web browsing script that runs until the battery dies. It's a common usage scenario that would show how effective power management is under each kernel. PowerTop is good for estimating power usage, but the real measure is actual battery life.
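A crude version of that rundown test can be scripted: loop over a light workload, sample the battery percentage (on Linux, typically from /sys/class/power_supply/BAT0/capacity), and extrapolate. The sysfs path and the sample numbers below are assumptions for illustration, not measurements:

```python
BATTERY_PATH = "/sys/class/power_supply/BAT0/capacity"  # assumed battery node

def read_capacity(path=BATTERY_PATH):
    """Read the current battery charge as a percentage (0-100)."""
    with open(path) as f:
        return int(f.read().strip())

def estimated_runtime_hours(t0_s, pct0, t1_s, pct1):
    """Extrapolate total battery life from two capacity samples
    taken at times t0_s and t1_s (seconds) under a steady workload."""
    drain_pct_per_s = (pct0 - pct1) / (t1_s - t0_s)
    return 100.0 / drain_pct_per_s / 3600.0

# Hypothetical example: battery fell from 95% to 90% over 15 minutes
# (900 s), which extrapolates to 5 hours of runtime at that workload.
print(estimated_runtime_hours(0, 95, 900, 90))  # 5.0
```

In practice you would run this to full depletion rather than extrapolating, since drain rates are not perfectly linear, but the extrapolation gives a quick comparison between kernels without waiting hours per run.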