The Wayland Situation: Facts About X vs. Wayland
Originally posted by Ericg:
Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I do not believe there is anything fundamentally different in Wayland that would suddenly make power consumption drop.
That being said you bring up a large variety of points...
1) You're using the radeon driver. Even in low-power mode the radeon driver still consumes more power than a low-power FGLRX.
2) You mention you're using an Intel CPU, but you haven't mentioned what kernel you are using or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver rather than OnDemand, and enabling thermald, I too saw a 10-15 degree drop in temperature in just ONE reboot on my Sandy Bridge Ultrabook.
Using the proprietary driver in a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open source driver other than Intel's actually has automatic power management capabilities.
OK.
1) I was using both; I'm sorry if that wasn't clear.
2) I was using a range of kernels, it's true. On the desktop and the ThinkPad laptop I was using stock/custom (but same kernel version) Ubuntu 12.10.
On the Aspire One I was using an old 2.6.38 (it's complicated; it involves Sabayon and recompiling the whole system with -march=atom).
The Samsung uses a vanilla 3.9.4 kernel, with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using Intel scaling and thermald: about 4° in the low-frequency/idle range and 2°-3° at maximum load (a kernel build plus various instances of glxspheres, up to the point of slowing down, using both the dedicated and the integrated GPU; it was a double test of stability and maximum temperature, to see whether I would have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the whole power usage of the system.
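Since the kernel and thermald setup keeps coming up, here is a minimal sketch for checking which pieces are actually active on a given machine. It assumes the standard Linux cpufreq sysfs layout and a systemd-managed thermald service; on other init systems the service check will simply be skipped.

```shell
#!/bin/sh
# Report which CPU frequency scaling driver is active.
# On a 3.9+/3.10 kernel with Intel P-State enabled this prints
# "intel_pstate"; older setups typically show "acpi-cpufreq"
# (which is what the ondemand governor runs on top of).
get_scaling_driver() {
    f=/sys/devices/system/cpu/cpu0/cpufreq/scaling_driver
    if [ -r "$f" ]; then
        cat "$f"
    else
        echo "unknown"   # cpufreq sysfs not exposed (VM, container, old kernel)
    fi
}

echo "Active scaling driver: $(get_scaling_driver)"

# thermald is a separate userspace daemon; on systemd distros it is
# enabled with: sudo systemctl enable --now thermald
if command -v systemctl >/dev/null 2>&1; then
    state=$(systemctl is-active thermald 2>/dev/null || true)
    echo "thermald: ${state:-unknown}"
else
    echo "thermald: cannot check (no systemctl)"
fi
```

Both checks are read-only, so this is safe to run before and after a kernel or distro change to confirm which power-management stack is actually in play.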
OK... I have a problem: when I write a post more than a few lines long, it says it needs moderator approval. If I write a shorter one and then edit it, I'm fine.
Do you share the view of Martin Gräßlin, who has been waging a war against the definition of "lightweight"?
Last edited by sireangelus; 07 June 2013, 03:48 PM.
Originally posted by Ericg:
Is there any way for you to drop your primary monitor to 60Hz, and set your TV to 60 as well, and see if you get jittering then? Sounds like a synchronization issue due to the differing refresh rates.
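A quick way to try that suggestion from an X session is xrandr. The output names below (eDP-1, HDMI-1) are hypothetical; substitute whatever `xrandr --query` lists on your machine.

```shell
#!/bin/sh
# Force both outputs to 60Hz to rule out a refresh-rate mismatch.
# PRIMARY and TV are hypothetical output names -- replace them with
# the names shown by `xrandr --query` on your system.
PRIMARY=eDP-1
TV=HDMI-1

if command -v xrandr >/dev/null 2>&1 && [ -n "${DISPLAY:-}" ]; then
    xrandr --output "$PRIMARY" --rate 60
    xrandr --output "$TV" --rate 60
    status="applied"
else
    # No X session available: just show what would run.
    echo "would run: xrandr --output $PRIMARY --rate 60"
    echo "would run: xrandr --output $TV --rate 60"
    status="skipped"
fi
echo "refresh sync: $status"
```

Note that --rate only works if the output's current mode actually advertises a 60Hz variant; otherwise pick an explicit mode with `--mode WxH --rate 60`.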
Originally posted by sireangelus:
OK... I have a problem: when I write a post more than a few lines long, it says it needs moderator approval. If I write a shorter one and then edit it, I'm fine.
1) I was using both; I'm sorry if that wasn't clear.
2) I was using a range of kernels, it's true. On the desktop and the ThinkPad laptop I was using stock/custom (but same kernel version) Ubuntu 12.10.
On the Aspire One I was using an old 2.6.38 (it's complicated; it involves Sabayon and recompiling the whole system with -march=atom).
The Samsung uses a vanilla 3.9.4 kernel, with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using Intel scaling and thermald: about 4° in the low-frequency/idle range and 2°-3° at maximum load (a kernel build plus various instances of glxspheres, up to the point of slowing down, using both the dedicated and the integrated GPU; it was a double test of stability and maximum temperature, to see whether I would have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the whole power usage of the system.
Do you share the view of Martin Gräßlin, who has been waging a war against the definition of "lightweight"?
Of course there's a war against the definition of "lightweight". Lightweight is relative: in comparison to core X11, even EFL is "heavyweight". What I consider lightweight and what you consider lightweight are going to be different.
Did you check powertop's Tunables tab? See whether all the powersaving features are correctly enabled; that may save you a few degrees. The other option, and it's gonna sound stupid, but... did you make sure all the fans and vents were clean? It's one of those stupid, obvious things that even experienced tinkerers seem to forget sometimes. I'm not saying you're -WRONG- about OpenGL raising the system temperature, but it's not something I've experienced. Granted, that may be because I'm on Sandy Bridge with no discrete GPU, so CPU or GPU doesn't matter: the chip has to be woken up regardless.
All opinions are my own, not those of my employer, if you know who they are.
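For reference, the Tunables tab also has a non-interactive equivalent. This sketch assumes powertop 2.x, which provides the --auto-tune flag to flip every tunable powertop reports as "Bad" to its powersaving setting; it needs root, so the script only prints the command when run unprivileged.

```shell
#!/bin/sh
# Apply powertop's suggested powersaving tunables in one shot.
# Assumption: powertop 2.x with the --auto-tune flag; requires root.
if [ "$(id -u)" -eq 0 ] && command -v powertop >/dev/null 2>&1; then
    powertop --auto-tune
    status="tuned"
else
    # Not root, or powertop missing: just note what would run.
    echo "would run: sudo powertop --auto-tune"
    status="skipped"
fi
echo "powertop tunables: $status"
```

The changes are not persistent across reboots, so people usually wire this into a boot-time service once they are happy with the result.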