The Wayland Situation: Facts About X vs. Wayland
Originally posted by sireangelus
I know that. I also noticed a big increase in battery life simply by using GNOME Classic over Unity. In fact, I'm running Xubuntu right now as I write this.
OK, so I used Linux recently on this set of hardware:
A desktop with a Core 2 Duo and an HD 4850
A ThinkPad X61, Core 2 Duo 2 GHz, Intel X3100
An Acer Aspire One A110 (original N270 Atom)
The aforementioned Samsung.
Small take: on the desktop, with proprietary drivers, disabling the desktop effects silences the video card's fan. The frequencies were checked and even forced down with a custom BIOS; the problem showed up both with the open-source driver in low-power mode and with fglrx.
On the ThinkPad, it was a 10-20 degree difference in heat (I dropped it once, which caused some problems with the cooling).
On the Atom: a faster system overall, 40 minutes more with the original battery and an hour more with the extended one (so roughly 1:40 to 2:30, and 3:30 to 4:40).
On the Samsung, I've already written about it.
It seems to me that's a little more than one data point. And I should say I'm used to building and loading my own kernels, which squeezes a little more performance and battery life out of them.
Does Wayland do anything to reduce CPU/GPU usage in a way that makes it easier for the driver to go to sleep faster and more often?
Last edited by sireangelus; 07 June 2013, 02:55 PM.
I have a question about multiple monitors and smooth playback of videos (and games).
I have a setup with the primary monitor running at 75 Hz and the second monitor (my TV) running at 24, 50 or 60 Hz. On NVIDIA, using VDPAU and the video overlay, I get smooth video playback on the TV even while the TV's refresh rate differs from the primary monitor's. As soon as an older compositing manager is in use, I get tearing on the second monitor. With newer compositors that support GLX_EXT_buffer_age I get really bad jitter (dropped and duplicated frames) on both monitors; it's tear-free, but playback of videos and games is not smooth on either monitor.
On Windows 7 playback is smooth on the primary monitor but not on the second as long as compositing is active.
How will this work on wayland?
Originally posted by sireangelus
[quote of the post above trimmed]
That being said, you bring up a large variety of points...
1) You're using the radeon driver. Even in low-power mode, the radeon driver still consumes more power than fglrx in its low-power mode.
2) You mention you're using an Intel CPU, but you haven't said what kernel you are running or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-state driver (rather than ondemand) and enabling thermald, I too saw a 10-15 degree drop in temperature after just ONE reboot on my Sandy Bridge ultrabook.
Using the proprietary driver on a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open-source driver other than Intel's actually has automatic power-management capabilities.
All opinions are my own, not those of my employer, if you know who they are.
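If you want to check which of those two setups your own machine is on, a quick sketch like the one below can help. It only reads the standard Linux cpufreq sysfs files (these paths are the kernel's documented interface, but the helper function and its name are my own invention, and on a box without cpufreq the files simply won't exist):

```python
# Sketch: report which cpufreq scaling driver and governor the kernel is
# using, by reading the standard sysfs cpufreq files. Returns None for
# any attribute that doesn't exist on this machine.
from pathlib import Path

def read_cpufreq(attr, cpu=0):
    """Return a cpufreq attribute for one CPU as a string, or None if absent."""
    p = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/{attr}")
    return p.read_text().strip() if p.exists() else None

if __name__ == "__main__":
    driver = read_cpufreq("scaling_driver")      # e.g. "intel_pstate" or "acpi-cpufreq"
    governor = read_cpufreq("scaling_governor")  # e.g. "powersave" or "ondemand"
    print(f"driver: {driver}, governor: {governor}")
```

If the driver line says intel_pstate you're on the newer P-state code path Ericg mentions; if it says acpi-cpufreq with an ondemand governor, you're on the older one.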
A) Media coherence. What's media coherence? In its simplest terms: your browser window? That's a window. The Flash player on a YouTube page? The player itself, displaying the video, is a sub-window. What keeps them in sync? Absolutely nothing. The events are handled separately, and right now you just pray they don't get processed too far apart. Which is why, when you scroll on YouTube or other video sites with a video playing, sometimes everything tears and chunks.