
The Wayland Situation: Facts About X vs. Wayland


  • #31
    Originally posted by Ericg View Post
    To be fair.... Unity + Compiz was a complete mess. You'd be a lot better off testing Gnome 3 vs KDE vs E17 for power consumption just because those are better maintained and optimized.
    I know that. I also noticed a big increase in battery life simply by using GNOME Classic instead of Unity. In fact, I'm running Xubuntu right now as I write this.



    • #32
      Originally posted by sireangelus View Post
      I know that. I also noticed a big increase in battery life simply by using GNOME Classic instead of Unity. In fact, I'm running Xubuntu right now as I write this.
      I saw that, I was just pointing out to you (and other readers) that your one data-point was inherently flawed and not statistically valid. I'm glad you're getting good battery life with XFCE though.



      • #33
        OK, so I recently used Linux on this set of hardware:

        A desktop with a Core 2 Duo and an HD 4850
        A ThinkPad X61, Core 2 Duo 2 GHz, Intel X3100
        An Acer Aspire One A110 (the original N270 Atom)
        The aforementioned Samsung
        Quick take: on the desktop, with the proprietary drivers, disabling the desktop effects is enough to silence the video card's fan.
        On the ThinkPad it made a 10-20 degree difference in heat (I dropped it once and that caused some cooling problems).
        Atom: faster system overall, 40 minutes more with the original battery and about an hour more with the extended one (so roughly 1:40 to 2:30, and 3:30 to 4:40).
        On the Samsung, I've already written about it.

        It seems to me that's a bit more than one data point. I should also say that I'm used to building and loading my own kernels, which squeezes a little more performance and battery life out of them.



        • #34
          OK, so I recently used Linux on this set of hardware:

          A desktop with a Core 2 Duo and an HD 4850
          A ThinkPad X61, Core 2 Duo 2 GHz, Intel X3100
          An Acer Aspire One A110 (the original N270 Atom)
          The aforementioned Samsung
          Quick take: on the desktop, with the proprietary drivers, disabling the desktop effects is enough to silence the video card's fan. The frequencies were checked and even forced down with a custom BIOS. The problem showed up both with the open source driver in low-power mode and with fglrx.
          On the ThinkPad it made a 10-20 degree difference in heat (I dropped it once and that caused some cooling problems).
          Atom: faster system overall, 40 minutes more with the original battery and about an hour more with the extended one (so roughly 1:40 to 2:30, and 3:30 to 4:40).
          On the Samsung, I've already written about it.

          It seems to me that's a bit more than one data point. I should also say that I'm used to building and loading my own kernels, which squeezes a little more performance and battery life out of them.

          Does Wayland do anything to reduce CPU/GPU usage in a way that makes it easier for the driver to go to sleep faster and more often?


          Last edited by sireangelus; 06-07-2013, 02:55 PM.



          • #35
            I have a question about multiple monitors and smooth playback of videos (and games).

            I have a setup with the primary monitor running at 75 Hz and the second monitor (my TV) running at 24, 50 or 60 Hz. On NVIDIA, with VDPAU and the video overlay, I get smooth playback on the TV even though its refresh rate differs from the primary monitor's. As soon as an older compositing manager is in use, I get tearing on the second monitor. With newer compositors that support GLX_EXT_buffer_age I get really bad jitter (dropped and duplicated frames) on both monitors; it's tear-free, but video and game playback isn't smooth on either monitor.

            On Windows 7 playback is smooth on the primary monitor but not on the second as long as compositing is active.

            How will this work on Wayland?



            • #36
              Originally posted by Ericg View Post
              5) I meant to ask Daniel, but I forgot about it. X was an odd number too, at 15; I can only assume that they are using the extra bit for something other than actual counting.
              I thought it was simply because they're using signed integers...?



              • #37
                Originally posted by dee. View Post
                I thought it was simply because they're using signed integers...?
                Very well could be; again, I meant to ask Daniel but forgot about it.



                • #38
                  Originally posted by sireangelus View Post
                  OK, so I recently used Linux on this set of hardware:

                  A desktop with a Core 2 Duo and an HD 4850
                  A ThinkPad X61, Core 2 Duo 2 GHz, Intel X3100
                  An Acer Aspire One A110 (the original N270 Atom)
                  The aforementioned Samsung
                  Quick take: on the desktop, with the proprietary drivers, disabling the desktop effects is enough to silence the video card's fan. The frequencies were checked and even forced down with a custom BIOS. The problem showed up both with the open source driver in low-power mode and with fglrx.
                  On the ThinkPad it made a 10-20 degree difference in heat (I dropped it once and that caused some cooling problems).
                  Atom: faster system overall, 40 minutes more with the original battery and about an hour more with the extended one (so roughly 1:40 to 2:30, and 3:30 to 4:40).
                  On the Samsung, I've already written about it.

                  It seems to me that's a bit more than one data point. I should also say that I'm used to building and loading my own kernels, which squeezes a little more performance and battery life out of them.

                  Does Wayland do anything to reduce CPU/GPU usage in a way that makes it easier for the driver to go to sleep faster and more often?
                  Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I don't believe there is anything fundamentally different in Wayland that would suddenly make power consumption drop.

                  That being said, you bring up a variety of points...

                  1) You're using the radeon driver. Even in low-power mode the radeon driver still consumes more power than a low-power FGLRX.
                  2) You mention you're using an Intel CPU, but you haven't said what kernel you are running or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver, rather than OnDemand, and enabling thermald, I too saw a 10-15 degree drop in temperature after just ONE reboot on my Sandy Bridge ultrabook. (See the quick check below.)

                  Using the proprietary driver in a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open source driver other than Intel's actually has automatic power management capabilities.
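
                  If you want to check what you are actually running, the standard cpufreq sysfs nodes will tell you. A minimal sketch (assuming a Linux box exposing the usual cpufreq interface and any Python 3; readable as a normal user):

                      # Report the active CPU frequency scaling driver and governor.
                      from pathlib import Path

                      cpufreq = Path("/sys/devices/system/cpu/cpu0/cpufreq")
                      for name in ("scaling_driver", "scaling_governor", "scaling_cur_freq"):
                          node = cpufreq / name
                          if node.exists():
                              print(name + ": " + node.read_text().strip())
                          else:
                              print(name + ": not available")

                  If scaling_driver reports intel_pstate you are on the new P-State driver; acpi-cpufreq with the ondemand governor means you are still on the old path.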



                  • #39
                    Originally posted by Ericg View Post

                    Question for everyone else: Would you guys be interested in a systemd / SysV / Upstart comparison as well? I was toying around with the idea in my head but hadn't made up my mind yet.
                    I would be interested in this comparison. Thanks for the great article!



                    • #40
                      A) Media Coherence. What's media coherence? In its simplest terms... Your browser window? That's a window. Your Flash player window on YouTube? The Flash player itself, displaying the video, is a sub-window. What keeps them in sync? Absolutely nothing. The events are handled separately, and right now you just pray that they don't get processed too far apart, which is why, when you scroll on YouTube or other video sites with a video playing, sometimes everything tears and chunks.
                      This is funny because it's true. I've been affected by this bug since forever. Got to love X...



                      • #41
                        Originally posted by .CME. View Post
                        I have a question about multiple monitors and smooth playback of videos (and games).

                        I have a setup with the primary monitor running at 75 Hz and the second monitor (my TV) running at 24, 50 or 60 Hz. On NVIDIA, with VDPAU and the video overlay, I get smooth playback on the TV even though its refresh rate differs from the primary monitor's. As soon as an older compositing manager is in use, I get tearing on the second monitor. With newer compositors that support GLX_EXT_buffer_age I get really bad jitter (dropped and duplicated frames) on both monitors; it's tear-free, but video and game playback isn't smooth on either monitor.

                        On Windows 7 playback is smooth on the primary monitor but not on the second as long as compositing is active.

                        How will this work on Wayland?
                        Is there any way for you to drop your primary monitor to 60 Hz and set your TV to 60 as well, then see if you still get jitter? It sounds like a synchronization issue due to the differing refresh rates.



                        • #42
                          Originally posted by acrazyplayer View Post
                          I would be interested in this comparison. Thanks for the great article!
                          Glad to see the work is appreciated.



                          • #43
                            Originally posted by Ericg View Post
                            Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I don't believe there is anything fundamentally different in Wayland that would suddenly make power consumption drop.

                            That being said, you bring up a variety of points...

                            1) You're using the radeon driver. Even in low-power mode the radeon driver still consumes more power than a low-power FGLRX.
                            2) You mention you're using an Intel CPU, but you haven't said what kernel you are running or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver, rather than OnDemand, and enabling thermald, I too saw a 10-15 degree drop in temperature after just ONE reboot on my Sandy Bridge ultrabook.

                            Using the proprietary driver in a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open source driver other than Intel's actually has automatic power management capabilities.

                            OK.

                            1) I was using both; I'm sorry if that wasn't clear (what I mean by "low power mode" on the open driver is sketched below).
                            2) I was using a range of kernels, it's true. On the desktop and the ThinkPad laptop I was using stock/custom (but the same kernel version) Ubuntu 12.10.
                            On the Aspire One I was using an old 2.6.38 (it's complicated; it involves Sabayon and recompiling the whole system with -march=atom).

                            The Samsung runs a vanilla 3.9.4 kernel with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using the Intel scaling driver and thermald: around 4 degrees in the low-frequency/idle range and 2-3 degrees at maximum load (a kernel build plus several instances of glxspheres, up to the point of slowing things down, on both the dedicated and the integrated GPU; it was a combined stress test for stability and maximum temperature, to see whether I'd have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the power usage of the whole system.
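
                            Just to be precise about the "low power mode" on the open radeon driver: it's the profile-based power management set through sysfs. A rough sketch of what I mean (this assumes the older profile method rather than DPM, card0 being the Radeon, and root privileges to write the nodes):

                                # Force the radeon "low" power profile via sysfs (needs root).
                                from pathlib import Path

                                dev = Path("/sys/class/drm/card0/device")
                                (dev / "power_method").write_text("profile\n")   # fixed profiles instead of dynpm
                                (dev / "power_profile").write_text("low\n")      # lock the card to its lowest clocks
                                print("power_profile:", (dev / "power_profile").read_text().strip())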



                            • #44
                              OK... I have a problem: when I write a post more than a few lines long, it says it needs moderator approval. If I write a shorter one and then edit it, I'm fine.
                              Originally posted by Ericg View Post
                              Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I don't believe there is anything fundamentally different in Wayland that would suddenly make power consumption drop.

                              That being said, you bring up a variety of points...

                              1) You're using the radeon driver. Even in low-power mode the radeon driver still consumes more power than a low-power FGLRX.
                              2) You mention you're using an Intel CPU, but you haven't said what kernel you are running or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver, rather than OnDemand, and enabling thermald, I too saw a 10-15 degree drop in temperature after just ONE reboot on my Sandy Bridge ultrabook.

                              Using the proprietary driver in a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open source driver other than Intel's actually has automatic power management capabilities.


                              1) I was using both; I'm sorry if that wasn't clear.
                              2) I was using a range of kernels, it's true. On the desktop and the ThinkPad laptop I was using stock/custom (but the same kernel version) Ubuntu 12.10.
                              On the Aspire One I was using an old 2.6.38 (it's complicated; it involves Sabayon and recompiling the whole system with -march=atom).

                              The Samsung runs a vanilla 3.9.4 kernel with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using the Intel scaling driver and thermald: around 4 degrees in the low-frequency/idle range and 2-3 degrees at maximum load (a kernel build plus several instances of glxspheres, up to the point of slowing things down, on both the dedicated and the integrated GPU; it was a combined stress test for stability and maximum temperature, to see whether I'd have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the power usage of the whole system.

                              Do you share the view of Martin Gräßlin, who has been waging a war against the definition of "lightweight"?
                              Last edited by sireangelus; 06-07-2013, 03:48 PM.



                              • #45
                                Originally posted by Ericg View Post
                                Is there any way for you to drop your primary monitor to 60 Hz and set your TV to 60 as well, then see if you still get jitter? It sounds like a synchronization issue due to the differing refresh rates.
                                60/N fps content is fine with both monitors at 60 Hz, but it won't be smooth with PAL or 24 Hz content on the TV.
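
                                A quick way to see why: count how many refreshes each content frame is held for. If the content rate divides the refresh rate evenly, every frame is held for the same number of refreshes; if not, the hold times alternate and you get judder. A small counting sketch (assuming one vsynced flip per refresh, no frame-rate conversion):

                                    # Count how many display refreshes each content frame is held for.
                                    def refreshes_per_frame(display_hz, content_fps, frames=8):
                                        # Integer ticks of 1/(display_hz * content_fps) s avoid float error.
                                        shown, flip = [], 0
                                        for i in range(frames):
                                            frame_end = (i + 1) * display_hz   # frame boundary, in ticks
                                            count = 0
                                            while flip < frame_end:
                                                flip += content_fps            # next refresh, in ticks
                                                count += 1
                                            shown.append(count)
                                        return shown

                                    print("30 fps on 60 Hz:", refreshes_per_frame(60, 30))  # [2, 2, 2, ...] smooth
                                    print("24 fps on 60 Hz:", refreshes_per_frame(60, 24))  # [3, 2, 3, 2, ...] judder
                                    print("25 fps on 60 Hz:", refreshes_per_frame(60, 25))  # mixed 2s and 3s (PAL)

                                Only when the division comes out to a whole number does every frame get equal screen time.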

