The Wayland Situation: Facts About X vs. Wayland


  • #31
    Originally posted by Ericg
    To be fair.... Unity + Compiz was a complete mess. You'd be a lot better off testing Gnome 3 vs KDE vs E17 for power consumption just because those are better maintained and optimized.
    I know that. I also noticed a big increase in battery life simply by using GNOME Classic over Unity. In fact, I'm running Xubuntu right now as I write this.



    • #32
      Originally posted by sireangelus
      I know that. I also noticed a big increase in battery life simply by using GNOME Classic over Unity. In fact, I'm running Xubuntu right now as I write this.
      I saw that; I was just pointing out to you (and other readers) that your one data point was inherently flawed and not statistically valid. I'm glad you're getting good battery life with XFCE, though.
      All opinions are my own not those of my employer if you know who they are.



        • #34
          OK, so I've used Linux recently on this set of hardware:

          A desktop with a Core 2 Duo and an HD 4850
          A ThinkPad X61, Core 2 Duo 2 GHz, Intel X3100
          An Acer Aspire One A110 (original N270 Atom)
          The aforementioned Samsung.

          Quick take: on the desktop, with the proprietary driver, disabling the desktop effects is enough to silence the video card's fan. The frequencies were checked, and even forced down with a custom BIOS; the problem showed up both with the open-source driver in low-power mode and with fglrx.
          On the ThinkPad it made a 10-20° difference in heat (I dropped it once, which caused some problems with the cooling).
          Atom: faster system overall; 40 minutes more with the original battery and an hour more with the extended one (so roughly 1:40 to 2:30, and 3:30 to 4:40).
          On the Samsung, I've already written about it.

          It seems to me that's a little more than one data point. I should also add that I'm used to building and loading my own kernels... which squeezes a little more performance and battery life out of them.

          Does Wayland do anything to reduce CPU/GPU usage in a way that makes it easier for the driver to go to sleep faster and more often?


          Last edited by sireangelus; 07 June 2013, 02:55 PM.



          • #35
            I have a question about multiple monitors and smooth playback of videos (and games).

            I have a setup with the primary monitor running at 75 Hz and the second monitor (my TV) running at 24, 50, or 60 Hz. On NVIDIA, with VDPAU and the video overlay, I get smooth video playback on the TV even though its refresh rate differs from the primary monitor's. As soon as an older compositing manager is in use, I get tearing on the second monitor. With newer compositors that support GLX_EXT_buffer_age (see the sketch below), I get really bad jitter (dropped and duplicated frames) on both monitors; it's tear-free, but video and game playback is not smooth on either monitor.

            On Windows 7, playback is smooth on the primary monitor but not on the second as long as compositing is active.

            How will this work on Wayland?
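
            (A minimal sketch for readers of the buffer-age query mentioned above, assuming an open X Display and a current GLX context; the helper name is illustrative, not from any real compositor.)

            Code:
            /* GLX_EXT_buffer_age: the age of the back buffer tells a compositor
             * how stale its contents are. Age N means the buffer holds the frame
             * from N swaps ago, so only the damage from the last N frames needs
             * repainting; age 0 means the contents are undefined and a full
             * repaint is required. */
            #include <GL/glx.h>
            #include <GL/glxext.h>

            unsigned int query_back_buffer_age(Display *dpy, GLXDrawable win)
            {
                unsigned int age = 0;
                glXQueryDrawable(dpy, win, GLX_BACK_BUFFER_AGE_EXT, &age);
                return age; /* 0 => full repaint; N > 0 => N frames of damage */
            }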



            • #36
              Originally posted by Ericg
              5) I meant to ask Daniel, but I forgot about it. X was an odd number too, at 15; I can only assume they're using the extra bit for something other than actual counting.
              I thought it was simply because they're using signed integers...?
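
              (A worked illustration of this guess, not from the thread: in a signed n-bit field the top bit is the sign bit, so the largest representable count is 2^(n-1) - 1, which is how odd-looking maxima like 15, 127, or 32767 arise.)

              Code:
              #include <stdio.h>

              int main(void)
              {
                  /* A signed 5-bit field: one sign bit, four value bits,
                   * so it can count no higher than 2^4 - 1 = 15. */
                  struct { signed int v : 5; } field;
                  field.v = (1 << 4) - 1;
                  printf("max signed 5-bit value: %d\n", field.v); /* 15 */
                  return 0;
              }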



              • #37
                Originally posted by dee.
                I thought it was simply because they're using signed integers...?
                It very well could be; again, I meant to ask Daniel but forgot about it.
                All opinions are my own not those of my employer if you know who they are.



                • #38
                  Originally posted by sireangelus
                  OK, so I've used Linux recently on this set of hardware:

                  A desktop with a Core 2 Duo and an HD 4850
                  A ThinkPad X61, Core 2 Duo 2 GHz, Intel X3100
                  An Acer Aspire One A110 (original N270 Atom)
                  The aforementioned Samsung.

                  Quick take: on the desktop, with the proprietary driver, disabling the desktop effects is enough to silence the video card's fan. The frequencies were checked, and even forced down with a custom BIOS; the problem showed up both with the open-source driver in low-power mode and with fglrx.
                  On the ThinkPad it made a 10-20° difference in heat (I dropped it once, which caused some problems with the cooling).
                  Atom: faster system overall; 40 minutes more with the original battery and an hour more with the extended one (so roughly 1:40 to 2:30, and 3:30 to 4:40).
                  On the Samsung, I've already written about it.

                  It seems to me that's a little more than one data point. I should also add that I'm used to building and loading my own kernels... which squeezes a little more performance and battery life out of them.

                  Does Wayland do anything to reduce CPU/GPU usage in a way that makes it easier for the driver to go to sleep faster and more often?
                  Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I do not believe there is anything fundamentally different in Wayland that would suddenly make power consumption drop.
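
                  (An aside for readers: a minimal sketch of what "overlays" means here, assuming a KMS/libdrm setup in which the compositor has already obtained the plane, CRTC, and framebuffer ids; the helper name is hypothetical.)

                  Code:
                  /* Hand a client's video buffer straight to a hardware overlay
                   * plane: the display engine blends it at scanout, so the GPU
                   * never has to re-composite the video, which saves power. */
                  #include <xf86drm.h>
                  #include <xf86drmMode.h>

                  int put_buffer_on_overlay(int drm_fd, uint32_t plane_id,
                                            uint32_t crtc_id, uint32_t fb_id,
                                            int x, int y, int w, int h)
                  {
                      /* src_* coordinates are 16.16 fixed point: scan out the
                       * whole buffer, unscaled, at (x, y) on screen. */
                      return drmModeSetPlane(drm_fd, plane_id, crtc_id, fb_id, 0,
                                             x, y, w, h,
                                             0, 0,
                                             (uint32_t)w << 16, (uint32_t)h << 16);
                  }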

                  That being said, you bring up a variety of points...

                  1) You're using the radeon driver. Even in low-power mode, the radeon driver still consumes more power than fglrx does in its low-power mode.
                  2) You mention you're using an Intel CPU, but you haven't said which kernel you're running or whether you're using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver (rather than the ondemand governor) and enabling thermald, I too saw a 10-15 degree drop in temperature after just ONE reboot on my Sandy Bridge ultrabook (a quick check for the active cpufreq driver is sketched below).

                  Using the proprietary driver on a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open-source driver other than Intel's actually has automatic power-management capabilities.
                  All opinions are my own not those of my employer if you know who they are.
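
                  (For readers who want to verify the P-State point above: a minimal sketch assuming the standard cpufreq sysfs layout; the program is illustrative only.)

                  Code:
                  /* Print which cpufreq scaling driver the kernel is using,
                   * e.g. "intel_pstate" on a 3.10+ kernel with P-State active. */
                  #include <stdio.h>

                  int main(void)
                  {
                      char buf[64];
                      FILE *f = fopen("/sys/devices/system/cpu/cpu0/cpufreq/scaling_driver", "r");
                      if (f && fgets(buf, sizeof buf, f))
                          printf("cpufreq driver: %s", buf);
                      if (f)
                          fclose(f);
                      return 0;
                  }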



                  • #39
                    Ericg

                    Question for everyone else: Would you guys be interested in a systemd / SysV / Upstart comparison as well? I was toying with the idea but hadn't made up my mind yet.
                    I would be interested in this comparison. Thanks for the great article!



                    • #40
                      A) Media Coherence. What's Media Coherence? In its simplest terms... Your browser window? That's a window. Your Flash player window on YouTube? The Flash player itself, displaying the video, is a sub-window. What keeps them in sync? Absolutely nothing. The events are handled separately, and right now you just pray that they don't get processed too far apart. Which is why, when you scroll on YouTube or other video sites with a video playing, sometimes everything tears and chunks.
                      This is funny because it's true. I've been affected by this bug since forever. Got to love X...
