The Wayland Situation: Facts About X vs. Wayland


  • #41
    Originally posted by .CME. View Post
    I have a question about multiple monitors and smooth playback of videos (and games).

    I have a setup with the primary monitor running at 75 Hz and the second monitor (my TV) running at 24, 50 or 60 Hz. On NVIDIA I get smooth playback of videos on the TV, even while the TV frequency differs from the primary monitor's, using VDPAU and the video overlay. As soon as an older compositing manager is in use, I get tearing on the second monitor. With new compositors that support GLX_EXT_buffer_age I get really bad jitter (dropped and duplicate frames) on both monitors, but it's tear-free, so playback of videos and games is not smooth on either monitor.

    On Windows 7 playback is smooth on the primary monitor but not on the second as long as compositing is active.

    How will this work on Wayland?
    Is there any way for you to drop your primary monitor to 60 Hz and set your TV to 60 as well, to see if you get jittering then? It sounds like a synchronization issue due to the differing refresh rates.
    All opinions are my own not those of my employer if you know who they are.
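
    If it helps to test that, here is a minimal sketch (an editor's illustration, not part of the thread) of forcing both outputs to 60 Hz with xrandr from a Python script; the output names and mode below are placeholders, so use whatever xrandr --query actually reports:

        # force_60hz.py -- quick test: run both outputs at 60 Hz and look for jitter
        import subprocess

        # Hypothetical output names and mode; use whatever "xrandr --query" reports.
        outputs = ["DP-0", "HDMI-0"]

        for out in outputs:
            subprocess.run(
                ["xrandr", "--output", out, "--mode", "1920x1080", "--rate", "60"],
                check=True,  # raise if xrandr rejects the mode/rate combination
            )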



    • #42
      Originally posted by acrazyplayer View Post
      I would be interested in this comparison. Thanks for the great article!
      Glad to see the work is appreciated
      All opinions are my own not those of my employer if you know who they are.



      • #43
        Originally posted by Ericg View Post
        Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I do not believe there is anything fundamentally different in Wayland that would so suddenly make power consumption drop.

        That being said you bring up a large variety of points...

        1) You're using the radeon driver. Even in low-power mode the radeon driver still consumes more power than a low-power FGLRX.
        2) You mention you're using an Intel CPU, but haven't mentioned what kernel you are using or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver, rather than ondemand, and enabling thermald, I too saw a 10-15 degree drop in temperature in just ONE reboot on my Sandy Bridge Ultrabook.

        Using the proprietary driver in a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open source driver other than Intel's actually has automatic power management capabilities.

        ok.

        1): I was using both, I'm sorry if that wasn't clear.
        2): I was using a range of kernels, it's true. On the desktop and ThinkPad laptop I was using stock/custom (but the same kernel version) Ubuntu 12.10.
        On the Aspire One I was using an old 2.6.38 (it's complicated: it involves Sabayon and recompiling the whole system with -march=atom).

        The Samsung uses kernel 3.9.4 vanilla, with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using Intel scaling and thermald: about 4° in the low-frequency/idle range and some 2°-3° at maximum load (a kernel build plus various instances of glxspheres running up to the point of slowing down, using both the dedicated and the integrated GPU; it was a combined stability and maximum-temperature stress test to see whether I would have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the whole power usage of the system.
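
        For anyone wanting to reproduce that check, a small sketch (an editor's illustration, not part of the thread) that reports which cpufreq scaling driver and governor are active and whether thermald is running; the sysfs paths are the usual ones on 3.9+ kernels and may differ elsewhere:

            # check_pm.py -- which cpufreq driver/governor is active, and is thermald running?
            import os
            import subprocess

            cpufreq = "/sys/devices/system/cpu/cpu0/cpufreq"
            for name in ("scaling_driver", "scaling_governor"):
                try:
                    with open(os.path.join(cpufreq, name)) as f:
                        print(name, "=", f.read().strip())  # e.g. intel_pstate / powersave
                except OSError:
                    print(name, "not exposed by this kernel")

            # pgrep exits with status 0 when a process with that exact name exists
            thermald_up = subprocess.run(["pgrep", "-x", "thermald"],
                                         stdout=subprocess.DEVNULL).returncode == 0
            print("thermald running:", thermald_up)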



        • #44
          Ok... I have a problem: when I write a post more than a few lines long, it says it needs moderator approval. If I write a shorter one and then edit it, I'm fine.
          Originally posted by Ericg View Post
          Beyond cutting down the amount of work the GPU and CPU have to do, and taking advantage of modern hardware capabilities such as overlays, I do not believe there is anything fundamentally different in Wayland that would so suddenly make power consumption drop.

          That being said you bring up a large variety of points...

          1) You're using the radeon driver. Even in low-power mode the radeon driver still consumes more power than a low-power FGLRX.
          2) You mention you're using an Intel CPU, but haven't mentioned what kernel you are using or whether you are using Intel's new thermald. Between a 3.10 kernel using Intel's new P-State driver, rather than ondemand, and enabling thermald, I too saw a 10-15 degree drop in temperature in just ONE reboot on my Sandy Bridge Ultrabook.

          Using the proprietary driver in a laptop is ALWAYS a good idea, where possible, if you care about temperature and battery life, simply because no open source driver other than Intel's actually has automatic power management capabilities.


          1): I was using both, I'm sorry if that wasn't clear.
          2): I was using a range of kernels, it's true. On the desktop and ThinkPad laptop I was using stock/custom (but the same kernel version) Ubuntu 12.10.
          On the Aspire One I was using an old 2.6.38 (it's complicated: it involves Sabayon and recompiling the whole system with -march=atom).

          The Samsung uses kernel 3.9.4 vanilla, with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using Intel scaling and thermald: about 4° in the low-frequency/idle range and some 2°-3° at maximum load (a kernel build plus various instances of glxspheres running up to the point of slowing down, using both the dedicated and the integrated GPU; it was a combined stability and maximum-temperature stress test to see whether I would have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the whole power usage of the system.

          Do you share the view of Martin Gräßlin, who has been waging a war against the definition of lightweight?
          Last edited by sireangelus; 07 June 2013, 03:48 PM.



          • #45
            Originally posted by Ericg View Post
            There anyway for you to drop your primary monitor to 60hz, and set your tv to 60 as well, see if you get jittering then? Sounds like a synchronization issue due to the differing refresh rates
            60/N fps content is fine with both monitors at 60 Hz, but it won't be smooth with PAL or 24 Hz content on the TV.
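
            To make the mismatch concrete, a tiny illustrative calculation (an editor's sketch, not part of the thread): when the frame rate does not divide the refresh rate evenly, frames get shown for an uneven number of refreshes, which is seen as judder:

                # cadence.py -- refreshes per source frame when fps does not divide the refresh rate
                def cadence(fps, hz, frames=8):
                    # refresh index at which each source frame starts, then the gap to the next frame
                    starts = [int(i * hz / fps) for i in range(frames + 1)]
                    return [b - a for a, b in zip(starts, starts[1:])]

                print(cadence(30, 60))  # [2, 2, 2, ...]     even -> smooth
                print(cadence(24, 60))  # [2, 3, 2, 3, ...]  uneven 3:2 cadence -> judder
                print(cadence(25, 60))  # mostly 2s with a 3 every few frames -> judder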



            • #46
              Nice article, thanks for that.
              Originally posted by Ericg View Post
              Question for everyone else: Would you guys be interested in a systemd / SysV / Upstart comparison as well?
              That would be nice. If you do that please include OpenRC, too.



              • #47
                Great article, thanks for taking the time to write it!



                • #48
                  Originally posted by Ericg View Post
                  Question for everyone else: Would you guys be interested in a systemd / SysV / Upstart comparison as well? I was toying around with the idea in my head, hadn't made up my mind yet.
                  Yes, that would be very interesting!



                  • #49
                    Originally posted by sireangelus View Post
                    Ok... I have a problem: when I write a post more than a few lines long, it says it needs moderator approval. If I write a shorter one and then edit it, I'm fine.

                    1): I was using both, I'm sorry if that wasn't clear.
                    2): I was using a range of kernels, it's true. On the desktop and ThinkPad laptop I was using stock/custom (but the same kernel version) Ubuntu 12.10.
                    On the Aspire One I was using an old 2.6.38 (it's complicated: it involves Sabayon and recompiling the whole system with -march=atom).

                    The Samsung uses kernel 3.9.4 vanilla, with Bumblebee and nvidia 310.xx. And yes, I saw a drop in temperature using Intel scaling and thermald: about 4° in the low-frequency/idle range and some 2°-3° at maximum load (a kernel build plus various instances of glxspheres running up to the point of slowing down, using both the dedicated and the integrated GPU; it was a combined stability and maximum-temperature stress test to see whether I would have problems in the summer). But the behaviour is consistent across all kernels, distros and systems: using OpenGL raises the whole power usage of the system.

                    Do you share the view of Martin Gräßlin, who has been waging a war against the definition of lightweight?
                    Gonna go in reverse order...

                    Of course there's a war against the definition of lightweight. Lightweight is relative. In comparison to Core X11 even EFL is "Heavyweight." What I consider lightweight and what you consider lightweight are gonna be different.

                    Did you check powertop's Tunables tab? See if all the powersaving features are correctly enabled; that may save you a few degrees. The other option, and it's gonna sound stupid, but... did you make sure all the fans and vents were clean? It's one of those stupid, obvious things that even experienced tinkerers seem to forget sometimes. I'm not saying you're -WRONG- about OpenGL raising the system temp, but it's not something I've experienced. Granted, that may be because I'm on Sandy Bridge with no discrete GPU, so CPU or GPU doesn't matter: the chip has to be woken up regardless.
                    All opinions are my own not those of my employer if you know who they are.
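
                    As a starting point for that, a short sketch (an editor's illustration, not part of the thread) that prints a few of the knobs powertop's Tunables tab usually covers; the exact paths vary by kernel and hardware, so missing ones are simply skipped:

                        # tunables.py -- peek at a few common power-saving knobs in sysfs
                        import glob

                        knobs = [
                            "/sys/module/snd_hda_intel/parameters/power_save",          # HDA audio power saving
                            "/sys/class/scsi_host/host0/link_power_management_policy",  # SATA link power policy
                        ]
                        knobs += glob.glob("/sys/bus/usb/devices/*/power/control")      # USB autosuspend (on/auto)
                        knobs += glob.glob("/sys/class/drm/card*/device/power_profile") # radeon power profile

                        for path in knobs:
                            try:
                                with open(path) as f:
                                    print(path, "=", f.read().strip())
                            except OSError:
                                pass  # knob not present on this machine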



                    • #50
                      Q: Will Wayland include a firewall, or let you add one into it?

                      Q: Will Wayland have all the toolkits needed to develop in the desktop and mobile space as well, by the time Wayland releases?

                      Q: Will Wayland have space toilets (like Mir)?

