The Wayland Situation: Facts About X vs. Wayland


#21
Originally posted by Ericg View Post
Mir, Wayland and SurfaceFlinger all have a requirement on an EGL driver. That said, there is one non-standard extension to EGL that Wayland wants. As long as Intel, nVidia, and AMD all have an EGL stack, they should, to my knowledge, work just fine across all three, with the small exception of that one extra non-standard extension. I think it's buffer_age, but I'd have to double-check that.
First, thanks for the informative article. Some damage is done to Wayland by fanbois who call people like me Unix-tards, just because I've had several windows from other machines open all week on my desktop, and am not doing everything locally. Personally, when I say I like "network transparent", I mean that exporting those windows over X has always looked faster to me than exporting them over VNC.

I simply want to use compute cycles over there, display them here, and have it be fast. My job is chip design, not networked display protocols.

Second, I'm almost interpreting your quote to mean that as long as the nVidia/AMD binary blobs support EGL (+ 1 non-standard extension), you can use the same driver that X uses. Is that true? (It sounds too good to be true.)
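
Since the exact extension name came up, here is a rough C sketch (my illustration, not from the thread) of how anyone can check what a vendor's EGL stack advertises. EGL_EXT_buffer_age and EGL_WL_bind_wayland_display are real extension names, used here only as examples of the sort of thing a Wayland compositor might probe for:

#include <stdbool.h>
#include <string.h>
#include <EGL/egl.h>

/* EGL reports its extensions as one space-separated string. */
static bool egl_has_extension(EGLDisplay dpy, const char *name)
{
    const char *exts = eglQueryString(dpy, EGL_EXTENSIONS);
    /* strstr is a sketch-level check; a strict version would match
     * whole space-delimited tokens only. */
    return exts && strstr(exts, name) != NULL;
}

/* Usage, e.g. in a compositor's init path:
 *   if (!egl_has_extension(dpy, "EGL_EXT_buffer_age"))
 *       ... fall back to full redraws every frame ...
 */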



#22
So I do need a freaking EGL to work. So it means endless and countless driver event wakeups. On my laptop, having KDE or Xfce with compositing on and no effects makes for about 1h15m of battery life. Can you tell me if Wayland would provide some mitigation for this issue?



#23
Originally posted by Temar View Post
Can you explain how an OpenGL application uses Wayland?

Is it something like this?
[application] -> [opengl] -> [wayland buffer] -> [egl] -> [driver]

Who provides the OpenGL stack? If it is the GFX vendor, will they be able to hook into Wayland without having to open-source their driver or GL stack?

In other words: are proprietary drivers possible without having to rewrite half of Wayland?

The driver provides the OpenGL stack, and architecturally it works the same way as ever. EGL is just another stack, like the OpenGL or DirectX stacks, so no, they don't have to open-source their work to hook into Wayland.
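
To make that layering concrete, here is a minimal C sketch (my illustration, not from the post) of a Wayland EGL client. The client only speaks wayland-client/wayland-egl; everything below eglGetDisplay() is whatever EGL/GL stack the vendor ships, open or closed:

#include <wayland-client.h>
#include <wayland-egl.h>
#include <EGL/egl.h>

int main(void)
{
    /* Connect to the compositor over the Wayland protocol. */
    struct wl_display *dpy = wl_display_connect(NULL);
    if (!dpy)
        return 1;

    /* Hand the Wayland display to EGL; from here down it's the
     * vendor's driver stack doing the work. */
    EGLDisplay egl = eglGetDisplay((EGLNativeDisplayType)dpy);
    eglInitialize(egl, NULL, NULL);

    static const EGLint cfg_attribs[] = {
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_NONE
    };
    static const EGLint ctx_attribs[] = {
        EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE
    };
    EGLConfig cfg;
    EGLint n;
    eglChooseConfig(egl, cfg_attribs, &cfg, 1, &n);
    EGLContext ctx = eglCreateContext(egl, cfg, EGL_NO_CONTEXT, ctx_attribs);

    /* A real client would now create a wl_surface from the compositor's
     * wl_compositor global, wrap it with wl_egl_window_create(), make an
     * EGLSurface from that, eglMakeCurrent(), draw GL, and present with
     * eglSwapBuffers(). */
    (void)ctx;

    eglTerminate(egl);
    wl_display_disconnect(dpy);
    return 0;
}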



#24
Originally posted by Ericg View Post
Open source drivers should all be gold on Wayland. I know Intel is, because that's what is in my laptop.

AMD and nVidia... we're gonna have to wait and see how they want to play. If they want to say "Screw Wayland! Go Mir!" they could; they'd have to explicitly refuse to support the EGL extension that Wayland requires. But at the same time, Wayland might be able to work around that missing extension should that day come to pass.

Ideally, in a year, the open source Radeon and Intel graphics should be basically up to par with their Windows brethren (maybe not in OpenGL compliance, but hopefully in power management for Radeon), and we therefore wouldn't NEED them to expressly support Wayland. nVidia is in an interesting position, though, with how crappy (in comparison) Nouveau is.

Thank you very much for your answers and the article (this goes to Michael as well!). I'd also like to thank the whole Wayland team for their work! If only modules like X that drag Linux down would die faster...



#25
Originally posted by sireangelus View Post
So I do need a freaking EGL to work. So it means endless and countless driver event wakeups. On my laptop, having KDE or Xfce with compositing on and no effects makes for about 1h15m of battery life. Can you tell me if Wayland would provide some mitigation for this issue?

Worst case scenario, you can run Wayland on Gallium. Your laptop probably has terrible battery life because the GPU isn't doing proper c-state switching; any discrete mobile GPU should be able to easily do desktop effects in Xfce in its lowest power band. The alternative is a software renderer, and that is effectively what Gallium's llvmpipe is anyway.



#26
Originally posted by sireangelus View Post
So I do need a freaking EGL to work. So it means endless and countless driver event wakeups. On my laptop, having KDE or Xfce with compositing on and no effects makes for about 1h15m of battery life. Can you tell me if Wayland would provide some mitigation for this issue?

Maybe the following, from the Wayland website, will interest you:

What is the drawing API?

"Whatever you want it to be, honey." Wayland doesn't render on behalf of the clients; it expects the clients to use whatever means they prefer to render into a shareable buffer. When the client is done, it informs the Wayland server of the new contents. The current test clients use either cairo software rendering, cairo on OpenGL, or hardware-accelerated OpenGL directly. As long as you have a userspace driver library that will let you render into a shareable buffer, you're good to go.



#27
Originally posted by phred14 View Post
First, thanks for the informative article. Some damage is done to Wayland by fanbois who call people like me Unix-tards, just because I've had several windows from other machines open all week on my desktop, and am not doing everything locally. Personally, when I say I like "network transparent", I mean that exporting those windows over X has always looked faster to me than exporting them over VNC.

I simply want to use compute cycles over there, display them here, and have it be fast. My job is chip design, not networked display protocols.

Second, I'm almost interpreting your quote to mean that as long as the nVidia/AMD binary blobs support EGL (+ 1 non-standard extension), you can use the same driver that X uses. Is that true? (It sounds too good to be true.)

Basically, yes. Keep in mind EGL is just like OpenGL, OpenGL ES, or DirectX: it's just another video standard, the difference being that EGL is targeted at mobile, like OpenGL ES. Full desktop OpenGL is being worked on with Wayland; the problem is that GLX pulls in all of X, so we need a new GL library for desktop GL.
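
The GLX point is worth illustrating: a context created through GLX drags in Xlib, while EGL can hand out a full desktop OpenGL context with no X dependency at all. A small C sketch (my illustration, not from the post) of asking EGL for desktop GL instead of GLES:

#include <EGL/egl.h>

EGLContext make_desktop_gl_context(EGLDisplay dpy)
{
    /* Select desktop OpenGL rather than the default OpenGL ES. */
    eglBindAPI(EGL_OPENGL_API);

    static const EGLint attribs[] = {
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT, /* desktop GL, not ES */
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint n;
    eglChooseConfig(dpy, attribs, &cfg, 1, &n);

    return eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
}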



#28
Originally posted by schmidtbag View Post
I for some reason have Wayland already installed on my system - some package in Arch requires it (I forget which).

Mesa automatically pulls in Wayland if it was compiled with Wayland support (which it is by default now).



#29
Originally posted by zanny View Post
Worst case scenario, you can run Wayland on Gallium. Your laptop probably has terrible battery life because the GPU isn't doing proper c-state switching; any discrete mobile GPU should be able to easily do desktop effects in Xfce in its lowest power band. The alternative is a software renderer, and that is effectively what Gallium's llvmpipe is anyway.

My laptop is a Samsung Series 3 with an i3 3110M (Ivy Bridge) with Xubuntu and dual graphics, with Bumblebee working correctly and as intended.

It's not terrible battery life at all: with maximum power-saving settings (TLP rules) I can squeeze out 6h of browsing. With compositing and Unity on Ubuntu I had something around 3h; with Sabayon (Bumblebee out of the box) and KDE I could probably get 4h30min tops, though I never had to in that moment. Xubuntu, with all of my battery-life tweaks set, can get me to the 6h10m mark with 5% battery left.



#30
Originally posted by sireangelus View Post
My laptop is a Samsung Series 3 with an i3 3110M (Ivy Bridge) with Xubuntu and dual graphics, with Bumblebee working correctly and as intended.

It's not terrible battery life at all: with maximum power-saving settings (TLP rules) I can squeeze out 6h of browsing. With compositing and Unity on Ubuntu I had something around 3h; with Sabayon (Bumblebee out of the box) and KDE I could probably get 4h30min tops.

To be fair... Unity + Compiz was a complete mess. You'd be a lot better off testing GNOME 3 vs KDE vs E17 for power consumption, just because those are better maintained and optimized.

