The Wayland Situation: Facts About X vs. Wayland


  • #16
    Thanks for the info, Eric; let's hope more people read this article.

    I'll gladly switch to Wayland as soon as either the Catalyst drivers support it, or the open source Radeon drivers support CrossFire. If the open source Radeon drivers catch up with OpenGL 4, I might switch anyway; so far there's nothing I use in Linux that needs a second GPU (the few things that could use it currently don't). I'm a KDE user and I get the impression Wayland is usable with it. By the time either of those wishes is fulfilled, I'm sure Wayland will run fine. The only problem is that the Catalyst devs barely keep up with the latest X server, and CrossFire support in the Radeon driver is considered a low priority, so it could be a while. However, being an end user with just one screen, X is serving me just fine for now.

    I think Wayland's compositing requirement will also help weed out some of the really old systems. It's always nice to have one-size-fits-all, but after two decades, that can really hinder future technologies.

    For some reason I already have Wayland installed on my system; some package in Arch requires it (I forget which).


    • #17
      Originally posted by Ericg View Post
      Mir, Wayland and SurfaceFlinger all have a requirement on an EGL driver. That being said, there is one non-standard extension to EGL that Wayland does want / require. As long as Intel, nVidia, and AMD all have an EGL stack they should, to my knowledge, work just fine across all three, with the small exception that Wayland wants an extra non-standard extension. I think it's buffer_age, but I'd have to double-check that.
      Can you explain how an OpenGL application uses Wayland?

      Is it something like this?
      [application] -> [opengl] -> [wayland buffer] -> [egl] -> [driver]

      Who provides the OpenGL stack? If it is the GPU vendor, will it be able to hook into Wayland without having to open source its driver or GL stack?

      In other words: are proprietary drivers possible without having to rewrite half of Wayland?


      • #18
        Originally posted by TheBlackCat View Post
        I have six questions:

        1. How are top-level windows and sub-surface windows kept synchronized, perhaps using flash in a web-browser as an example?

        2. What happens when part of a sub-surface window is obscured within a top-level window, such as using the scroll bar to move the flash animation above or below the top of the window?

        3. I assume sub-surfaces have to be part of another window, but can they be nested (i.e. a sub-surface window being part of another sub-surface window), or can sub-surface windows only be part of top-level windows?

        4. Do sub-surface windows have complete control over their own buffer, or can top-level windows manipulate one of its sub-surface window buffers before passing it to the compositor?

        5. Why is the coordinate counter 31 bits? That seems like a strange number.

        6. Is the coordinate counter count the total number of pixels, or the pixel along a particular axis? This isn't clear from the description.
        1) Surface and sub-surface windows are kept in sync through the protocol (see the sketch below). I want to say they are kept in lockstep via the CPU, but it could just as easily be a feature of hardware overlays.

        2) If they are handling it the same way they handle minimized windows... they continue to render, so the exact image is available at all times.

        3) Unsure.

        4) Unsure. I know there are security hooks to make sure clients do not mess with each other's buffers. If sub-surface windows are considered part of the same client, then yes, they could manipulate it. If they are separate, then no.

        5) I meant to ask Daniel, but I forgot. X was an odd number too at 15 bits; I can only assume they are using the extra bit for something other than actual counting (presumably the counter is a signed 32-bit integer, which leaves 31 bits for magnitude).

        6) 99% sure it's the total number of pixels, so X and Y together. I didn't find any information to make me think otherwise.
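
        To make answer 1 concrete: the wl_subsurface interface has a synchronized mode in which the child's state is cached and only applied when the parent commits. Here is a minimal sketch of that, assuming `compositor` and `subcompositor` were already bound via wl_registry and that both wl_buffers already hold rendered content; the sizes and the 32,32 offset are made up for illustration.

        Code:
        /* Keeping a sub-surface in lockstep with its parent (wayland-client). */
        #include <wayland-client.h>

        static void commit_in_lockstep(struct wl_compositor *compositor,
                                       struct wl_subcompositor *subcompositor,
                                       struct wl_surface *parent,
                                       struct wl_buffer *parent_buf,
                                       struct wl_buffer *child_buf)
        {
            struct wl_surface *child = wl_compositor_create_surface(compositor);
            struct wl_subsurface *sub =
                wl_subcompositor_get_subsurface(subcompositor, child, parent);

            /* Synchronized mode: the child's pending state is cached and
             * applied only when the parent surface is committed. */
            wl_subsurface_set_sync(sub);
            wl_subsurface_set_position(sub, 32, 32); /* parent coordinates */

            wl_surface_attach(child, child_buf, 0, 0);
            wl_surface_damage(child, 0, 0, 256, 256);
            wl_surface_commit(child);                /* cached, not yet applied */

            wl_surface_attach(parent, parent_buf, 0, 0);
            wl_surface_damage(parent, 0, 0, 800, 600);
            wl_surface_commit(parent);               /* both updates land atomically */
        }

        On question 3, it is worth noting that get_subsurface takes any wl_surface as the parent, which suggests nesting is allowed at the protocol level, though I'd double-check the spec.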


        • #19
          What an awesome writeup!

          As someone who has used X for the longest while, since 1993 to be exact, on some old AIX workstations, I am overjoyed at the prospect of Wayland, and even more so at the kind of cohesive, ego-reduced effort the X & Wayland crew is putting toward finalizing Wayland, as well as XWayland. This couldn't have happened at a better time. It's like Apple switching from OS 9 to OS X: a gigantic (re)design leap forward that will pay dividends for years to come, not to any specific company but to the whole world at large.

          I just wish Canonical hadn't backpedaled on their original Wayland commitment; we'd have gotten to the finish line much sooner. But it is what it is.

          Loved reading in more depth about the technical deficiencies of X and how Wayland addresses them. Thank you Phoronix!


          • #20
            Originally posted by schmidtbag View Post
            Thanks for the info Eric, lets hope more people read this article.

            I for some reason have wayland already installed on my system - some package in Arch requires it (I forget which).
            Probably a toolkit (GTK+ for sure; I don't know about Qt).
            Otherwise I have no idea... maybe you are a tester of KDE 4.11?
            From Martin's G+:
            KWin 4.11 just got a new experimental feature merged in for our users to play with :-)

            I'm not a huge fan of including experimental features, but in this case we need to make an exception. It's the last release of the 4.x series and there are many users around who want to see that we are walking in that direction.
            That experimental feature is a Wayland backend, so if you are playing with KDE 4.11...


            • #21
              Originally posted by Ericg View Post
              Mir, Wayland and SurfaceFlinger all have a requirement on an EGL driver. That being said, there is one non-standard extension to EGL that Wayland does want / require. As long as Intel, nVidia, and AMD all have an EGL stack they should, to my knowledge, work just fine across all three, with the small exception that Wayland wants an extra non-standard extension. I think it's buffer_age, but I'd have to double-check that.
              First, thanks for the informative article. Some damage is done to Wayland by fanbois who call people like me Unix-tards just because I've had several windows from other machines open all week on my desktop and am not doing everything locally. Personally, when I say I like "network transparent", I mean that exporting those windows over X has always looked faster to me than exporting them over VNC.

              I simply want to use compute cycles over there, display them here, and have it be fast. My job is chip design, not networked display protocols.

              Second, I'm almost interpreting your quote to mean that as long as the nVidia/AMD binary blobs support EGL (plus one non-standard extension), you can use the same driver that X uses. Is that true? (It sounds too good to be true.)


              • #22
                So I do need a freaking EGL stack for this to work. Does that mean endless driver wakeups? On my laptop, turning on compositing in KDE or Xfce (with no effects) costs about 1h15m of battery life. Can you tell me if Wayland would provide some mitigation for this issue?


                • #23
                  Originally posted by Temar View Post
                  Can you explain how an OpenGL application uses Wayland?

                  Is it something like this?
                  [application] -> [opengl] -> [wayland buffer] -> [egl] -> [driver]

                  Who provides the OpenGL stack? If it is the GPU vendor, will it be able to hook into Wayland without having to open source its driver or GL stack?

                  In other words: are proprietary drivers possible without having to rewrite half of Wayland?

                  The driver provides the OpenGL stack, and architecturally it works the same way as ever. EGL is just another stack, like the OpenGL or DirectX stacks, so no, vendors don't have to open source their work to hook into Wayland. A sketch of the whole flow follows below.
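
                  Here is a minimal sketch of that layering, using the public wayland-egl and EGL entry points; the vendor's closed EGL/GL libraries sit behind these calls without Wayland needing their source. It assumes `surface` is a wl_surface already created from a wl_compositor bound via the registry, and it omits all error handling; the 800x600 size is made up.

                  Code:
                  #include <wayland-client.h>
                  #include <wayland-egl.h>
                  #include <EGL/egl.h>
                  #include <GLES2/gl2.h>

                  static void draw_frame(struct wl_display *display, struct wl_surface *surface)
                  {
                      /* [egl] -> [driver]: the EGLDisplay is backed by the vendor stack. */
                      EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)display);
                      EGLint major, minor, n;
                      eglInitialize(dpy, &major, &minor);
                      eglBindAPI(EGL_OPENGL_ES_API);

                      static const EGLint cfg_attribs[] = {
                          EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
                          EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                          EGL_NONE
                      };
                      static const EGLint ctx_attribs[] = {
                          EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE
                      };
                      EGLConfig config;
                      eglChooseConfig(dpy, cfg_attribs, &config, 1, &n);
                      EGLContext ctx = eglCreateContext(dpy, config, EGL_NO_CONTEXT, ctx_attribs);

                      /* [wayland buffer]: wl_egl_window wraps the wl_surface so the
                       * driver can allocate buffers for it. */
                      struct wl_egl_window *native = wl_egl_window_create(surface, 800, 600);
                      EGLSurface egl_surface =
                          eglCreateWindowSurface(dpy, config, (EGLNativeWindowType)native, NULL);
                      eglMakeCurrent(dpy, egl_surface, egl_surface, ctx);

                      /* [opengl]: ordinary GL rendering, executed by the vendor's GL stack. */
                      glClearColor(0.2f, 0.3f, 0.4f, 1.0f);
                      glClear(GL_COLOR_BUFFER_BIT);

                      /* Hand the finished buffer to the compositor; no server-side drawing. */
                      eglSwapBuffers(dpy, egl_surface);
                  }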


                  • #24
                    Originally posted by Ericg View Post
                    Open source drivers should all be gold on Wayland; I know Intel is, because that's what is in my laptop.

                    AMD and nVidia... we're going to have to wait and see how they want to play. If they want to say "Screw Wayland! Go Mir!" they could; they'd have to explicitly refuse to support the EGL extension that Wayland requires. But at the same time, Wayland might be able to work around that missing extension should that day come to pass.

                    Ideally, in a year the open source Radeon and Intel graphics drivers should be basically up to par with their Windows brethren (maybe not in OpenGL compliance, but hopefully in power management for Radeon), and we therefore wouldn't NEED the vendors to expressly support Wayland. nVidia is in an interesting position, though, with how crappy (in comparison) Nouveau is.
                    Thank you very much for your answers and the article (this goes to Michael as well!). I'd also like to thank the whole Wayland team for their work! If only modules like X that drag Linux down would die faster...


                    • #25
                      Originally posted by sireangelus View Post
                      So I do need a freaking EGL stack for this to work. Does that mean endless driver wakeups? On my laptop, turning on compositing in KDE or Xfce (with no effects) costs about 1h15m of battery life. Can you tell me if Wayland would provide some mitigation for this issue?
                      Worst case scenario, you can run Wayland on Gallium. Your laptop probably has terrible battery life because the GPU isn't doing proper C-state switching; any discrete mobile GPU should easily handle desktop effects in Xfce in its lowest power band. The alternative is a software renderer, and that is effectively what Gallium is anyway.


                      • #26
                        Originally posted by sireangelus View Post
                        So I do need a freaking EGL stack for this to work. Does that mean endless driver wakeups? On my laptop, turning on compositing in KDE or Xfce (with no effects) costs about 1h15m of battery life. Can you tell me if Wayland would provide some mitigation for this issue?
                        Maybe the following, from the Wayland website, will interest you:

                        What is the drawing API?

                        "Whatever you want it to be, honey." Wayland doesn't render on behalf of the clients; it expects the clients to use whatever means they prefer to render into a shareable buffer. When the client is done, it informs the Wayland server of the new contents. The current test clients use either cairo software rendering, cairo on OpenGL, or hardware-accelerated OpenGL directly. As long as you have a userspace driver library that will let you render into a shareable buffer, you're good to go.

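                        As a hedged sketch of the simplest such path, here is the software route through wl_shm: render pixels on the CPU into shared memory, then inform the server. It assumes `shm` (a bound wl_shm) and `surface` exist already; the 320x240 size, the fill color, and the shm name are made up, and error handling is omitted.

                        Code:
                        #include <wayland-client.h>
                        #include <sys/mman.h>
                        #include <fcntl.h>
                        #include <unistd.h>
                        #include <stdint.h>

                        static void attach_shm_frame(struct wl_shm *shm, struct wl_surface *surface)
                        {
                            const int width = 320, height = 240, stride = width * 4;
                            const int size = stride * height;

                            /* A shared-memory file that both client and compositor can map. */
                            int fd = shm_open("/wl-example", O_CREAT | O_EXCL | O_RDWR, 0600);
                            shm_unlink("/wl-example");
                            ftruncate(fd, size);
                            uint32_t *pixels = mmap(NULL, size, PROT_READ | PROT_WRITE,
                                                    MAP_SHARED, fd, 0);

                            /* "Whatever means they prefer": here, plain CPU writes. */
                            for (int i = 0; i < width * height; i++)
                                pixels[i] = 0xff3366aa;

                            struct wl_shm_pool *pool = wl_shm_create_pool(shm, fd, size);
                            struct wl_buffer *buffer = wl_shm_pool_create_buffer(
                                pool, 0, width, height, stride, WL_SHM_FORMAT_XRGB8888);

                            /* Inform the server of the new contents. */
                            wl_surface_attach(surface, buffer, 0, 0);
                            wl_surface_damage(surface, 0, 0, width, height);
                            wl_surface_commit(surface);

                            wl_shm_pool_destroy(pool);
                            close(fd);
                        }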

                        • #27
                          Originally posted by phred14 View Post
                          First, thanks for the informative article. Some damage is done to Wayland by fanbois who call people like me Unix-tards just because I've had several windows from other machines open all week on my desktop and am not doing everything locally. Personally, when I say I like "network transparent", I mean that exporting those windows over X has always looked faster to me than exporting them over VNC.

                          I simply want to use compute cycles over there, display them here, and have it be fast. My job is chip design, not networked display protocols.

                          Second, I'm almost interpreting your quote to mean that as long as the nVidia/AMD binary blobs support EGL (plus one non-standard extension), you can use the same driver that X uses. Is that true? (It sounds too good to be true.)
                          Basically, yes. Keep in mind EGL is just like OpenGL, OpenGL ES, or DirectX: just another graphics standard, the difference being that EGL is targeted at mobile, like OpenGL ES. Full desktop OpenGL on Wayland is being worked on; the problem is that GLX pulls in all of X, so we need a new GL binding library for desktop GL. The sketch below shows the relevant pieces.
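
                          Two hedged follow-ups against plain EGL: how a client could check for the extra extension mentioned earlier (buffer_age was a guess, not confirmed), and how desktop GL would be requested through EGL instead of GLX. `dpy` is an initialized EGLDisplay as in the earlier sketch.

                          Code:
                          #include <EGL/egl.h>
                          #include <string.h>
                          #include <stdbool.h>

                          static bool egl_has_extension(EGLDisplay dpy, const char *name)
                          {
                              /* Naive substring match over the space-separated extension list. */
                              const char *exts = eglQueryString(dpy, EGL_EXTENSIONS);
                              return exts != NULL && strstr(exts, name) != NULL;
                          }

                          static bool setup_gl_api(EGLDisplay dpy)
                          {
                              if (egl_has_extension(dpy, "EGL_EXT_buffer_age")) {
                                  /* Client may reuse back buffers and repaint only damage. */
                              }

                              /* EGL can bind desktop OpenGL, not just GLES; the missing piece
                               * is library plumbing around this path, not the driver itself. */
                              return eglBindAPI(EGL_OPENGL_API) == EGL_TRUE;
                          }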


                          • #28
                            Originally posted by schmidtbag View Post
                            For some reason I already have Wayland installed on my system; some package in Arch requires it (I forget which).
                            Mesa automatically pulls in Wayland if it was compiled with Wayland support (which it is by default now).


                            • #29
                              Originally posted by zanny View Post
                              Worst case scenario, you can run Wayland on Gallium. Your laptop probably has terrible battery life because the GPU isn't doing proper C-state switching; any discrete mobile GPU should easily handle desktop effects in Xfce in its lowest power band. The alternative is a software renderer, and that is effectively what Gallium is anyway.
                              My laptop is a Samsung Series 3 with an i3-3110M (Ivy Bridge), running Xubuntu with dual graphics and Bumblebee working correctly and as intended.

                              It's not terrible battery life at all: with maximum power-saving settings (TLP rules) I can squeeze out 6h of browsing. With compositing and Unity on Ubuntu I had something around 3h; with Sabayon (Bumblebee out of the box) and KDE I could probably get 4h30min tops, though I never had to try at the time. Xubuntu, with all of my battery-life tweaks set, can get me to the 6h10m mark with 5% battery left.
                              Last edited by sireangelus; 06-07-2013, 02:22 PM.


                              • #30
                                Originally posted by sireangelus View Post
                                My laptop is a Samsung Series 3 with an i3-3110M (Ivy Bridge), running Xubuntu with dual graphics and Bumblebee working correctly and as intended.

                                It's not terrible battery life at all: with maximum power-saving settings (TLP rules) I can squeeze out 6h of browsing. With compositing and Unity on Ubuntu I had something around 3h; with Sabayon (Bumblebee out of the box) and KDE I could probably get 4h30min tops.
                                To be fair... Unity + Compiz was a complete mess. You'd be a lot better off testing GNOME 3 vs. KDE vs. E17 for power consumption, just because those are better maintained and optimized.
