Wayland's Weston Gets A Remoting Plugin For Virtual Output Streaming

  • Wayland's Weston Gets A Remoting Plugin For Virtual Output Streaming

    Phoronix: Wayland's Weston Gets A Remoting Plugin For Virtual Output Streaming

    While Wayland/Weston development might be quieting down a bit for now with Samsung OSG closing up shop, as they were one of the major drivers of this stack in recent years, fortunately other developers remain. Tomohito Esaki of IGEL endpoint management solutions has introduced a remoting plugin with output streaming for Weston...


  • #2
    Mutter needs something like this, but with support for Spice clients, so audio and USB would also be covered.



    • #3
      Does anyone know if this is like VNC (rasterized) or if it is like X11/RDP (intelligent protocol)?

      If it is an intelligent protocol then that makes it useful for accessing graphics from a server (or emulator) providing a GPU (physical passthrough or emulated).
      If it is just rasterization, it will simply say "no display adapter found", and it is much less useful than even the ancient X11 (or X10!) design.

      My biggest worry about Weston is that it regresses in all these fantastic features that the smart people of early UNIX developed.



      • #4
        Originally posted by kpedersen View Post
        Does anyone know if this is like VNC (rasterized) or if it is like X11/RDP (intelligent protocol)?

        If it is an intelligent protocol then that makes it useful for accessing graphics from a server (or emulator) providing a GPU (physical passthrough or emulated).
        If it is just rasterization, it will simply say no display adapter found and much less useful than even the ancient X11 (or X10!) design.

        My biggest worry about Weston is that it regresses in all these fantastic features that the smart people of early UNIX developed.
        The article made it rather clear that JPEG images are sent over a stream, so rasterized like VNC, I would say.

        You can get dummy display plugs to have an output rendered in the circumstance you're bringing up; they cost a few dollars each, and I use one for a GPU passthrough VM. Alternatively, there is that recent virtual display driver addition (in the kernel, I think, IIRC) that should let you create a fake output to render to, avoiding the "no display adapter" issue.
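        The VNC-style scheme the article describes (re-encode and send only the damaged regions of each frame) can be sketched as a damage-tracking loop. This is a toy model, not Weston's actual code: the tile size, the plain-list frame representation, and the "encode and send" stand-in are all illustrative assumptions.

```python
# Toy sketch of damage-based frame streaming: compare two frames tile by
# tile and "send" only the tiles that changed. A real remoting plugin
# would encode each dirty tile (e.g. as JPEG) and push it over RTP.

TILE = 16  # tile edge length in pixels (an illustrative assumption)

def dirty_tiles(prev, curr):
    """Yield (x, y) of tiles whose pixels changed between two frames."""
    h, w = len(curr), len(curr[0])
    for ty in range(0, h, TILE):
        for tx in range(0, w, TILE):
            for y in range(ty, min(ty + TILE, h)):
                if prev[y][tx:tx + TILE] != curr[y][tx:tx + TILE]:
                    yield (tx, ty)
                    break  # one changed row is enough to mark the tile

def stream_frame(prev, curr, send):
    """Send only the damaged tiles; return how many were sent."""
    n = 0
    for tx, ty in dirty_tiles(prev, curr):
        send((tx, ty))  # stand-in for: encode tile as JPEG, send over RTP
        n += 1
    return n

if __name__ == "__main__":
    prev = [[0] * 64 for _ in range(32)]
    curr = [row[:] for row in prev]
    curr[5][5] = 1          # one changed pixel -> exactly one dirty tile
    sent = []
    print(stream_frame(prev, curr, sent.append))  # -> 1
    print(sent)                                   # -> [(0, 0)]
```

        An unchanged frame costs nothing on the wire, which is why this approach works well for mostly static desktops.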



        • #5
          So every one of the gazillion Wayland implementations/desktop environments is going to have its own incompatible remote desktop technology. It's not like X, where you can use the same x11vnc server with any desktop environment you want. Wow, that's a big improvement; what did we do without Wayland? Plus Wayland has no app<->server network transparency like X has. Great, just great.

          Really, what's the point of Wayland? It doesn't do anything better than X does, it's more insecure (video hardware drivers and GPU attack surface in applications), and it doesn't have the basic features X has had for years. It's monolithic, with the window manager and display server wrapped up in one big ugly mess, rather than the nice clean window manager/display server separation in X, which allows you to change your window manager on the fly. In every way it's a downgrade from X. It seems like with Wayland there is a lot of useless wheel reinventing going on to recreate features that X has had for years, instead of developers focusing on something Linux does not have yet. What a waste of time and effort.



          • #6
            Originally posted by kpedersen View Post
            Does anyone know if this is like VNC (rasterized) or if it is like X11/RDP (intelligent protocol)?

            If it is an intelligent protocol then that makes it useful for accessing graphics from a server (or emulator) providing a GPU (physical passthrough or emulated).
            If it is just rasterization, it will simply say no display adapter found and much less useful than even the ancient X11 (or X10!) design.

            My biggest worry about Weston is that it regresses in all these fantastic features that the smart people of early UNIX developed.
            Have you ever tried streaming a modern app with X11, though? Say a game, web browser or a video player. Heck, even a spreadsheet app!

            This doesn't work well at all, and I'd rather have a video stream with a modern codec like H.264/H.265/VP9/AV1 (with hardware encoding and decoding). Only damaged/changed parts are sent over the wire, so that should work very well.
            Even plain X11 apps are unsatisfactory for me when they update hundreds of widgets, which can take a few seconds before the window is responsive.

            This is an interesting use case for Miracast-type technology. Run a server on your desktop, a client on your laptop (or vice-versa, the terminology is up to you), and use your laptop's screen as an external display. I can see a few other ways to achieve the same effect, but that would be interesting nevertheless.

            Of course, best would be to add a Wayland server to sshd, with ssh acting as a Wayland client (something similar could also be achieved without modifying sshd). Then add a Wayland protocol for toolkits to announce and negotiate custom remoting capabilities, and you've found yourself an alternative that's a whole lot better than X11. I am certain something like this will surface in due time!

            In the meantime, I find VNC-type streaming almost superior to X11, with the exception of a few nitpicks. A basic Wayland-forwarding program would already be nice: it would forward some of the protocol (server-side decoration negotiation, for instance) and stream back the images. That actually makes me sort of want to try and write one.

            Edit, for jpg44's post: an architecture such as the one described above would work with any Wayland-compatible server and client. And yes, Wayland does solve a lot of issues (I could quote some, but that has been done and redone already); and the mere fact that multiple implementations of the protocol exist proves how simple and understandable it is.
            I can't comprehend why you are bringing up the attack surface in a Wayland vs. X post, yet manage to say that X is superior in that regard? Seriously? And I would be glad to be enlightened with specific examples of how wayland makes the system more vulnerable.

            One point I will concede you is that we've spent time reimplementing stuff that worked "fine" (most of the time) before. But your argumentation points are really similar to a copper-vs-fiber argument, or a charcoal-vs-electricity one.
            Last edited by M@yeulC; 30 October 2018, 09:43 AM.



            • #7
              Originally posted by M@yeulC View Post

              edit for jpg44 's post: an architecture such as the one described above would work with any wayland-compatible server and client. And yes, Wayland does solve a lot of issues (I could quote some, but that has been done and redone already); and the mere fact that multiple implementations of the protocol exist proves how simple and understandable it is.
              I can't comprehend why you are bringing up the attack surface in a Wayland vs. X post, yet manage to say that X is superior in that regard? Seriously? And I would be glad to be enlightened with specific examples of how wayland makes the system more vulnerable.

              One point I will concede you is that we've spent time reimplementing stuff that worked "fine" (most of the time) before. But your argumentation points are really similar to a copper-vs-fiber argument, or a charcoal-vs-electricity one.
              An X application using the GLX protocol extension indeed does not have any contact with the video hardware, so it does provide an extra layer of security. With video hardware access in the application, you trust that the GPU will be bug-free and will carefully enforce access controls on what an app can do.

              X has had security issues, but extensions can be written to fix them. One weakness of X has been that X clients can introspect other X clients' windows. A security extension allowing fine-grained control over what X clients can access, granting only certain X clients access to another's windows, input events, etc., could be handy. Some programs, like a screenshot app or the window manager, need such access, but others do not.

              What would make Wayland more palatable is app<->server network transparency. What I gather is that Wayland applications have a DRI video driver in them, that OpenGL commands are sent from the app to the video driver and then directly to the video hardware, and that the result is painted to a video buffer in the GPU. The Wayland display server then composites all of the buffers together on the GPU. Hopefully the driver a Wayland application uses can be switched at runtime.

              A "network transparency" driver could be provided that is loaded into the Wayland app and sends OpenGL calls over a stream protocol to a server; that server would contain the actual video drivers to forward commands to, and would also forward Wayland commands to the Wayland server. This would provide app<->server network transparency.
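              The forwarding driver described above could look roughly like this: an app-side stub serializes each GL call instead of touching the GPU, and a display-side endpoint replays the calls against the real driver. Everything here (the JSON-lines wire format, the class names, the fake driver) is a hypothetical sketch, not an existing project.

```python
# Toy sketch of a "network transparency" driver: the app-side stub records
# drawing commands as messages; a remote endpoint with the real driver
# decodes and executes them.
import json

class RemoteGLStub:
    """App-side stand-in for the GL library: serializes calls as messages."""
    def __init__(self, send):
        self._send = send

    def __getattr__(self, name):
        def call(*args):
            self._send(json.dumps({"cmd": name, "args": list(args)}))
        return call

class GLReplayServer:
    """Display-side endpoint: decodes messages and invokes the real driver."""
    def __init__(self, real_driver):
        self._driver = real_driver

    def handle(self, message):
        msg = json.loads(message)
        getattr(self._driver, msg["cmd"])(*msg["args"])

if __name__ == "__main__":
    wire = []                         # stand-in for the network stream
    stub = RemoteGLStub(wire.append)
    stub.glClearColor(0, 0, 0, 1)     # the app "renders" through the stub
    stub.glClear(16384)

    executed = []
    class FakeDriver:                 # stand-in for the actual GL driver
        def __getattr__(self, name):
            return lambda *a: executed.append((name, list(a)))

    server = GLReplayServer(FakeDriver())
    for m in wire:
        server.handle(m)
    print(executed)  # -> [('glClearColor', [0, 0, 0, 1]), ('glClear', [16384])]
```

              A real implementation would also have to marshal buffers, textures, and sync objects, which is where most of the complexity (and the bandwidth cost) lives.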

              Another desirable thing is a "rootless" Wayland server that can display individual Wayland app windows on an X server, so that Wayland apps can be used in an X desktop environment.



              • #8
                Originally posted by kpedersen View Post
                Does anyone know if this is like VNC (rasterized) or if it is like X11/RDP (intelligent protocol)?
                Intelligence of X11:
                1. draw this rasterized bitmap at position x/y.
                2. have you finished drawing the previous bitmap?
                3. ok, here is the next bitmap (go back to 1., repeat for every changed screen area)
                X11 can do more, but nobody is interested in drawing solid rectangles and non-antialiased lines and text ...
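                The cost of the ack-per-bitmap loop above (steps 1-3) can be put into rough numbers: with a round-trip acknowledgement per update, throughput is capped by latency, while a protocol that streams damage back-to-back is capped by bandwidth. All figures below are illustrative assumptions, not measurements.

```python
# Toy model of a synchronous draw/ack loop versus back-to-back streaming.

def acked_updates_per_s(rtt_ms):
    """Update rate when every bitmap waits for a round-trip ack (steps 1-3)."""
    return 1000.0 / rtt_ms

def streamed_updates_per_s(update_kbytes, bandwidth_mbit_s):
    """Update rate when updates are streamed without waiting, bandwidth-bound."""
    return (bandwidth_mbit_s * 1e6 / 8) / (update_kbytes * 1000)

if __name__ == "__main__":
    print(acked_updates_per_s(50))          # 50 ms RTT -> 20.0 updates/s
    print(streamed_updates_per_s(64, 100))  # 64 KB updates at 100 Mbit/s -> 195.3125 updates/s
```

                On a LAN the difference barely matters; over a high-latency link the synchronous loop is what makes X11 feel sluggish.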

                RDP can do a lot more (including extra channels for e.g. device/audio/... forwarding, but let's concentrate on the graphics part):
                1. it sends changed areas of the screen to the client:
                  - it uses caching/deltas based on the old content
                  - it provides several optimized encodings for rasterized screen contents, especially text on a solid background
                2. it can forward media streams (e.g. h.264) to the client

                1. is architecturally not very different from VNC, only more evolved. For both VNC and RDP the different encodings are optional; sending raw rasterized fullscreen updates is completely compliant for both.

                2. is currently not supported by any application on the Linux side. For RDP on Windows, too, this has to be specifically supported by the application.

                If it is an intelligent protocol then that makes it useful for accessing graphics from a server (or emulator) providing a GPU (physical passthrough or emulated).
                If it is just rasterization, it will simply say no display adapter found and much less useful than even the ancient X11 (or X10!) design.

                My biggest worry about Weston is that it regresses in all these fantastic features that the smart people of early UNIX developed.
                You know how game streaming services work? They render into an offscreen bitmap, compress it, and send it to the client.

                Streaming the rendered content takes about 10 Mbit/s; streaming the to-be-rendered data would take several GByte/s (once you move beyond glxgears).
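                That gap can be sanity-checked with quick arithmetic: even the uncompressed rendered frames, before you count geometry and textures, are hundreds of times larger than an encoded stream. Resolution, bit depth, frame rate, and the 10 Mbit/s figure below are illustrative assumptions.

```python
# Back-of-the-envelope bandwidth check: uncompressed 1080p60 frames
# versus a typical ~10 Mbit/s encoded stream.

def raw_stream_gbit_s(width, height, bytes_per_pixel, fps):
    """Bandwidth of streaming uncompressed rendered frames, in Gbit/s."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

if __name__ == "__main__":
    raw = raw_stream_gbit_s(1920, 1080, 4, 60)
    print(round(raw, 2))            # -> 3.98 (Gbit/s, uncompressed)
    print(round(raw * 1000 / 10))   # -> 398 (times larger than 10 Mbit/s)
```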



                • #9
                  Originally posted by jpg44 View Post
                  An X application which is using the GLX protocol extension indeed does not have any contact with video hardware, it kind of does provide an extra layer of security. With Video Hardware in the application, you trust the GPU will be bug free and will carefully provide access controls for what an app can do.
                  AFAIK applications were calling OpenGL on their own and sending X pre-rendered frames. Do you have some examples of applications using GLX?



                  • #10
                    Originally posted by jpg44 View Post

                    An X application which is using the GLX protocol extension indeed does not have any contact with video hardware, it kind of does provide an extra layer of security. With Video Hardware in the application, you trust the GPU will be bug free and will carefully provide access controls for what an app can do.

                    X has had security issues, but extensions can be done to fix them. . One weakness of X has been the fact is that X clients could introspect other X clients windows. A security extension to allow fine grained control over what X clients can access and give only certain X clients access to anothers windows, input events, etc, could be handy. Some programs like a screenshot app or the window manager need such access but others do not.

                    What would make Wayland more palatable is to have app<->server network transparency. What i gather is that wayland applications have a DRI video driver in them and that OpenGL commands are sent to the video driver and then to video hardware directly from the app, painted to a video buffer in GPU. The Wayland Display Server composites all of the buffers together in the GPU. Hopefully the driver that Wayland application uses can be switched at runtime.

                    A "network transparency" driver could be provided that is loaded into Wayland app that would send OpenGL calls over a stream protocol to a server which would then contain actual video drivers to forward commands to and would also forward Wayland commands to Wayland server. This would provide app<->server network transparency.

                    Also another desirable thing is a "rootless" wayland server that can display individual Wayland app windows onto an X server so that Wayland apps can be used in an X desktop environment.
                    Thank you for the thoughtful answer.

                    What you said about extensions is true. However, this would break many programs (and does, when I ssh -X instead of ssh -Y), so it isn't really possible to fix within X without breaking compatibility. IIRC, that was one of the considerations that led to Wayland being developed.

                    Take the following with a grain of salt, as I am not too sure about GLX's implementation:
                    As far as I know, apps usually perform OpenGL calls through a shared library (Mesa, for instance) that uses DRI itself. You could probably sandbox DRI from the application if you wanted (or even use something like VirtualGL).
                    You're technically right that "wayland applications have a DRI video driver in them", if you mean the shared library that's being loaded inside the application. I get your point about GLX providing an extra layer of isolation. However, I believe most X applications behave the same as Wayland ones, so regarding "direct access" to hardware it remains exactly the same under Wayland. And you always have to trust the OpenGL driver not to lock up your GPU in any case (in theory, you should also trust the kernel drivers to prevent that in the first place, and to recover if it ever happens).
                    Display access and control through DRI is restricted to only one app, I think (if you exclude DRI leases), so a random application messing with it shouldn't be a concern.

                    As far as I know, the shared-library approach makes it impossible to change the OpenGL driver at runtime (after the application has started). Plus, it would be a lot of work in any case (including with GLX) to carry all of the context state over to another driver... although it could probably be achievable across multiple Mesa drivers (?). But the window system doesn't matter at all, AFAIK.

                    I don't think it would take much modification of VirtualGL to achieve what you are describing. However, I have to question the utility of such a thing: usually you want to perform the rendering on the (often much more powerful) machine you're launching a program on, rather than the one it is being displayed on. And modern OpenGL usage (especially in modern games) makes it far less bandwidth-intensive to stream the rendered frames than the primitives (textures, models, shaders, etc.), as far as I know; and that's not counting the increased latency, which would affect vsync, compute shaders, etc. "Network transparency" can instead be achieved for the user, if not at the protocol level.

                    I agree with the last part regarding a library that allows Wayland applications to be displayed under X. That doesn't seem extremely complicated to me.

                    Regarding GLX, the library is linked to libGLdispatch.so.0 on my system, which might or might not load additional "DRI drivers". Unfortunately, I can't say much about its mechanisms short of reading the code.

                    I based some of my answers on https://en.wikipedia.org/wiki/GLX (esp. the graphs), https://en.wikipedia.org/wiki/Waylan...endering_model and https://github.com/NVIDIA/libglvnd
                    The more I write, the more I realize how little I know... But I feel like most of these points still stand.

