Moving On From An X.Org World To Wayland

  • elanthis
    replied
    Originally posted by finalzone View Post
    Wayland is part of X.org. Wayland developers are X.org developers as well.
    http://lwn.net/Articles/539620/
    X.org is both an organization (the X.org Foundation) and a software project (the X.org server). Wayland is developed by the folks at X.org, but it is a very different piece of software from the X.org server. The software release is what people are talking about, not the organization, as the organization itself doesn't really do anything.

    It's similar to how "Apache" refers both to the Apache Software Foundation and to the Apache HTTPD server, depending on context.

  • finalzone
    replied
    Wayland is part of X.org. Wayland developers are X.org developers as well.

  • curaga
    replied
    I thought Theora and VP8 were natively supported in Firefox? Has that changed?

  • Thaodan
    replied
    Originally posted by kitsune
    Gstreamer?
    The mplayer/gnome-mplayer/gecko-mediaplayer triplet doesn't need GStreamer ;P

    Anyway, if the objects/buffers from the client were 24-bit pixel data plus an 8-bit alpha channel, then sub-pixel client-side rendering of fonts shouldn't be a problem for the compositor.
    Right?
    To enable video support without plug-ins you must build with --with-gstreamer.

  • Thaodan
    replied
    Originally posted by kitsune
    I have no problem playing this in Firefox...
    What ancient/buggy video plug-in are you using?
    He didn't enable GStreamer when building :P

  • frign
    replied
    Originally posted by Ericg View Post
    Yes, every frame is perfect, because the clients control what's in the buffers. If there's something wrong in the buffers, then they (or the graphics drivers) fucked up. All Wayland does is take pointers and buffers and display their contents. How they got there, and what's in them (but not WHO put what in there; Wayland keeps close tabs on buffer security), doesn't matter to the protocol.

    And X is complex because they wanted it to be as platform-independent as possible; they were writing an operating system ON TOP OF an existing operating system (whatever flavor of Unix you ran). That complexity is a bad thing. Wayland has the right idea: the parts that can never break (Wayland) have to be minimal, so that one mistake doesn't impact a trillion other things. Wayland is made to get out of the way, and anything "complex" (such as multiple GPUs) is "a client problem."

    If we ever hit a big change-up in the way we do graphics (like Optimus) again in the future, it will help to ensure that the protocol isn't the problem. With X + Optimus the protocol WAS, and to an extent IS, the problem. Because instead of cluttering up the protocol, we just introduce new libraries and new clients, and they handle the changes. All Wayland wants is pointers and buffers, and a display to shove their contents onto.
    Not to forget the many radical in-kernel changes which required a lot of reworking in X.

  • Ericg
    replied
    Originally posted by frign View Post
    I am not completely into the Wayland spec, but I am certain this is part of it. How did the devs put it? Every frame is perfect. And judging from my tests with GL applications (like glgears), this works well.
    Yes, every frame is perfect, because the clients control what's in the buffers. If there's something wrong in the buffers, then they (or the graphics drivers) fucked up. All Wayland does is take pointers and buffers and display their contents. How they got there, and what's in them (but not WHO put what in there; Wayland keeps close tabs on buffer security), doesn't matter to the protocol.

    And X is complex because they wanted it to be as platform-independent as possible; they were writing an operating system ON TOP OF an existing operating system (whatever flavor of Unix you ran). That complexity is a bad thing. Wayland has the right idea: the parts that can never break (Wayland) have to be minimal, so that one mistake doesn't impact a trillion other things. Wayland is made to get out of the way, and anything "complex" (such as multiple GPUs) is "a client problem."

    If we ever hit a big change-up in the way we do graphics (like Optimus) again in the future, it will help to ensure that the protocol isn't the problem. With X + Optimus the protocol WAS, and to an extent IS, the problem. Because instead of cluttering up the protocol, we just introduce new libraries and new clients, and they handle the changes. All Wayland wants is pointers and buffers, and a display to shove their contents onto.
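
    To make the "pointers and buffers" point concrete, below is a minimal C sketch of a Wayland client handing the compositor one finished frame through shared memory. It is only an illustration, not reference code from either project: it assumes libwayland-client is installed, skips all error handling, and omits the wl_shell/xdg_shell role a real client needs before the compositor will actually map the surface on screen.

        #include <fcntl.h>
        #include <stdint.h>
        #include <string.h>
        #include <sys/mman.h>
        #include <unistd.h>
        #include <wayland-client.h>

        #define WIDTH  320
        #define HEIGHT 240
        #define STRIDE (WIDTH * 4)               /* 4 bytes per pixel, XRGB8888 */

        static struct wl_compositor *compositor; /* filled in from the registry */
        static struct wl_shm *shm;

        static void registry_global(void *data, struct wl_registry *reg,
                                    uint32_t name, const char *iface, uint32_t ver)
        {
            if (strcmp(iface, "wl_compositor") == 0)
                compositor = wl_registry_bind(reg, name, &wl_compositor_interface, 1);
            else if (strcmp(iface, "wl_shm") == 0)
                shm = wl_registry_bind(reg, name, &wl_shm_interface, 1);
        }

        static void registry_global_remove(void *data, struct wl_registry *reg,
                                           uint32_t name) { }

        static const struct wl_registry_listener registry_listener = {
            registry_global, registry_global_remove
        };

        int main(void)
        {
            struct wl_display *display = wl_display_connect(NULL);
            struct wl_registry *registry = wl_display_get_registry(display);
            wl_registry_add_listener(registry, &registry_listener, NULL);
            wl_display_roundtrip(display);       /* wait for the globals to arrive */

            /* The client owns the pixels: allocate shared memory and draw a frame. */
            int size = STRIDE * HEIGHT;
            int fd = shm_open("/wl-sketch", O_RDWR | O_CREAT | O_EXCL, 0600);
            shm_unlink("/wl-sketch");
            ftruncate(fd, size);
            uint32_t *pixels = mmap(NULL, size, PROT_READ | PROT_WRITE,
                                    MAP_SHARED, fd, 0);
            for (int i = 0; i < WIDTH * HEIGHT; i++)
                pixels[i] = 0xff336699;          /* the "perfect" frame, fully drawn */

            /* Wrap the memory in a wl_buffer and hand the compositor a handle to it. */
            struct wl_shm_pool *pool = wl_shm_create_pool(shm, fd, size);
            struct wl_buffer *buffer = wl_shm_pool_create_buffer(
                pool, 0, WIDTH, HEIGHT, STRIDE, WL_SHM_FORMAT_XRGB8888);

            struct wl_surface *surface = wl_compositor_create_surface(compositor);
            wl_surface_attach(surface, buffer, 0, 0);
            wl_surface_damage(surface, 0, 0, WIDTH, HEIGHT);
            wl_surface_commit(surface);          /* atomic: the whole frame, or nothing */

            wl_display_roundtrip(display);
            wl_display_disconnect(display);
            return 0;
        }

    Note how the protocol itself never sees drawing commands: the client could have filled that buffer with Cairo, OpenGL, or anything else, and the compositor only receives attach, damage, and commit.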

  • frign
    replied
    Sub-Pixel Rendering

    Originally posted by newwen View Post
    My point is that clients cannot render sub-pixels correctly to buffers if they don't know what context they are rendering to. I don't know if the X server actually takes that into account when rendering, but ideally, clients could give the server context-independent commands (as in PostScript), which are then transformed and rendered by the server. Of course, this is not as fast as direct rendering by the client.
    I am not completely into the Wayland spec, but I am certain this is part of it. How did the devs put it? Every frame is perfect. And judging from my tests with GL applications (like glgears), this works well.

  • newwen
    replied
    Originally posted by frign View Post
    No, it won't be.
    Context resolution mainly happens in the appropriate graphics drivers, which handle their own context (even with multiple screens and modes).
    It is the task of the compositor to tell the drivers what to do, so the client-side implementation makes sense. No one really stops you from writing a lib that makes this handling easy.
    I am sure it would be simpler than the bloatware that the Xorg server is in many cases.
    My point is that clients cannot render sub-pixels correctly to buffers if they don't know what context they are rendering to. I don't know if the X server actually takes that into account when rendering, but ideally, clients could give the server context-independent commands (as in PostScript), which are then transformed and rendered by the server. Of course, this is not as fast as direct rendering by the client.
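
    For what it's worth, the core protocol does give clients that context: the wl_output global advertises each monitor's sub-pixel order and output transform in its geometry event, so a toolkit can pick per-output font-rendering settings (clone mode across panels with different geometries remains the genuinely awkward case, since one buffer feeds both). Below is a rough C sketch of reading that information; watch_output is a made-up helper, and it assumes the wl_output proxy was bound at version 1 from the registry, just like wl_compositor and wl_shm in the earlier sketch.

        #include <stdint.h>
        #include <stdio.h>
        #include <wayland-client.h>

        /* Called once per output with its physical geometry, sub-pixel order,
         * and output transform (rotation/flip). */
        static void output_geometry(void *data, struct wl_output *output,
                                    int32_t x, int32_t y,
                                    int32_t phys_w, int32_t phys_h,
                                    int32_t subpixel,
                                    const char *make, const char *model,
                                    int32_t transform)
        {
            const char *order =
                subpixel == WL_OUTPUT_SUBPIXEL_HORIZONTAL_RGB ? "RGB"  :
                subpixel == WL_OUTPUT_SUBPIXEL_HORIZONTAL_BGR ? "BGR"  :
                subpixel == WL_OUTPUT_SUBPIXEL_VERTICAL_RGB   ? "vRGB" :
                subpixel == WL_OUTPUT_SUBPIXEL_VERTICAL_BGR   ? "vBGR" :
                subpixel == WL_OUTPUT_SUBPIXEL_NONE           ? "none" : "unknown";

            /* A real toolkit would feed this into its font rasteriser
             * (e.g. fontconfig/FreeType sub-pixel settings) instead of printing it. */
            printf("%s %s: subpixel=%s transform=%d\n",
                   make, model, order, (int)transform);
        }

        static void output_mode(void *data, struct wl_output *output, uint32_t flags,
                                int32_t width, int32_t height, int32_t refresh)
        {
            /* video modes; not needed for sub-pixel handling */
        }

        static const struct wl_output_listener output_listener = {
            .geometry = output_geometry,
            .mode     = output_mode,
        };

        /* Hypothetical helper: `output` is a struct wl_output * bound at version 1
         * from the registry, the same way wl_compositor/wl_shm were bound above. */
        void watch_output(struct wl_display *display, struct wl_output *output)
        {
            wl_output_add_listener(output, &output_listener, NULL);
            wl_display_roundtrip(display);       /* geometry arrives during this */
        }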

  • frign
    replied
    Messed up?

    Originally posted by newwen View Post
    How does Wayland handle multiple screens in "clone mode" with different subpixel geometries?

    If the client is responsible for anti-aliasing and sub-pixel rendering, and you have different kinds of monitors connected to your graphics card, or some kind of transformation applied to one of them, the image will be fucked up for one of them.

    Rendering performed by clients should be abstracted from output devices (the way PostScript is for printers), and actual rendering should happen on the server.

    There's a reason X11 is complex, and I'm growing less convinced that Wayland is a good solution for Linux graphics.

    No, it won't be.
    Context resolution mainly happens in the appropriate graphics drivers, which handle their own context (even with multiple screens and modes).
    It is the task of the compositor to tell the drivers what to do, so the client-side implementation makes sense. No one really stops you from writing a lib that makes this handling easy.
    I am sure it would be simpler than the bloatware that the Xorg server is in many cases.
