LXDE-Based Lubuntu Will Not Ship Mir Display Server

  • #21
    Does anyone know if running LXDE or XFCE on top of Mir/XMir would avoid video tearing? (Both of these desktops tear when running on native X without any OpenGL compositing.) IMO that would be reason enough for Lubuntu and Xubuntu to consider XMir, if it did fix the video-tearing issue.

    Comment


    • #22
      Originally posted by Ericg View Post
      No.. I translate Red Hat backing Wayland as GTK saying "Screw Mir." Red Hat doesn't have complete control over GTK, but they have enough control and influence that if RH told the head devs to "forget" about pulling the patches or to "misplace" the links to the patches, I have little doubt that it'd happen.
      I really need to learn to stop posting late at night... Correction bolded.
      All opinions are my own, not those of my employer, if you know who they are.

      Comment


      • #23
        Both the mailing list link and the blog link take you to the blog, lol.

        Comment


        • #24
          Originally posted by MWisBest View Post
          Both the mailing list link and the blog link take you to the blog, lol.
          The blog has a link at the bottom to the mailing list if you wanted to read the thread.
          All opinions are my own, not those of my employer, if you know who they are.

          Comment


          • #25
            Originally posted by liam View Post
            Can I assume you aren't familiar with the Wayland backend for RPI?
            The performance, from the videos is better than it was on X.
            Actually, you are right, I'm not familiar with the RPi backend, but it's not a problem of general performance (as I stated, in most cases *performance*, as in CPU and GPU use, is better with compositing) but a problem of *memory use*. This flavor, Lubuntu, is supposed to work well on *desktop* systems (where being able to open a large number of apps is usually desired) with as little as 128MB of memory, IIRC. Compositing will always require more memory than non-compositing, if both are done right.
            I'm not really sure how many apps the Raspberry Pi can have open at the same time, or how much memory it has, but with compositing you use more memory for graphics as you open more apps, and without compositing you don't. That's why, in Windows versions older than Vista and in X without compositing, you get white zones when an app hangs and you move another window across its surface: the system doesn't keep a copy of the window's contents and has to ask the app to render them again, but the app is locked and can't say what should be inside the window. With compositing, you have at least n+1 surfaces (which consume memory), where n is the number of apps you have open; without compositing, you have only one surface with the dimensions of your screen (in some cases the horizontal length is rounded up to the next power of two and the excess is simply ignored; I'm not sure why, but it's that way in some cases).
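            The "n+1 surfaces" accounting above can be sketched with some back-of-the-envelope arithmetic (a minimal illustration only; the 4-bytes-per-pixel figure and the window sizes are assumptions, and real allocators add padding):

```python
# Rough memory estimate for window-system surfaces, assuming 4 bytes
# per pixel (e.g. XRGB8888) and no row padding. Sizes are made up.

BYTES_PER_PIXEL = 4

def surface_bytes(width, height):
    """Memory held by one off-screen surface."""
    return width * height * BYTES_PER_PIXEL

def compositing_bytes(screen, windows):
    """One buffer per window plus the final screen buffer:
    the 'n + 1 surfaces' case."""
    return surface_bytes(*screen) + sum(surface_bytes(*w) for w in windows)

def non_compositing_bytes(screen, windows):
    """Without compositing, clients draw straight into the single
    screen-sized surface, so the window count doesn't matter."""
    return surface_bytes(*screen)

screen = (1024, 768)
windows = [(800, 600), (640, 480), (1024, 768)]
print(compositing_bytes(screen, windows) // 1024, "KiB with compositing")
print(non_compositing_bytes(screen, windows) // 1024, "KiB without")
```

            Every extra window adds its full buffer to the compositing total, which is exactly why memory grows with the number of open apps on a 128MB target.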

            Originally posted by Candide View Post
            There is a question I've been meaning to ask, and this is as good a time as any...

            Is it possible that applications could be written so that they run on both Mir and Wayland? I don't mean via XMir and XWayland, I mean could a native app for Mir work on Wayland, and vice-versa. Or are developers going to be forced to redo their apps for the specific window manager?

            Thanks in advance for replies.
            What Ericg said is valid (leaving aside what the toolkits said, since Canonical already said they'll port the toolkits if upstream doesn't, and Qt is needed by Unity IIRC, so they'll be porting that at the very least), except for window managers. Those need an explicit backend for every display system they support, because they *need* to talk to it directly. That's why on the KDE side there are problems with KWin, but nobody said anything about the other pieces of that software distribution.

            Originally posted by Thaodan View Post
            All Ubuntu derivatives should group together and make one distribution.
            They should ship an installer like SUSE's, where you can select your DE (which pulls in the setup of what is currently a separate distribution).
            The downside of that approach would be a bigger image (thus a longer download) for packages you are not likely to use (usually a given user is comfortable with a given desktop and just ignores the others). Aside from packaging everything in the same image (and the inability to leverage the *buntu names to gather new users, though this wouldn't affect the current user base in the least), I see nothing too problematic with your suggestion.

            Originally posted by bwat47 View Post
            Does anyone know if running LXDE or XFCE on top of Mir/XMir would avoid video tearing? (Both of these desktops tear when running on native X without any OpenGL compositing.) IMO that would be reason enough for Lubuntu and Xubuntu to consider XMir, if it did fix the video-tearing issue.
            Both LXDE and XFCE are unable to run on pure Mir, and running on top of XMir wouldn't avoid any of the problems they have on X. At most, if the implementation of this layer is more efficient than pure X (which is not likely, because nobody put emphasis on X*whatever* development beyond the ability to run X apps) and Mir is non-intrusive enough, you'd get slightly better performance. But that's all the benefit you *could* get, and none of it is likely. The best I'd expect is the same or slightly worse performance, with everything else behaving just the same as on X.org.

            EDIT: Anyway, without compositing, tearing is to be expected, because everything is redrawn on every frame. XMir has an independent X server inside. If that X server doesn't have compositing enabled, you will have exactly the same problem. If it's enabled, you won't face those problems, but you wouldn't face them if you enabled compositing on plain X either (Openbox doesn't support compositing, so the closest you can get is Fluxbox git, which has had compositing since 2011 or 2012).

            On the memory consumption problem: once compositor bypass gets implemented (on Mir), Lubuntu will have the memory consumption of two surfaces with the same dimensions as the screen (or corrected to the next power of two, as I said earlier), one for the root window and one for the full-screen X server, plus whatever the X server and the Mir server take by themselves; but if compositing in X is disabled, you won't see extra surface memory consumption. So, when that happens, I wouldn't expect memory use to go up more than a few MBs when running on XMir compared to running on pure X.org.
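            The "rounded up to the next power of two" case mentioned above makes each surface a bit more expensive than its nominal size. A quick sketch (the 4-bytes-per-pixel figure and the 1366x768 resolution are assumptions for illustration):

```python
# Some drivers allocate surface rows with a width rounded up to the
# next power of two; the excess columns are wasted. This sketches
# that overhead for a single screen-sized surface.

def next_pow2(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def exact_surface_kib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel // 1024

def padded_surface_kib(width, height, bytes_per_pixel=4):
    return next_pow2(width) * height * bytes_per_pixel // 1024

# A 1366x768 screen surface allocated with a 2048-pixel-wide stride:
print(exact_surface_kib(1366, 768), "KiB needed,",
      padded_surface_kib(1366, 768), "KiB allocated")
```

            At that resolution the padded allocation is roughly half again the nominal size, which is why the rounding matters on a memory-constrained target.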
            Last edited by mrugiero; 30 June 2013, 04:07 AM.

            Comment


            • #26
              Originally posted by mrugiero View Post
              They want to avoid the use of compositing; both native Mir and native Wayland require compositing, and both XMir and XWayland are useless as a desktop solution (useless in that they give you nothing that plain X.org doesn't). And this is, as pointed out, because they aim at low-resource computers. Compositing has better performance in a lot of use cases, but it also leads to higher memory use.
              Why exactly is it that Wayland *requires* composition? The way I understand it, a Wayland compositor simply receives the buffers from the client, then combines them into one buffer, and passes this buffer on to the system compositor / writes it directly to the screen via the video drivers. Usually this combining is done by composition, but is there something in the protocol that mandates it be done that way?

              Comment


              • #27
                Originally posted by dee. View Post
                Why exactly is it that Wayland *requires* composition? The way I understand it, a Wayland compositor simply receives the buffers from the client, then combines them into one buffer, and passes this buffer on to the system compositor / writes it directly to the screen via the video drivers. Usually this combining is done by composition, but is there something in the protocol that mandates it be done that way?
                If I understand it correctly, except for full-screen apps, Wayland mandates composition. It's required for the "every frame is perfect" policy. If you do otherwise, you will probably observe the same artifacts that were mentioned for X without compositing.
                It's important to note that compositing != special effects. This combination of buffers is actually the compositing, and it requires more memory because those buffers need to exist first in order to be combined. As the engineer with the j nickname said (I can't remember the full name because it's unpronounceable in my mother tongue), when using a full-screen app you just point to the buffer where the contents are being drawn, so there's no compositing involved. I meant the common desktop usage, where the use of windows is the norm (and it is actually more likely to be so on lower-end desktop computers, since most full-screen apps targeted at the desktop are games and videos, which would be harder to use on lower-end computers).

                As I already said, I don't know how the RPi backend is supposed to work. If it's like in most tablets and smartphones, with a single app taking full-screen all the time, then you can safely disable compositing for Wayland. But you can't for desktop, and this means a greater memory usage.
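                The "combine the buffers" step, and why keeping per-client buffers avoids the white-zone artifact, can be illustrated with a toy compositor (the names and structure here are made up for illustration; this is not the Wayland API):

```python
# Toy compositor: each client owns a pixel buffer, and the compositor
# copies them into the screen buffer every frame. Because a window's
# contents live in its own buffer, the screen can be redrawn even if
# the client hangs -- no blank "white zones".

def make_buffer(width, height, fill=0):
    """A width x height pixel buffer (rows of ints)."""
    return [[fill] * width for _ in range(height)]

def composite(screen, window, x, y):
    """Copy a window buffer into the screen buffer at (x, y)."""
    for row, line in enumerate(window):
        for col, pixel in enumerate(line):
            screen[y + row][x + col] = pixel

screen = make_buffer(8, 8)        # the final, screen-sized surface
win = make_buffer(4, 2, fill=1)   # one client's buffer (it must exist)
composite(screen, win, 2, 3)      # the per-frame combining step
```

                The per-client buffers are the memory cost being discussed: they have to exist before they can be combined, and there is one for every open window.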

                Comment


                • #28
                  Originally posted by mrugiero View Post
                  If I understand it correctly, except for full-screen apps, Wayland mandates composition. It's required for the "every frame is perfect" policy. If you do otherwise, you will probably observe the same artifacts that were mentioned for X without compositing.
                  It's important to note that compositing != special effects. This combination of buffers is actually the compositing, and it requires more memory because those buffers need to exist first in order to be combined. As the engineer with the j nickname said (I can't remember the full name because it's unpronounceable in my mother tongue), when using a full-screen app you just point to the buffer where the contents are being drawn, so there's no compositing involved. I meant the common desktop usage, where the use of windows is the norm (and it is actually more likely to be so on lower-end desktop computers, since most full-screen apps targeted at the desktop are games and videos, which would be harder to use on lower-end computers).

                  As I already said, I don't know how the RPi backend is supposed to work. If it's like in most tablets and smartphones, with a single app taking full-screen all the time, then you can safely disable compositing for Wayland. But you can't for desktop, and this means a greater memory usage.
                  Oh, so you mean the buffers themselves are what cause the greater memory usage. Gotcha.

                  Comment


                  • #29
                    Originally posted by dee. View Post
                    Oh, so you mean the buffers themselves are what cause the greater memory usage. Gotcha.
                    Of course. AFAIK, Wayland itself is a lot lighter than X.org.

                    Comment


                    • #30
                      Originally posted by Ibidem View Post
                      I knew Gtk and KDE did that, but I don't remember reading that Qt did, and my google-fu isn't good enough to find an article saying anything about upstream. Mind posting a link?
                      Qt's official stance is that since Qt 5.0 all platform adaptations are plugins anyway. In the past, Qt itself had to be modified to support other display servers, but the Lighthouse project added the plugin interface to Qt 4.x, and Qt 5.x now solely uses that plugin interface. Canonical wrote a preliminary plugin: https://launchpad.net/qmir

                      I'm not aware that Canonical has made any attempt to upstream that plugin to the Qt Project.
                      However, in any case, at least regarding pure Qt applications, there should be no problem running the same binary under a Wayland or a Mir system. Normal applications should not talk to the display server directly; usually only window managers do that.
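                      That plugin selection is visible at runtime: Qt 5 picks its platform plugin from the QT_QPA_PLATFORM environment variable (or the -platform flag), so the same binary can target different display servers without a recompile. A minimal sketch ("./myapp" is a placeholder binary, not a real app):

```python
# Qt 5 loads a QPA platform plugin chosen via QT_QPA_PLATFORM (or the
# -platform command-line flag); "xcb" and "wayland" are standard plugin
# names. "./myapp" below is a placeholder, not a real binary.
import os
import subprocess

def platform_env(platform):
    """Environment that makes a Qt 5 app load the given QPA plugin."""
    return dict(os.environ, QT_QPA_PLATFORM=platform)

# The same binary launched against different display servers:
# subprocess.run(["./myapp"], env=platform_env("xcb"))      # X11
# subprocess.run(["./myapp"], env=platform_env("wayland"))  # Wayland
```

                      A Mir plugin such as qmir would slot into the same mechanism, which is why pure Qt applications shouldn't need source changes.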

                      Comment
