Does anyone know if running LXDE or XFCE on top of Mir/XMir would avoid video tearing (both these desktops tear when running on native X without any OpenGL compositing)? IMO that would be reason enough for Lubuntu and Xubuntu to consider XMir if it did fix the video tearing issue.
I really need to learn to stop posting late at night... Correction bolded.
Originally Posted by Ericg
Both the mailing list link and the blog link take you to the blog, lol.
The blog has a link at the bottom to the mailing list if you wanted to read the thread.
Originally Posted by MWisBest
Actually, you are right, I'm not familiar with the RPi backend, but it's not a problem of general performance (as I stated, in most cases *performance*, as in CPU and GPU use, is better when using compositing) but a problem with *memory use*. This flavor, Lubuntu, is supposed to work well on *desktop* systems (where being able to open a large number of apps is usually desired) with memory as low as 128MB, IIRC. Compositing will always require more memory than non-compositing, if both are done right.
Originally Posted by liam
I'm not really sure how many apps the Raspberry Pi can have open at the same time, or how much memory it has, but with compositing you will use more memory for graphics as you open more apps, and without compositing you won't. That's why, in Windows versions older than Vista and in X without compositing, you get white zones when an app hangs and you move another window across its surface: the server doesn't keep a copy of the window's contents and has to ask the app to render them again, but the app is locked and can't tell what needs to be inside the window. With compositing, you will have at least n+1 surfaces (which consume memory), where n is the number of apps you have open; without compositing, you will have only one surface with the dimensions of your screen (in some cases, the horizontal length is rounded up to the next power of two and the excess is just ignored; I'm not sure why, but it's that way in some cases).
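To put rough numbers on the n+1 surfaces argument, here is a back-of-the-envelope sketch in Python. The window sizes, 32-bit pixel format, and the power-of-two width rounding toggle are my own illustrative assumptions, not measurements from any real server:

```python
# Rough estimate of graphics memory for window surfaces,
# with vs. without compositing. All numbers are illustrative.

BYTES_PER_PIXEL = 4  # assuming 32-bit RGBA surfaces


def next_pow2(x):
    """Smallest power of two >= x."""
    p = 1
    while p < x:
        p *= 2
    return p


def composited_bytes(windows, screen_w, screen_h):
    """Compositing: one surface per window plus the final screen
    surface -- the n+1 surfaces mentioned above."""
    total = screen_w * screen_h * BYTES_PER_PIXEL  # final framebuffer
    for w, h in windows:
        total += w * h * BYTES_PER_PIXEL  # each window's own buffer
    return total


def non_composited_bytes(screen_w, screen_h, round_width=False):
    """No compositing: a single screen-sized surface; optionally round
    the width up to a power of two, as some drivers reportedly do."""
    w = next_pow2(screen_w) if round_width else screen_w
    return w * screen_h * BYTES_PER_PIXEL


windows = [(1024, 768), (800, 600), (640, 480)]  # three open apps
print(composited_bytes(windows, 1366, 768) // (1024 * 1024))      # 10 MiB
print(non_composited_bytes(1366, 768) // (1024 * 1024))           # 4 MiB
```

With just three medium-sized windows the composited case already needs more than twice the memory of the single screen surface, which is exactly the concern for a 128MB target machine.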
What Ericg said is valid (leaving aside the part about the toolkits, since Canonical already said they'll port the toolkits if upstream doesn't, and Qt is needed by Unity IIRC, so they'll be porting that at the very least), except for window managers. They need to explicitly have a backend for every display system they want to support, because they *need* to talk directly to it. That's why on the KDE side there are problems with KWin, but nobody said anything about the other pieces of that software distribution.
Originally Posted by Candide
The downside of that approach would be a bigger image (and thus a longer download) for packages you are not likely to use (usually, a given user is comfortable with a given desktop and just ignores the others). Aside from packaging everything in the same image (and the inability to leverage the *buntu name to gather new users, though this wouldn't affect the current user base in the least), I see nothing too problematic about your suggestion.
Originally Posted by Thaodan
Both LXDE and XFCE are unable to run on pure Mir, and running on top of XMir wouldn't avoid any of the problems they have in X. At most, if the implementation of this layer is more efficient than pure X (which is not likely, because nobody put emphasis on XMir development beyond providing the ability to run X apps) and Mir is non-intrusive enough, you'd have slightly better performance. But that's all the benefit you *could* have, and none of it is likely. The best I'd expect is the same or slightly worse performance, with everything else behaving just the same as on X.org.
Originally Posted by bwat47
EDIT: Anyway, without compositing, tearing is to be expected because everything is redrawn every frame. XMir has an independent X server inside. If this X server doesn't have compositing enabled, you will have exactly the same problem. If it is enabled, you won't face that problem, but then you wouldn't face it on plain X with compositing enabled either (Openbox doesn't support compositing, so the closest you can get is using Fluxbox git, which has had compositing since 2011 or 2012).
On the memory consumption problem: when compositing bypassing gets implemented (on Mir), Lubuntu will have the memory consumption of two surfaces with the same dimensions as the screen (or rounded up to the next power of two, as I said earlier), one for the root window and one for the full-screen X server, plus whatever the X server and the Mir server take by themselves. But if compositing in X is disabled, you will not see extra surface memory consumption. So, when that happens, I wouldn't expect memory to go up more than a few MBs when running on XMir compared to running on pure X.org.
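The overhead of those two extra screen-sized surfaces is easy to estimate. A quick sketch in Python, assuming a 1366x768 screen, 32-bit pixels, and the power-of-two width rounding described earlier (all assumptions of mine, not measured figures):

```python
# Estimated extra memory for XMir-on-Mir with bypass:
# two screen-sized surfaces (root window + full-screen X server).

SCREEN_W, SCREEN_H = 1366, 768  # assumed screen size
BYTES_PER_PIXEL = 4             # assuming 32-bit RGBA


def next_pow2(x):
    """Smallest power of two >= x."""
    p = 1
    while p < x:
        p *= 2
    return p


rounded_w = next_pow2(SCREEN_W)                      # 1366 -> 2048
one_surface = rounded_w * SCREEN_H * BYTES_PER_PIXEL
overhead = 2 * one_surface                           # root + X server

print(overhead / (1024 * 1024))  # 12.0 MiB
```

Even with the pessimistic power-of-two rounding, the fixed overhead stays in the low-MiB range regardless of how many apps are open, which is why it scales better than per-window compositing on a low-memory desktop.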
Last edited by mrugiero; 06-30-2013 at 04:07 AM.
Why exactly is it that Wayland *requires* composition? The way I understand it, a Wayland compositor simply receives the buffers from the clients, combines them into one buffer, and passes this buffer on to the system compositor / writes it directly to the screen via the video drivers. Usually this combining is done by composition, but is there something in the protocol that mandates it be done that way?
Originally Posted by mrugiero
If I get it correctly, except for full-screen apps, Wayland mandates composition. It's required for the 'every frame is perfect' policy. If you do otherwise, you will probably observe the same artifacts that were mentioned for X without compositing.
Originally Posted by dee.
It's important to note that compositing != special effects. This combination of buffers is actually the compositing, and it requires more memory because those buffers need to exist in the first place in order to be combined. As the engineer with the j nickname said (I can't remember the full name because it's unpronounceable in my mother tongue), when using a full-screen app you just point to the buffer where the contents are being drawn, so there's no compositing involved. I meant the common desktop usage, where windows are the norm (and actually more likely to be so on lower-end desktop computers, since most full-screen apps targeted at the desktop are games and videos, which would be harder to run on lower-end computers).
As I already said, I don't know how the RPi backend is supposed to work. If it's like in most tablets and smartphones, with a single app taking full-screen all the time, then you can safely disable compositing for Wayland. But you can't for desktop, and this means a greater memory usage.
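The distinction above, combining per-window buffers versus just pointing at a full-screen client's buffer, can be shown with a toy sketch. This is purely illustrative Python (the function names and buffer layout are my own, not any real compositor's API):

```python
# Toy model: compositing copies every window's buffer into the final
# screen buffer, so all n window buffers plus the screen buffer must
# exist. Full-screen bypass keeps no extra copy at all.

def composite(window_buffers, screen):
    """Blit each window buffer into the screen buffer at its position.
    window_buffers: list of (buffer, (x, y, w, h)) tuples, where a
    buffer is a list of pixel rows."""
    for buf, (x, y, w, h) in window_buffers:
        for row in range(h):
            for col in range(w):
                screen[y + row][x + col] = buf[row][col]
    return screen


def fullscreen_bypass(client_buffer):
    """No combining step: scanout just uses the client's own buffer."""
    return client_buffer


# A 4x4 'screen' and one 2x2 'window' at position (1, 1):
screen = [[0] * 4 for _ in range(4)]
window = [[7, 7], [7, 7]]
composite([(window, (1, 1, 2, 2))], screen)
```

The point of the sketch is only where the pixels live: in the composited path the window's buffer and the screen buffer both exist at once, while in the bypass path there is a single buffer, which is exactly the memory difference discussed above.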
Oh, so you mean the buffers themselves are what cause the greater memory usage. Gotcha.
Originally Posted by mrugiero
Of course. AFAIK, Wayland itself is a lot lighter than X.org.
Originally Posted by dee.
Qt's official stance is that since Qt 5.0 all platform adaptations are plugins anyway. In the past, Qt itself had to be modified to support other display servers, but the Lighthouse project added the plugin interface to Qt 4.x, and Qt 5.x now solely uses that plugin interface. Canonical wrote a preliminary plugin: https://launchpad.net/qmir
Originally Posted by Ibidem
I'm not aware that Canonical made any attempt to upstream that plugin to Qt Project.
However, in any case, at least for pure Qt applications there should be no problem running the same binary under a Wayland or a Mir system. Normal applications should not talk to the display server directly; usually only window managers do that.