Ubuntu's Mir Finally Supports Drag & Drop


  • linuxgeex
    replied
    Originally posted by Delgarde View Post

    And yet "speed to market" was one of the reasons Canonical cited for deciding to do it their own way... apparently under the misapprehension that things would go quicker if they started from scratch, and tried to build everything themselves with an under-resourced team and a shortage of relevant skills. Funny how these things play out, isn't it?
    I don't remember speed to market being their agenda. I remember it being pure NIH: they wanted something that worked the way they wanted it to, rather than joining the project that worked closest to how they wanted it to work and then working to improve that.

    I think they sorely underestimated the complexity of developing a display server that must run on a broad range of rapidly evolving display/sound/input hardware, even with the kernel being responsible for the low-level drivers. Then their own compositor, input event handling (this is way trickier than it sounds), window manager... an entire vertical stack with multiple teams working with interdependencies. They were, frankly, crazy to take it on. There's a good reason X11 has been around so long, chief among them being that it's almost as hard to replace as the Linux VT, lol. They would have done very well to spend at least a year contributing at all levels of Wayland/Weston/libinput before they jumped into that abyss. Perhaps something less challenging, like writing a replacement for Linux itself, lol.
    Last edited by linuxgeex; 15 April 2017, 06:12 AM.

    Leave a comment:


  • Mateus Felipe
    replied
    Originally posted by leipero View Post

    For me, those are not rational reasons. The wheel is also old, very old; that doesn't make it obsolete, though. X is still functional; can't say the same for other protocols.
    I know "old" doesn't mean obsolete. That's why I said old AND obsolete. Anyway, as I said, stability isn't quality. X is stable, broadly used, and functional. But not good.

    Leave a comment:


  • linuxgeex
    replied
    Originally posted by Delgarde View Post

    And yet "speed to market" was one of the reasons Canonical cited for deciding to do it their own way... apparently under the misapprehension that things would go quicker if they started from scratch, and tried to build everything themselves with an under-resourced team and a shortage of relevant skills. Funny how these things play out, isn't it?
    Yes, exactly. They looked at a project that looked like it was going nowhere slowly (5 years in development and still not dogfoodable) and figured they couldn't do worse. But they did, in fact, do worse... because if they had forked Wayland and taken it in their own direction to speed it up (like Google did for ChromeOS), they would have been much further ahead today... and the market would perhaps be less fragmented.

    Leave a comment:


  • TheBlackCat
    replied
    Originally posted by leipero View Post
    Ok, that makes sense. But still, it introduces the whole range of problems I've already mentioned for DE development.
    I understand, but my point is that DE developers already have those problems under X11. Essentially everything you have to do under Wayland you also have to do under X11. The difference is that Wayland requires you to do a lot less, and the stuff it does require you to do is generally done much more cleanly and sanely.

    Originally posted by leipero View Post
    For example, I want to have v-sync off on every PC I use; the default option is on, so instead of a global change, I have to rely on the DE's implementation...
    Vsync is irrelevant on Wayland; it is only an issue because of the limitations of X11. Under X11, applications have no way to synchronize their timing with the timing of the display on a frame-by-frame basis. On Wayland they do, so vsync isn't needed. This sort of thing is only possible because the compositor is also in charge of managing the displays. So Wayland helps you in this case.

    But overall, yes, if your DE doesn't provide the features you need, you will have problems. Still, you should come out ahead under Wayland, because it eliminates the need for many of the workarounds X11 requires.
    Last edited by TheBlackCat; 23 March 2017, 02:26 PM.
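    To make the frame-pacing point concrete, here is a toy Python model (my own sketch, not compositor code; the 92 FPS / 120 Hz numbers come from the example discussed in this thread). With a double-buffered blocking swap, a renderer that needs 1/92 s per frame gets pinned to 60 FPS on a 120 Hz display, while a mailbox/frame-callback scheme still delivers 92 distinct frames per second:

```python
# Toy model of frame pacing: a renderer producing frames every 1/92 s
# on a 120 Hz display. Pure arithmetic, just to illustrate why blocking
# vsync stutters while a mailbox/frame-callback scheme does not.

REFRESH_HZ = 120
RENDER_FPS = 92
VBLANK = 1.0 / REFRESH_HZ          # seconds between refreshes
RENDER_TIME = 1.0 / RENDER_FPS     # seconds to render one frame

def blocking_vsync_presents(n_frames):
    """Double-buffered blocking swap: after rendering each frame, the
    app blocks until the next vblank before it can start the next one."""
    t, presents = 0.0, []
    for _ in range(n_frames):
        t += RENDER_TIME                     # render the frame
        t = (int(t / VBLANK) + 1) * VBLANK   # block until the next vblank
        presents.append(t)
    return presents

def mailbox_new_frame_vblanks(n_vblanks):
    """Mailbox-style: the renderer runs freely; at each vblank the
    compositor scans out the newest completed frame. Returns how many
    vblanks showed a frame the viewer had not seen before."""
    new, shown = 0, 0
    for k in range(1, n_vblanks + 1):
        done = (k * RENDER_FPS) // REFRESH_HZ  # frames finished by vblank k
        if done > shown:
            new, shown = new + 1, done
    return new

if __name__ == "__main__":
    p = blocking_vsync_presents(10)
    intervals = [p[i + 1] - p[i] for i in range(len(p) - 1)]
    # Every frame takes two vblanks, i.e. the renderer is pinned to 60 FPS.
    print("blocking vsync frame interval (s):", intervals[0])
    print("mailbox new frames per second:", mailbox_new_frame_vblanks(REFRESH_HZ))
```

    The model ignores triple buffering and input latency; it only shows the headline effect: blocking vsync quantizes frame times to whole refresh intervals, which is exactly the visible stutter described above.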

    Leave a comment:


  • leipero
    replied
    Originally posted by TheBlackCat View Post
    DEs already have to be aware of it. Most distros ship without an xorg.conf by default, handling everything at runtime, so a DE already needs to be able to deal with that situation under X11. The advantage is that DEs don't have to worry about multiple conflicting settings that they somehow have to decide between. Add another configuration tool like NVidia's settings and you have a whole mess of conflicts to wade through.

    Wayland solves this by taking screen management out of the display server completely. Instead of the compositor and X11 each maintaining their own duplicate view of the displays, which must be kept synchronized with each other (inherently unreliable in an asynchronous framework like X11), with any number of random configuration tools potentially giving conflicting orders, the compositor is solely responsible for laying out the screens, and the display server is only responsible for actually putting the resulting images on the display. There are now kernel interfaces that let the compositor set up the screens directly, which isn't possible under X11 because the X11 and compositor views of the screens have to be kept in agreement.

    So Wayland isn't doing anything new; the entire architecture includes stuff already necessary under X11 for modern DEs. But it removes some legacy pieces that were a source of wasted resources and conflicts.
    Ok, that makes sense. But still, it introduces the whole range of problems I've already mentioned for DE development. Of course, it is not the fault of the X11/Wayland/Mir developers that manufacturers don't follow basic specifications and ship wrong EDIDs, but even if everyone followed them 100%, you still have the problem of poor implementation of user configuration. For example, I want to have v-sync off on every PC I use; the default option is on, so instead of a global change, I have to rely on the DE's implementation. You may wonder why anyone would want such a thing, but there are people who do, and using v-sync at 100Hz+ (or even 85Hz) is unreasonable; who would want that? In those cases you want the FPS to be at the level of the refresh rate, but when it dips (and it will, especially on 120Hz+ displays) you do not want to see performance degradation. So, for example, if it dips down to 92 FPS, you wouldn't even notice it with v-sync off; with the option enabled, even with triple buffering, you will see visible stutter. And as we move on, there will be fewer and fewer 60Hz displays in the future... and tons of annoyed users.

    I haven't actually tried Wayland in a while, but last time I did, there was no way to load a custom EDID (it was some bug, idk) or to disable vblank globally. I might as well try it later, since I already have it installed and have the option under GDM.
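    On the custom-EDID point specifically: the kernel itself can override a monitor's EDID, below any display server, which makes the override global for X11 and Wayland alike. A sketch (file name and connector name are examples, not from this thread):

```
# Kernel command line (e.g. GRUB_CMDLINE_LINUX in /etc/default/grub),
# with the EDID blob placed at /lib/firmware/edid/custom.bin:
drm.edid_firmware=edid/custom.bin             # all connectors (kernel 4.15+)
drm.edid_firmware=HDMI-A-1:edid/custom.bin    # or one connector only
# Older kernels use drm_kms_helper.edid_firmware= instead.
```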

    Leave a comment:


  • TheBlackCat
    replied
    Originally posted by Sonadow View Post
    The downside is that this breaks desktop recording software that relies on a 'universal' way to grab and record screen output, like SSR and OBS. How can a protocol alone let such applications know what or how they can record display output? Is this all going to be compositor-specific now? Because that is just ridiculous IMO.
    They seem to be working on a secure screenshot API for Wayland, although the specifics apparently have not been finalized yet. The idea would be for users to have to explicitly grant permission for an application to take screenshots, and then the compositor would collect the appropriate image data and send it to the application in a consistent way.
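    For what it's worth, the design that eventually emerged along these lines is the xdg-desktop-portal route: the application asks over D-Bus, the portal/compositor prompts the user for permission, and the result comes back via a Request object. Sketched from the portal's introspection data (treat the details as illustrative):

```
<!-- org.freedesktop.portal.Screenshot (xdg-desktop-portal, D-Bus) -->
<interface name="org.freedesktop.portal.Screenshot">
  <method name="Screenshot">
    <arg type="s" name="parent_window" direction="in"/>
    <arg type="a{sv}" name="options"   direction="in"/>
    <arg type="o" name="handle"        direction="out"/>
  </method>
</interface>
```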

    Leave a comment:


  • TheBlackCat
    replied
    Originally posted by leipero View Post
    I disagree. The display server SHOULD handle such configuration, because it is more reasonable: it doesn't require the window manager (or DE) to be aware of it, which makes for less complicated code, and any potential change in such configuration would not affect a DE/WM that might be unmaintained at some point.
    DEs already have to be aware of it. Most distros ship without an xorg.conf by default, handling everything at runtime, so a DE already needs to be able to deal with that situation under X11. The advantage is that DEs don't have to worry about multiple conflicting settings that they somehow have to decide between. Add another configuration tool like NVidia's settings and you have a whole mess of conflicts to wade through.

    Wayland solves this by taking screen management out of the display server completely. Instead of the compositor and X11 each maintaining their own duplicate view of the displays, which must be kept synchronized with each other (inherently unreliable in an asynchronous framework like X11), with any number of random configuration tools potentially giving conflicting orders, the compositor is solely responsible for laying out the screens, and the display server is only responsible for actually putting the resulting images on the display. There are now kernel interfaces that let the compositor set up the screens directly, which isn't possible under X11 because the X11 and compositor views of the screens have to be kept in agreement.

    So Wayland isn't doing anything new; the entire architecture includes stuff already necessary under X11 for modern DEs. But it removes some legacy pieces that were a source of wasted resources and conflicts.
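    To illustrate the "compositor is solely responsible for laying out the screens" point, here is what the single-source-of-truth model looks like in practice, using sway (a wlroots-based Wayland compositor) purely as an example; the output names and modes are hypothetical:

```
# X11: the same outputs can be configured from several places,
# which may disagree:
#   /etc/X11/xorg.conf           static Monitor/Screen sections
#   xrandr --output HDMI-1 ...   per-session, at runtime
#   nvidia-settings              vendor tool with its own metamodes
#
# Wayland (sway config shown): the compositor's own config is the
# single source of truth for output layout:
output HDMI-A-1 mode 1920x1080@60Hz pos 0 0
output DP-1     mode 2560x1440@60Hz pos 1920 0
```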

    Leave a comment:


  • Sonadow
    replied
    Originally posted by TheBlackCat View Post
    No, it isn't universal, nor is it meant to be. The display server shouldn't be handling configuration like that; that is the sort of thing the desktop environment should be (and is) in charge of. Dealing with all the different ways X11 could be configured, and the conflicts between them, is a major problem.

    My point was that your statement about Wayland requiring you to "build your own EDID binary" is wrong. Wayland provides the APIs necessary to change the display settings in an even better and more reliable way than X11, and it looks like Weston, KDE Plasma, and GNOME now all provide interfaces to configure them.

    What about the rest of what I wrote? You responded to 4 words out of the several paragraphs I wrote.
    The downside is that this breaks desktop recording software that relies on a 'universal' way to grab and record screen output, like SSR and OBS. How can a protocol alone let such applications know what or how they can record display output? Is this all going to be compositor-specific now? Because that is just ridiculous IMO.

    And as I have said before, the creator of OBS has more than once publicly stated that he will not add DE- or compositor-specific stuff into OBS. So we have to live without high-quality desktop recording under Wayland?

    Leave a comment:


  • leipero
    replied
    Originally posted by TheBlackCat View Post
    No, it isn't universal, nor is it meant to be. The display server shouldn't be handling configuration like that; that is the sort of thing the desktop environment should be (and is) in charge of. Dealing with all the different ways X11 could be configured, and the conflicts between them, is a major problem.

    My point was that your statement about Wayland requiring you to "build your own EDID binary" is wrong. Wayland provides the APIs necessary to change the display settings in an even better and more reliable way than X11, and it looks like Weston, KDE Plasma, and GNOME now all provide interfaces to configure them.

    What about the rest of what I wrote? You responded to 4 words out of the several paragraphs I wrote.
    I disagree. The display server SHOULD handle such configuration, because it is more reasonable: it doesn't require the window manager (or DE) to be aware of it, which makes for less complicated code, and any potential change in such configuration would not affect a DE/WM that might be unmaintained at some point. So no, that's a terrible idea, and whoever came up with such a viewpoint doesn't look far enough into the future. But you are entitled to your own opinion.

    Also, display managers (login screens such as GDM etc.) would rely on that information, so in essence, the display server not handling those configurations complicates things for everyone and benefits absolutely nothing.

    I acknowledged your reasoning, and I liked your post to point that out for you (obviously you did not see it). I can agree or disagree with your points, but since I wanted rational reasons, and you gave one, I am satisfied with the answer; my personal viewpoint (agreement/disagreement) is irrelevant.
    Last edited by leipero; 22 March 2017, 06:42 PM.

    Leave a comment:


  • TheBlackCat
    replied
    Originally posted by leipero View Post
    That doesn't work; it is not a universal setting like X modelines. Run GNOME 3 on Wayland and try to use a non-EDID mode.
    No, it isn't universal, nor is it meant to be. The display server shouldn't be handling configuration like that; that is the sort of thing the desktop environment should be (and is) in charge of. Dealing with all the different ways X11 could be configured, and the conflicts between them, is a major problem.

    My point was that your statement about Wayland requiring you to "build your own EDID binary" is wrong. Wayland provides the APIs necessary to change the display settings in an even better and more reliable way than X11, and it looks like Weston, KDE Plasma, and GNOME now all provide interfaces to configure them.

    What about the rest of what I wrote? You responded to 4 words out of the several paragraphs I wrote.

    Leave a comment:
