
Canonical Starts Work On Mir Multi-Monitor Handling



    Phoronix: Canonical Starts Work On Mir Multi-Monitor Handling

    One of the feature limitations of the Mir display server up to this point has been multi-monitor support: when using multiple monitors (or, say, a laptop connected to a projector), the only display configuration possible has been a cloned mode in which all screens show the same image. Canonical's Mir developers have begun working on improved multi-monitor handling...


  • #2
    I hope to see this done well. I haven't had a positive multi-monitor experience on Linux with any of the DEs on any distro.



    • #3
      The part I was most worried about was whether they were going to have a surface per monitor or not, but it seems they are doing it the right way, and even making it tunable, which I hadn't expected.



      • #4
        Originally posted by AJenbo
        The part I was most worried about was whether they were going to have a surface per monitor or not, but it seems they are doing it the right way, and even making it tunable, which I hadn't expected.
        So explain this to me like I'm an 8-year-old. What does that mean? What's the right way vs the wrong way? I'm not challenging you, I'm just curious.



        • #5
          Originally posted by dh04000
          So explain this to me like I'm an 8-year-old. What does that mean? What's the right way vs the wrong way? I'm not challenging you, I'm just curious.
          With a single framebuffer you quickly hit hardware limitations because the combined size is larger than what the hardware can handle. For my setup of just 2 monitors it fails when using an HD5xxx with Catalyst, though it works with Radeon. But performance is very poor. I tried switching to a GT210 but got the same poor performance.

          From the Wayland mailing list:
          Originally posted by Alex Deucher
          The sensible way to handle this is one surface per crtc otherwise we run into the same problems we hit with X where a multi-head desktop is too wide for the render/texture limits of the hardware.
          You often also end up with a lot of wasted texture allocation if your monitors aren't the same size, orientation, or alignment.

          I also imagine there is a benefit to being able to send each framebuffer directly to its display, instead of having to slice it out of a larger framebuffer.

          You also don't have to reallocate framebuffers when reconfiguring the displays.

          Lastly, it should also simplify configuring the desktop background correctly and individually per display.

          The only downside I see is that print screen would have to stitch the framebuffers together into a single image.
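
          For the curious, here is a rough sketch of what "one surface per CRTC" can look like at the KMS level. This is a minimal hypothetical example using libdrm dumb buffers, not Mir's actual code; the naive connector-to-CRTC pairing and the missing error handling are simplifications. The point is that each connected output gets a framebuffer sized to its own mode and scans out from (0,0), instead of scanning out at an offset into one large shared buffer:

          /* sketch: one dumb-buffer framebuffer per output (libdrm/KMS) */
          /* build: gcc percrtc.c $(pkg-config --cflags --libs libdrm)  */
          #include <fcntl.h>
          #include <stdint.h>
          #include <stdio.h>
          #include <string.h>
          #include <sys/ioctl.h>
          #include <xf86drm.h>
          #include <xf86drmMode.h>

          int main(void)
          {
              int fd = open("/dev/dri/card0", O_RDWR);
              drmModeRes *res = drmModeGetResources(fd);

              for (int i = 0; i < res->count_connectors && i < res->count_crtcs; i++) {
                  drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
                  if (!conn || conn->connection != DRM_MODE_CONNECTED || !conn->count_modes) {
                      if (conn) drmModeFreeConnector(conn);
                      continue;
                  }
                  drmModeModeInfo *mode = &conn->modes[0]; /* preferred mode is listed first */

                  /* Allocate a buffer sized for THIS output only, never for the
                   * bounding box of the whole virtual desktop. */
                  struct drm_mode_create_dumb creq;
                  memset(&creq, 0, sizeof(creq));
                  creq.width  = mode->hdisplay;
                  creq.height = mode->vdisplay;
                  creq.bpp    = 32;
                  ioctl(fd, DRM_IOCTL_MODE_CREATE_DUMB, &creq);

                  uint32_t fb_id;
                  drmModeAddFB(fd, creq.width, creq.height, 24, 32,
                               creq.pitch, creq.handle, &fb_id);

                  /* Scan out from (0,0) of this framebuffer: with per-CRTC buffers
                   * there is no x/y offset into a shared spanning surface. */
                  drmModeSetCrtc(fd, res->crtcs[i], fb_id, 0, 0,
                                 &conn->connector_id, 1, mode);

                  printf("%ux%u buffer on connector %u\n",
                         creq.width, creq.height, conn->connector_id);
                  drmModeFreeConnector(conn);
              }
              drmModeFreeResources(res);
              return 0;
          }

          A spanning desktop, by contrast, would make one drmModeAddFB call covering the bounding box of every monitor and then point each CRTC into it at an offset; it is that single huge allocation that runs into the hardware limits mentioned above.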



          • #6
            Originally posted by AJenbo
            With a single framebuffer you quickly hit hardware limitations because the combined size is larger than what the hardware can handle. For my setup of just 2 monitors it fails when using an HD5xxx with Catalyst, though it works with Radeon. But performance is very poor. I tried switching to a GT210 but got the same poor performance.
            I can't imagine this is still a problem with somewhat modern hardware. Modern GPUs can handle color buffers and textures of 16K size; that is plenty even if you want to attach multiple 4K monitors. And if your hardware isn't completely ancient, it will handle at least 8K, which should still be enough for all common cases. Memory consumption can be an issue, but separate color buffers cannot fully solve that problem either: if you don't have a sufficient amount of VRAM (and bandwidth), setups with multiple monitors are always going to suffer. However, 2 GB of VRAM is quickly becoming the standard amount, so this shouldn't be a problem in practice.
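
            To put concrete numbers on those limits, a quick probe of your own GPU might look like the following hypothetical sketch (GLFW is used purely as a convenient way to create a GL context; it has nothing to do with Mir):

            /* sketch: query how large a single surface this GPU can handle */
            /* build: gcc maxsurface.c -lglfw -lGL                          */
            #include <GLFW/glfw3.h>
            #include <stdio.h>

            int main(void)
            {
                if (!glfwInit())
                    return 1;
                glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);      /* offscreen context only */
                GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
                if (!win) { glfwTerminate(); return 1; }
                glfwMakeContextCurrent(win);

                GLint max_tex = 0, max_vp[2] = {0, 0};
                glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_tex);  /* largest texture edge */
                glGetIntegerv(GL_MAX_VIEWPORT_DIMS, max_vp);   /* largest render target */

                printf("max texture: %d px, max viewport: %dx%d px\n",
                       max_tex, max_vp[0], max_vp[1]);

                glfwTerminate();
                return 0;
            }

            A GPU reporting 16384 here can hold a spanning buffer for three 4K monitors side by side (11520 px wide), while an older part reporting 4096 cannot render a single surface across even three 1080p displays (5760 px), which is exactly the failure mode described in the earlier posts.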



            • #7
              Originally posted by dh04000
              So explain this to me like I'm an 8-year-old. What does that mean? What's the right way vs the wrong way? I'm not challenging you, I'm just curious.
              Essentially, he's saying that each connected monitor/display gets its own surface, as opposed to one big-ass desktop spread out over multiple monitors. This is the right way to do it because each display can then run at an independent resolution that fits it properly. I'm sure there are a ton of under-the-hood benefits as well.



              • #8
                Multi-monitor setups are, IMO, only good if you have a low-wattage GPU with non-gaming purposes in mind, or if you have a rig strictly used for gaming and nothing else. I have 2 HD5750s in CrossFire and I was amazed how much heat the primary GPU generated when not doing anything graphically intensive. This particular GPU has 7 different clock settings in its BIOS. I tried underclocking the settings that seemed to be linked specifically to multi-monitor use, and surprisingly, just a 100 MHz difference in the memory clock can make the 2nd display unstable. Windows didn't seem to care as much about re-clocking. I'm guessing if I want to try multi-monitor again, I'm going to have to either ditch KDE for something non-composited, or hope that Mir or Wayland will allow me to re-clock my GPU. Considering the relationship between KDE and Mir, that reduces my options.



                • #9
                  I'm wondering about something that isn't really important, but it makes me curious.
                  When dragging an app over the edge of one monitor toward another, will it "jump" to the other screen, or will it display partially on both screens? This seems 100% trivial, but I can see how on a slower card this could lead to a visible lack of sync (one screen will probably update before the other, so for a fraction of a second one shows the previous state of the app while the other is already up to date). Just jumping would avoid this kind of artifact.

                  EDIT: I'm asking more in general, not specifically about Mir's case.



                  • #10
                    Originally posted by dh04000
                    I hope to see this done well. I haven't had a positive multi-monitor experience on Linux with any of the DEs on any distro.
                    That's X's fault. Granted, libkscreen aims to fix that. Give KDE 4.11 a try when it comes out, or the latest release of Fedora KDE (they shipped it ahead of time).
                    All opinions are my own, not those of my employer, if you know who they are.

