
Thread: Canonical Starts Work On Mir Multi-Monitor Handling

  1. #1
    Join Date
    Jan 2007
    Posts
    15,383

    Default Canonical Starts Work On Mir Multi-Monitor Handling

    Phoronix: Canonical Starts Work On Mir Multi-Monitor Handling

    One of the feature limitations of using the Mir Display Server up to this point has been that when using multiple monitors (or, say, a laptop connected to a projector), the only display configuration possible is a cloned mode whereby all screens show the same content. Canonical's Mir developers have begun working on improved multi-monitor handling...

    http://www.phoronix.com/vr.php?view=MTQyNTI

  2. #2
    Join Date
    Aug 2012
    Posts
    509

    Default

    I hope to see this done well. I haven't had a positive multi-monitor experience in Linux on any of the DEs on any distro.

  3. #3
    Join Date
    Sep 2011
    Posts
    707

    Default

    The part I was most worried about was whether they were going to have a surface per monitor or not, but it seems they are doing it the right way, and even making it tunable, which I hadn't expected.

  4. #4
    Join Date
    Aug 2012
    Posts
    509

    Default

    Quote Originally Posted by AJenbo View Post
    The part I was most worried about was whether they were going to have a surface per monitor or not, but it seems they are doing it the right way, and even making it tunable, which I hadn't expected.
    So explain this to me like I'm an 8-year-old. What does that mean? What's the right way vs. the wrong way? I'm not challenging you, I'm just curious.
    Last edited by dh04000; 07-31-2013 at 10:33 AM.

  5. #5
    Join Date
    Sep 2011
    Posts
    707

    Default

    Quote Originally Posted by dh04000 View Post
    So explain this to me like I'm an 8-year-old. What does that mean? What's the right way vs. the wrong way? I'm not challenging you, I'm just curious.
    With a single framebuffer you quickly hit hardware limitations because the combined size is larger than what the hardware can handle. For my setup of just two monitors it fails when using an HD5xxx with Catalyst, though it works with the Radeon driver, but performance is very poor. I tried switching to a GT210 but got the same poor performance.

    From the Wayland mailing list:
    Quote Originally Posted by Alex Deucher
    The sensible way to handle this is one surface per CRTC, otherwise we run into the same problems we hit with X where a multi-head desktop is too wide for the render/texture limits of the hardware.
    You also often end up with a lot of wasted texture allocation if your monitors aren't the same size, orientation, or alignment.

    I also imagine there is a benefit to being able to send the framebuffers directly to the displays instead of having to slice them out of a larger framebuffer.

    You also don't have to reallocate the frame buffer when reconfiguring the displays.

    Lastly it should also simplify configuring the desktop background correctly and individually per display.

    The only downside I see is that print screen would have to stitch the framebuffers together into a single image.
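
    To make the trade-off concrete, here is a toy sketch (not Mir code; the monitor layout and the 4096-pixel texture limit are just numbers I picked for the example) comparing one buffer spanning all monitors against one buffer per output:

    Code:
    // Toy sketch, not Mir code: one spanning buffer vs. one buffer per output.
    // The layout and the 4096-pixel texture limit are made-up example numbers.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    struct Output { int x, y, width, height; };  // position and mode of one monitor

    int main() {
        const int max_texture_size = 4096;       // assumed limit of an older GPU
        const std::vector<Output> outputs = {
            {0, 0, 2560, 1440},                  // left monitor
            {2560, 0, 1920, 1200},               // right monitor, different mode
        };

        // Option A: a single buffer spanning the bounding box of every output.
        int span_w = 0, span_h = 0, used = 0;
        for (const Output& o : outputs) {
            span_w = std::max(span_w, o.x + o.width);
            span_h = std::max(span_h, o.y + o.height);
            used  += o.width * o.height;
        }
        std::printf("spanning buffer: %dx%d -> %s, %d pixels never shown\n",
                    span_w, span_h,
                    (span_w > max_texture_size || span_h > max_texture_size)
                        ? "exceeds the texture limit" : "fits",
                    span_w * span_h - used);

        // Option B: one buffer per output.  Each stays inside the limit and can
        // be handed to its display directly, with nothing to slice or waste.
        for (const Output& o : outputs)
            std::printf("per-output buffer: %dx%d -> fits\n", o.width, o.height);
    }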

  6. #6
    Join Date
    Jan 2010
    Posts
    367

    Default

    Quote Originally Posted by AJenbo View Post
    With a single framebuffer you quickly hit hardware limitations because the combined size is larger than what the hardware can handle. For my setup of just two monitors it fails when using an HD5xxx with Catalyst, though it works with the Radeon driver, but performance is very poor. I tried switching to a GT210 but got the same poor performance.
    I can't imagine this is still a problem with somewhat modern hardware. Modern GPUs can handle color buffers and textures of 16K size, which is plenty even if you want to attach multiple 4K monitors. And if your hardware isn't completely ancient, it will be able to handle at least 8K, which should still be enough for all common cases. Memory consumption can be an issue, but separate color buffers cannot fully solve that problem either; if you don't have sufficient VRAM (and bandwidth), setups with multiple monitors are always going to suffer. However, 2 GB of VRAM is quickly becoming the standard amount, so this shouldn't be a problem in practice.
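
    If you want to see what your own GPU reports, the limit can be queried once a GL context is current (context creation via EGL/GLX is left out here for brevity, and the "two 4K monitors" arithmetic is just my example):

    Code:
    // Prints the largest texture dimension the driver supports.  Assumes a GL
    // context is already current; creating one (EGL/GLX) is omitted for brevity.
    #include <GL/gl.h>
    #include <cstdio>

    void print_texture_limit() {
        GLint max_size = 0;
        glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_size);  // e.g. 16384 on recent GPUs
        std::printf("largest texture dimension: %d\n", max_size);

        // Two 4K monitors side by side need a 7680-pixel-wide spanning buffer.
        std::printf("two side-by-side 4K monitors fit in one buffer: %s\n",
                    max_size >= 2 * 3840 ? "yes" : "no");
    }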

  7. #7
    Join Date
    Oct 2010
    Posts
    94

    Default

    Quote Originally Posted by dh04000 View Post
    So explain this to me like I'm an 8-year-old. What does that mean? What's the right way vs. the wrong way? I'm not challenging you, I'm just curious.
    Essentially, he's saying that each connected monitor/display has its own screen as opposed to having one big-ass desktop spread out over multiple monitors. This is the right way to do it because then you can have independent resolutions that better fit each display. I'm sure there are a ton of under-the-hood benefits as well.

  8. #8
    Join Date
    Dec 2010
    Location
    MA, USA
    Posts
    1,440

    Default

    Multi-monitor setups are, IMO, only good if you have a low-wattage GPU with non-gaming purposes in mind, or if you have a rig strictly used for gaming and nothing else. I have 2 HD5750s in CrossFire and I was amazed at how much heat the primary GPU generated when not doing anything graphically intensive. This particular GPU, anyway, has 7 different clock settings in its BIOS. I tried underclocking the settings that seemed to be linked specifically to multi-monitor use, and surprisingly, just a 100 MHz difference in the memory clock can make the 2nd display unstable. Windows didn't seem to care as much about re-clocking. I'm guessing if I want to try multi-monitor again, I'm going to have to either ditch KDE for something not composited, or hope that Mir or Wayland will allow me to reclock my GPU. Considering the relationship between KDE and Mir, that reduces my options.
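
    For what it's worth, the radeon KMS driver of that era exposed a crude reclocking knob through sysfs (power_method / power_profile). The card0 path below is an assumption that varies per system, it needs root, and newer kernels replaced this interface with dpm, so take it as a sketch rather than a recipe:

    Code:
    // Sketch of the old radeon KMS "profile" reclocking interface.  The card0
    // path is an assumption, this needs root, and newer kernels use dpm instead.
    #include <fstream>
    #include <iostream>
    #include <string>

    static bool write_sysfs(const std::string& path, const std::string& value) {
        std::ofstream f(path);
        f << value << '\n';                       // no-op if the file failed to open
        return static_cast<bool>(f);
    }

    int main() {
        const std::string dev = "/sys/class/drm/card0/device/";
        const bool ok = write_sysfs(dev + "power_method", "profile") &&
                        write_sysfs(dev + "power_profile", "low");   // or mid/high/auto
        std::cout << (ok ? "switched to the low profile\n"
                         : "failed (not root, or no radeon KMS?)\n");
        return ok ? 0 : 1;
    }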

  9. #9
    Join Date
    Jan 2011
    Posts
    1,287

    Default

    I'm wondering about something that isn't really important, but it makes me curious.
    When dragging an app from the edge of one monitor to another, will it "jump" to the other screen, or will it display partially on both screens? This seems 100% trivial, but I can see how on a slower card this could lead to a lack of sync (one screen will probably update before the other, so for a fraction of a second one will show the previous state of the app while the other is already up to date). Just jumping would avoid this kind of artifact.

    EDIT: I'm asking more in general, not specifically about Mir's case.
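
    For what it's worth, with one buffer per output the compositor's bookkeeping is roughly "redraw the window on every output its rectangle overlaps", so a window straddling the seam gets drawn on both and the two scanouts have to stay in sync. A tiny illustrative sketch (not Mir's actual logic; the layout and window position are made up):

    Code:
    // Not Mir's actual logic: which outputs must redraw a dragged window?
    #include <cstdio>

    struct Rect { int x, y, w, h; };

    // True if the two rectangles overlap.
    static bool overlaps(const Rect& a, const Rect& b) {
        return a.x < b.x + b.w && b.x < a.x + a.w &&
               a.y < b.y + b.h && b.y < a.y + a.h;
    }

    int main() {
        const Rect outputs[] = { {0, 0, 1920, 1080}, {1920, 0, 1920, 1080} };
        const Rect window    = { 1700, 200, 800, 600 };   // dragged across the seam

        for (int i = 0; i < 2; ++i)
            std::printf("output %d must redraw the window: %s\n",
                        i, overlaps(window, outputs[i]) ? "yes" : "no");
    }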

  10. #10
    Join Date
    Aug 2012
    Location
    Pennsylvania, United States
    Posts
    1,919

    Default

    Quote Originally Posted by dh04000 View Post
    I hope to see this done well. I haven't had a positive multi-monitor experience in Linux on any of the DEs on any distro.
    That's X's fault. Granted, libkscreen aims to fix that. Give KDE 4.11 a try when it comes out, or the latest release of Fedora KDE (they shipped it ahead of time).
