
Thread: Canonical Starts Work On Mir Multi-Monitor Handling

  1. #11
    Join Date
    Dec 2010
    Location
    MA, USA
    Posts
    1,209

    Default

    Quote Originally Posted by mrugiero View Post
    I'm wondering something that is not really important, but makes me curious.
    When dragging an app from the edge of one monitor to another, will it "jump" to the other screen, or will it display partially on both screens? This seems 100% trivial, but I can see how on a slower card this could lead to images showing a lack of sync (one screen will probably update before the other, so for a fraction of a second one will show the app's previous state while the other is already up to date). Just jumping would avoid this kind of artifact.

    EDIT: I'm asking more in general, not specifically Mir's case.
    Typically it splits between both screens, much the same way as when you use the 3D cube in Compiz and put a window between two workspaces. Generally the sync issues aren't that noticeable, especially if your two screens are different sizes/resolutions.
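The split described above comes down to rectangle intersection: each monitor draws only the slice of the window that overlaps its own area. Here is a minimal sketch of that geometry; the rectangles and function names are purely illustrative and not any real compositor's API.

```python
# Illustrative sketch: splitting a window's rectangle across two monitors.
# Rectangles are (x, y, width, height) in a shared desktop coordinate space.

def intersect(a, b):
    """Return the overlap of rectangles a and b, or None if they don't touch."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    x1, y1 = max(ax, bx), max(ay, by)
    x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    if x1 >= x2 or y1 >= y2:
        return None
    return (x1, y1, x2 - x1, y2 - y1)

# Two 1920x1080 monitors side by side.
left_monitor = (0, 0, 1920, 1080)
right_monitor = (1920, 0, 1920, 1080)

# An 800x600 window dragged so it straddles the shared edge.
window = (1600, 200, 800, 600)

# Each screen renders only its visible slice; if the two screens refresh
# at different times, the slices can briefly show different frames.
print(intersect(window, left_monitor))   # (1600, 200, 320, 600)
print(intersect(window, right_monitor))  # (1920, 200, 480, 600)
```

The tearing mentioned in the thread happens exactly because these two slices are presented by separate scanouts that are not guaranteed to flip on the same vblank.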

  2. #12
    Join Date
    Feb 2011
    Posts
    1,067

    Default

    Quote Originally Posted by AJenbo View Post
    The only downside I see is that print screen would have to stitch the framebuffers together into a single image.
    I would say anything that makes common tasks easier to the same extent it makes uncommon tasks harder is a win. In this case it makes common tasks easier by more than it makes uncommon tasks harder (and that is ignoring people who only want a screenshot of one screen).
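The stitching step being discussed is cheap in principle: with one buffer per monitor, a full-desktop screenshot just concatenates them, padding where the monitors differ in height. A minimal sketch, modeling framebuffers as row-major lists of pixel values (all names here are made up for illustration):

```python
# Illustrative sketch: stitching two per-monitor framebuffers into one
# screenshot image, as a desktop-wide "print screen" would have to do if
# each monitor rendered into its own buffer.

def stitch_side_by_side(left, right, blank=0):
    """Concatenate two framebuffers horizontally, padding the shorter one."""
    height = max(len(left), len(right))
    left_w = len(left[0]) if left else 0
    right_w = len(right[0]) if right else 0
    out = []
    for y in range(height):
        lrow = left[y] if y < len(left) else [blank] * left_w
        rrow = right[y] if y < len(right) else [blank] * right_w
        out.append(lrow + rrow)
    return out

# A 2-row buffer next to a 3-row buffer (monitors of different heights):
a = [[1, 1], [1, 1]]
b = [[2, 2], [2, 2], [2, 2]]
print(stitch_side_by_side(a, b))
# [[1, 1, 2, 2], [1, 1, 2, 2], [0, 0, 2, 2]]
```

Taking a screenshot of just one screen, as mentioned in the post, is the degenerate case: copy one buffer and skip the stitch.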

  3. #13
    Join Date
    May 2011
    Posts
    1,442

    Default

    Quote Originally Posted by schmidtbag View Post
    Multi-monitor setups are, IMO, only good if you have a low-wattage GPU with non-gaming purposes in mind, or if you have a rig strictly used for gaming and nothing else. I have 2 HD 5750s in CrossFire and I was amazed how much heat the primary GPU generated when not doing anything graphically intensive. This particular GPU has 7 different clock settings in its BIOS. I tried underclocking the settings that seemed to be linked specifically to multi-monitor use, and surprisingly, just a 100 MHz difference in the memory clock can make the 2nd display unstable. Windows didn't seem to care as much about re-clocking. I'm guessing if I want to try multi-monitor again, I'll have to either ditch KDE for something non-composited, or hope that Mir or Wayland will allow me to reclock my GPU. Considering the relationship between KDE and Mir, that reduces my options.
    Get an NVIDIA card.

  4. #14
    Join Date
    Dec 2010
    Location
    MA, USA
    Posts
    1,209

    Default

    Quote Originally Posted by johnc View Post
    Get an NVIDIA card.
    I'm 90% sure NVIDIA suffers from the same problem. Any GPU will need to clock itself higher to drive more displays.

  5. #15
    Join Date
    May 2011
    Posts
    1,442

    Default

    Quote Originally Posted by schmidtbag View Post
    I'm 90% sure NVIDIA suffers from the same problem. Any GPU will need to clock itself higher to drive more displays.
    I have two displays running on the Unity desktop and my GTX 570 downclocks to 50/135/101 MHz graphics / memory / processor clocks. Maybe it's different for a CF / SLI setup though.

  6. #16
    Join Date
    Sep 2011
    Posts
    680

    Default

    Quote Originally Posted by johnc View Post
    I have two displays running on the Unity desktop and my GTX 570 downclocks to 50/135/101 MHz graphics / memory / processor clocks. Maybe it's different for a CF / SLI setup though.
    As mentioned, I get the same bad result with a GT 210 as with the HD 5xxx, so NVIDIA isn't a magic bullet. At home, though, with smaller monitors, things work great on my GTX 460 SE. My laptop with an HD 6370M also works great with two monitors.

  7. #17
    Join Date
    May 2011
    Posts
    1,442

    Default

    Quote Originally Posted by AJenbo View Post
    As mentioned, I get the same bad result with a GT 210 as with the HD 5xxx, so NVIDIA isn't a magic bullet. At home, though, with smaller monitors, things work great on my GTX 460 SE. My laptop with an HD 6370M also works great with two monitors.
    I get good performance with two monitors on a GT210 but that is with GNOME2 + compiz. I can't speak to the newer compositors.

  8. #18
    Join Date
    Sep 2011
    Posts
    680

    Default

    Quote Originally Posted by johnc View Post
    I get good performance with two monitors on a GT210 but that is with GNOME2 + compiz. I can't speak to the newer compositors.
    The one where I get poor performance is at about 3k×2k resolution; I don't know if that has anything to do with it.

  9. #19
    Join Date
    May 2011
    Posts
    1,442

    Default

    Quote Originally Posted by AJenbo View Post
    The one where I get poor performance is at about 3k×2k resolution; I don't know if that has anything to do with it.
    Interesting. I'm using a pair of 1080p monitors. What DE are you using?

  10. #20
    Join Date
    Mar 2012
    Posts
    81

    Default Incoherency

    Quote Originally Posted by dh04000
    I hope to see this done well. I haven't had a positive multi-desktop experience in Linux with any of the DEs on any distro.
    Quote Originally Posted by Ericg View Post
    That's X's fault. Granted, libkscreen aims to fix that. Give KDE 4.11 a try when it comes out, or the latest release of Fedora KDE (they shipped it ahead of time).
    Ericg, your message is self-contradictory: if it is X's fault, how can KDE fix it while it is still running on X?
