xrandr 1.4 multi-gpu works!


  • xrandr 1.4 multi-gpu works!

    Today I discovered that a fundamental (well, for me) milestone in the Linux desktop world has been achieved: with xrandr 1.4 it is effectively possible to seamlessly use multiple monitors attached to DIFFERENT VIDEO CARDS, with different resolutions, each with its own desktop, with the ability to move windows between screens AND full 3D acceleration on all the screens. All this with the open drivers.

    I wanted to share this because information on this topic is nearly non-existent (all the articles I could find were about render offloading).

    I tested it with a clean Kubuntu 13.04 image, with two GTX 285 cards. KDE detected the two screens without any configuration and put a desktop on each one. I was able to run Unigine Heaven 2.5 at high settings, windowed (1024x768), and move the window between the screens while it was running (15-30 fps). Simply amazing.

  • #2

    I need to test this on my PC with an integrated Radeon 4250 and a PCI Express Nvidia GeForce GTS 250.
    I was already able to do GPU offloading from the Radeon to the GeForce while testing on a Fedora test day.


    • #3
      I can confirm it works.

      My configuration:
      Motherboard: ASRock 880GMH (integrated Radeon 4250)
      PCIe: Nvidia GeForce GTS 250
      (CPU: AMD X4 635)

      I need to set the integrated GPU as the default in the BIOS!
      (if I set the default GPU to PCIe, the integrated one is not visible through lspci)

      Using Fedora 19 (pre-stable) I get an out-of-the-box multi-monitor desktop (VGA on the integrated Radeon, DVI on the PCIe GeForce) using the open-source driver. IIRC it does not work with the proprietary driver.
      The default rendering GPU is the Radeon. I can offload rendering using DRI_PRIME=1 to run an OpenGL game on the Nvidia card (which is much faster).

      Unfortunately DRI_PRIME can't work with the proprietary driver due to licensing issues.
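      As a rough sketch of the offload workflow described above (assuming glxinfo from mesa-utils is installed and an X session is running; the renderer strings and provider indices will vary per system):

      ```shell
      # Show which GPU renders by default
      glxinfo | grep "OpenGL renderer"

      # DRI_PRIME=1 selects the first offload-capable secondary provider
      # reported by `xrandr --listproviders`
      DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

      # Run a 3D application on the (usually faster) discrete card
      DRI_PRIME=1 glxgears
      ```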


      • #4
        I would just like to add some screenshots. I was also able to connect three monitors to the PC (2 to the IGD and 1 to the DIS, i.e. two on the integrated GPU and one on the discrete card)

        here are images:


        • #5
          How does one configure multi-gpu with RandR 1.4? Are there some guides how to do it available somewhere?


          • #6
            The options are described in the xrandr manpage, in the section "RandR version 1.4 options"
            $ xrandr --listproviders
            would be a start.

            Unfortunately there are no examples at the bottom, but it is not difficult to figure out.
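            A minimal sketch of the typical sequence, assuming provider 1 is the secondary card and provider 0 the primary (the indices and output names below are examples only; take the real ones from your own `xrandr --listproviders` and `xrandr -q` output):

            ```shell
            # List the render/output providers X knows about
            xrandr --listproviders

            # Let provider 0 (the primary GPU) drive the outputs of provider 1
            xrandr --setprovideroutputsource 1 0

            # The secondary card's outputs should now show up here
            xrandr -q

            # Enable a newly visible output at its preferred mode, to the
            # right of an existing one (output names are hypothetical)
            xrandr --output DVI-1-1 --auto --right-of DVI-0
            ```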


            • #7
              I followed the guide from

              My xorg.conf:

              That configuration starts up nicely.

              xrandr -q:
              xrandr --listproviders:

              Then I type: xrandr --setprovideroutputsource modesetting radeon

              xrandr -q now returns:

              Now when I type "xrandr --auto" X crashes and I'm taken back to the KDE login screen.

              I also tried to activate monitors plugged on Intel GPU using the KDE graphical utility but that causes the same effect: X crashes and I'm taken back to the KDE login screen.

              I have an ATI HD 7970 and integrated Intel graphics:
              03:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Tahiti XT [Radeon HD 7970]
              00:02.0 Display controller: Intel Corporation Xeon E3-1200 v2/3rd Gen Core processor Graphics Controller (rev 09)

              Kernel is 3.10-3-amd64 and distro is Debian "jessie".
              xrandr version is 1.4.1
              xorg versions:

              Any ideas?


              • #8
                Originally posted by azyr View Post
                Any ideas?
                yes -- you're following a guide that does not pertain to you.

                Blow away all the xorg configurations you made. They are not needed. You should be letting X autoconfigure itself. A kink you may face is that your Radeon adapter is a GCN/SI arch. When using that adapter with the OSS driver stack, getting the best performance you can requires using glamor. With a single AMD adapter (SI or otherwise), glamor works. With two AMD adapters (SI+SI, SI+otherwise, or otherwise+otherwise), glamor borks. (See this thread: ... note: I use openSUSE, so this is not something particular to Ubuntu.) Given that, I have no idea whether intel+SI(glamor) will work or not. If it does not work for you, then you should add your feedback to that existing bug report to indicate that the problem is not particular to two AMD adapters utilizing glamor, but rather to glamor itself when two adapters are in use (pure AMD or mixed configs).

                Anyway, assuming that your system meets the requirements (listed in xrandr --help under the provider objects section) -- yours should; that was just a general disclaimer for any other reader who might blindly follow -- all you need to do is boot up as per usual and:
                xrandr --setprovideroutputsource 1 0
                Then, after that, turn on the output for the second adapter -- you can do that via xrandr or one of the GUIs (I suggest kscreen, as it will be obvious what to do, though any of the other RandR frontends will work too, just not as intuitively as kscreen). After that, you are left on your own as to how to make that config persistent across boots without any further intervention on your part (I added it as a startup script used by the display manager, but there are ninety-four gazillion other ways you can accomplish it).
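                One way to make it persistent, sketched under the assumption of a LightDM-style display manager that supports a display-setup-script hook (the script path, provider indices, and output names here are examples, not fixed conventions):

                ```shell
                #!/bin/sh
                # Hypothetical /usr/local/bin/multi-gpu-setup.sh, pointed to by
                # display-setup-script= in lightdm.conf so it runs before the
                # session starts

                # Route the secondary provider's outputs through the primary GPU
                xrandr --setprovideroutputsource 1 0

                # Enable the extra output next to an existing one
                # (check `xrandr -q` for your real output names)
                xrandr --output DVI-1-1 --auto --right-of DVI-0
                ```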


                • #9
                  Originally posted by Tyler_K View Post
                  I have no idea whether intel+SI(glamor) will work or not.
                  Scratch that. I just saw a very recent Xorg log of such a config, and it appears to work fine. So the problem I alluded to seems to be confined either to using two AMD adapters with glamor, or maybe just to any two adapters using glamor.


                  • #10
                    Well, I am having problems myself with a configuration similar to that of the OP... only with two ATI video cards instead of Nvidia.

                    I am running openSUSE 13.1 on a machine with a PCIe "main" video card (ATI 6570, Turks) and a PCI "secondary" card (ATI 5430, Cedar). I am using the open-source "radeon" driver for both. The main card has two monitors attached (DVI), and the secondary card a third DVI monitor. After much trial and error, it turns out that Tyler's recommendation was the simplest: remove xorg.conf and let the autoconfiguration do its thing. One small nag is that initially only the main video card is detected and configured, so I start with a dual-monitor setup "only". Kscreen sees both monitors, which incidentally have different resolutions, and lets me place them side by side, on top of each other... whatever I want. It works wonderfully. I have a big desktop, can move windows from one monitor to the other, desktop effects are enabled (without Xinerama!)... up to this point, it works great.

                    The next step is detecting the third monitor... easy enough. "xrandr -q" only shows the two monitors attached to the main card, but after running "xrandr --setprovideroutputsource 1 0" as stated in another post (and having previously verified that there are indeed two providers) it shows the second video card and its outputs, one of them connected to a monitor. Excellent! (A side comment: I can skip this step by changing the "boot" video card in my BIOS from PCIe to PCI; if I do this, everything behaves the same way once I enter the X server, but the PCI card/third monitor is already detected.)

                    To set up the third monitor I go to kscreen, which is now showing it (initially greyed, as in disabled). Place it side-by-side with the other ones, enable it, and voila!, the third monitor comes to life as it should. However, as soon as I put something (even if it is just the mouse pointer) inside that third workspace, the whole X server chokes and slows down to the point of not being usable (it takes ~ 5 seconds to register a mouse click, to give you an idea). Even if I now move the mouse back to the two original monitors the server is already choked to death and does not recover. Btw, this happens only 1 out of every 4 or 5 times... the other 3-4 times X simply crashes and goes back to the login screen upon activation of the third monitor.

                    When the slowdown happens I don't see anything strange in the logs (messages or Xorg.0.log), and there is no additional CPU load, so I don't really know how to proceed. I have checked in the logs that "glamoregl" is being loaded for the cards, so it may be true that so far glamor does not play nice with a two-card ATI setup.

                    If anybody has suggestions on how to try and diagnose this, I would really appreciate them.