
xrandr 1.4 multi-gpu works!


  • stoatwblr
    replied
    Originally posted by chithanh View Post
    That is your hardware, but what is your software and configuration? Also, if you have a driver section in xorg.conf, then you need one for each of your GPUs. (I suggest having no xorg.conf at all.)
    DQ35JO board, Q6600, 8GB / Ubuntu 14.04 / kernel 3.13.0-[12-21] / ZFS on root / xorg-edgers daily builds -- and I've tried every iteration of xorg.conf I can think of (including none at all).

    I'll try 3.14 as soon as I can be sure ZFS will build on it.



  • Tyler_K
    replied
    Originally posted by Tyler_K View Post
    Yep, looks like this got busted recently.
    Looks to me like it was indeed the runtime PM changes that broke this. However, with kernel 3.14, a small update to the code logic restores this functionality on non-PX radeon+radeon systems.

    Users of other adapters (e.g. nouveau, intel) should check whether similar runtime PM logic has been adopted in the respective DRM/kernel driver.
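    For reference, a quick way to see what state the kernel's runtime PM has put a GPU into is to read its sysfs power attributes. A rough sketch (the PCI address below is only an example; find yours with lspci):
    Code:
    $ lspci | grep -i vga                                         # locate the GPUs' PCI addresses
    $ cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status  # reports e.g. "active" or "suspended"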



  • chithanh
    replied
    Originally posted by stoatwblr View Post
    I've got a dual-GPU 4-output nvidia NVS440 card
    That is your hardware, but what is your software and configuration? Also, if you have a driver section in xorg.conf, then you need one for each of your GPUs. (I suggest having no xorg.conf at all.)
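    If you do keep an xorg.conf, a rough sketch of "one Device section per GPU" (the bus IDs below are made up; check yours with lspci):
    Code:
    Section "Device"
        Identifier "GPU0"
        Driver     "nouveau"
        BusID      "PCI:1:0:0"   # example bus ID, adjust to your lspci output
    EndSection

    Section "Device"
        Identifier "GPU1"
        Driver     "nouveau"
        BusID      "PCI:2:0:0"   # example bus ID
    EndSection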



  • stoatwblr
    replied
    Originally posted by chithanh View Post
    The options are described in the xrandr manpage, in the section "RandR version 1.4 options"
    Code:
    $ xrandr --listproviders
    would be a start.

    Unfortunately there are no examples at the bottom, but it is not difficult to figure out.
    I've got a dual-GPU, 4-output nvidia NVS440 card, and xrandr --listproviders only shows one GPU no matter what I do (2 heads per GPU).

    I'd really like to know what version of X the original poster was using and what config he used (as would others).
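    For comparison, on a setup where both GPUs get picked up, the listing should report two providers, roughly like this (the values below are purely illustrative):
    Code:
    $ xrandr --listproviders
    Providers: number : 2
    Provider 0: id: 0x7d cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 2 outputs: 2 associated providers: 0 name:nouveau
    Provider 1: id: 0x56 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 2 outputs: 2 associated providers: 0 name:nouveau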



  • Tyler_K
    replied
    Originally posted by kbios View Post
    I tried starting from 13.10 and upgrading mesa and the nouveau DDX, and it still works. So the problem is either in the new kernel or in xorg 1.15. Figuring out a way to try those...
    Yep, looks like this got busted recently. On openSUSE 13.1 (using the stock 3.11.x kernel and driver stack) it works. But updating to newer components has borked it. See https://bugzilla.novell.com/show_bug.cgi?id=867499

    Hoping more people can look into it.



  • kbios
    replied
    Originally posted by kbios View Post
    Well, today I tried again with a current Kubuntu Trusty and I cannot get the secondary card working no matter what I try. With 13.10 it works fine... If someone would like to help me figure it out, here are the relevant logs:

    http://pastebin.com/hEuE8SZE working dmesg (13.10)
    http://pastebin.com/WeSByULi working Xorg.0.log

    http://pastebin.com/wSSH3r91 bad dmesg (14.04)
    http://pastebin.com/dTrgDnAd bad Xorg.0.log

    As you can see, Xorg doesn't load nouveau on the second card anymore. I'd report a bug, but I don't know exactly against which component...

    Thanks
    Pseudo-edit: I tried starting from 13.10 and upgrading mesa and the nouveau DDX, and it still works. So the problem is either in the new kernel or in xorg 1.15. Figuring out a way to try those...
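    One way to narrow it down is to record the exact versions of the suspect components in each install, e.g. on (K)ubuntu (package names as on trusty; adjust if yours differ):
    Code:
    $ uname -r                                        # kernel version
    $ dpkg -l xserver-xorg-core | tail -n 1           # X server
    $ dpkg -l xserver-xorg-video-nouveau | tail -n 1  # nouveau DDX
    $ dpkg -l libgl1-mesa-dri | tail -n 1             # mesa DRI drivers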



  • kbios
    replied
    doesn't seem to work anymore...

    Well, today I tried again with a current Kubuntu Trusty and I cannot get the secondary card working no matter what I try. With 13.10 it works fine... If someone would like to help me figure it out, here are the relevant logs:

    http://pastebin.com/hEuE8SZE working dmesg (13.10)
    http://pastebin.com/WeSByULi working Xorg.0.log

    http://pastebin.com/wSSH3r91 bad dmesg (14.04)
    http://pastebin.com/dTrgDnAd bad Xorg.0.log

    As you can see, Xorg doesn't load nouveau on the second card anymore. I'd report a bug, but I don't know exactly against which component...
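    A quick way to compare the two cases is to grep the saved logs for how the second card was probed (the file names below are just whatever the logs were saved as):
    Code:
    $ grep -i nouveau Xorg.0.log.working   # 13.10: nouveau is loaded for both cards
    $ grep -i nouveau Xorg.0.log.bad       # 14.04: the second card never gets nouveau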

    Thanks



  • BlackAdder
    replied
    Well, I am having problems myself with a configuration similar to that of the OP... only it is with two ATI video cards instead of nvidia.

    I am running openSUSE 13.1 on a machine with a PCIe "main" video card (ATI 6570, Turks) and a PCI "secondary" card (ATI 5430, Cedar), using the open source "radeon" driver for both. The main card has two monitors attached (DVI), and the secondary card drives a third DVI monitor. After much trial and error, it turns out Tyler's recommendation was the simplest: remove xorg.conf and let the autoconfiguration do its thing. One small nag is that initially only the main video card is detected and configured, so I start with "only" a dual-monitor setup. Kscreen sees both monitors, which incidentally have different resolutions, and lets me place them side by side, on top of each other... whatever I want. It works wonderfully: I have a big desktop, can move windows from one monitor to the other, and desktop effects are enabled (without Xinerama!)... up to this point, it works great.

    The next step is detecting the third monitor... easy enough: "xrandr -q" only shows the two monitors attached to the main card, but after running "xrandr --setprovideroutputsource 1 0" as stated in another post (having first verified that two providers are indeed listed), it also shows the second video card and its outputs, one of them connected to a monitor. Excellent! (Just a side comment: I can skip this step by changing the "boot" video card in my BIOS from PCIe to PCI; if I do that, everything behaves the same way once I enter the X server, except that the PCI card and third monitor are already detected.)
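    In other words, the sequence that makes the third output appear is roughly (provider numbers come from --listproviders; output names will differ per setup):
    Code:
    $ xrandr --listproviders                # confirm both providers are present
    $ xrandr --setprovideroutputsource 1 0  # make provider 1's outputs usable on provider 0's screen
    $ xrandr -q                             # the secondary card's outputs now show up in the list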

    To set up the third monitor I go to kscreen, which is now showing it (initially greyed out, i.e. disabled). I place it side by side with the other ones, enable it, and voila! The third monitor comes to life as it should. However, as soon as I put something (even if it is just the mouse pointer) inside that third workspace, the whole X server chokes and slows down to the point of being unusable (it takes ~5 seconds to register a mouse click, to give you an idea). Even if I then move the mouse back to the two original monitors, the server is already choked to death and does not recover. Btw, this happens only 1 out of every 4 or 5 times... the other 3-4 times X simply crashes and drops back to the login screen upon activation of the third monitor.

    When the slowdown happens I don't see anything strange in the logs (messages or Xorg.0.log), and there is no additional CPU load, so I don't really know how to proceed. I have checked in the logs that "glamoregl" is being loaded for the cards, so it may well be that glamor does not yet play nice with a two-card ATI setup.
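    For what it's worth, confirming whether glamor is in use for each card is just a matter of grepping the server log, e.g.:
    Code:
    $ grep -i glamor /var/log/Xorg.0.log   # shows glamoregl being loaded and the per-card acceleration messages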

    If anybody has suggestions on how to try and diagnose this, I would really appreciate them.

    Thanks!



  • Tyler_K
    replied
    Originally posted by Tyler_K View Post
    I have no idea whether intel+SI(glamor) will work or not.
    Scratch that. I just saw a very recent xorg log of such a config, and it appears to work fine. So the problem I alluded to seems to be confined either to using two AMD adapters with glamor, or perhaps to any two adapters that both use glamor.



  • Tyler_K
    replied
    Originally posted by azyr View Post
    Any ideas?
    Yes -- you're following a guide that does not pertain to you.

    Blow away all the xorg configuration you did. It is not needed; you should be letting X autoconfigure itself. A kink you may face is that your radeon adapter is a GCN/SI arch. When using that adapter with the OSS driver stack, getting the best performance you can requires using glamor. With a single AMD adapter (SI or otherwise), glamor works. With two AMD adapters (SI+SI, SI+otherwise, or otherwise+otherwise), glamor borks. (See this thread: http://phoronix.com/forums/showthrea...m-Ubuntu-13-10 ... note: I use openSUSE, so this is not something particular to Ubuntu.) Given that, I have no idea whether intel+SI(glamor) will work or not. If it does not work for you, then you should add your feedback to that existing bug report to indicate that the problem is not particular to two AMD adapters using glamor, but rather to glamor itself whenever two adapters are in use (pure AMD or mixed configs).

    Anyway, assuming that your system meets the requirements (these are listed in xrandr --help under the provider objects section -- yours should; that was just a general disclaimer for any other reader who might blindly follow along), all you need to do is boot up as per usual and run:
    Code:
    xrandr --setprovideroutputsource 1 0
    Then turn on the output for the second adapter -- you can do that via xrandr or one of the GUIs (I suggest kscreen, as it will be obvious what to do, though any of the other RandR frontends will work too, just not as intuitively as with kscreen). After that, you are left on your own as to how to make that config persistent across boots without any further intervention on your part (I added it as a startup script run by the display manager, but there are ninety-four gazillion other ways you can accomplish it).
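    As one of those gazillion ways, a minimal sketch of such a display-manager script (the output names are examples -- check xrandr -q for yours; with LightDM you could point the display-setup-script option at a file like this):
    Code:
    #!/bin/sh
    # Make the secondary provider's outputs usable on the primary provider's screen,
    # then enable the extra monitor to the right of the primary output.
    # DVI-0 and DVI-1-0 are example output names; adjust to what `xrandr -q` reports.
    xrandr --setprovideroutputsource 1 0
    xrandr --output DVI-1-0 --auto --right-of DVI-0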

