What Do You Want From NVIDIA's Next Driver?


  • #71
    Originally posted by pingufunkybeat
    No. This is a problem in the Nvidia driver, as it uses Xinerama (which is deprecated) instead of the solution that replaced it, namely RandR.
    RandR doesn't obsolete Xinerama; this is a very common misconception.

    Originally posted by pingufunkybeat
    ATi doesn't have this problem, as it supports RandR. As do the open drivers for all cards.
    Per Wikipedia:
    "RandR 1.2 only permits one virtual screen per display device. It is not possible to assign each monitor on a device to a different screen (sometimes called "Zaphod" mode), or to combine monitors from multiple devices into a single screen."

    All devices experience this limitation regardless of make. If you can show me a multi-card setup with compositing that doesn't use Xgl, that's a different story.
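    For reference, the "Zaphod" mode mentioned above is configured by giving each head its own Device/Screen pair in xorg.conf. A minimal sketch, assuming a single NVIDIA card at PCI:1:0:0 (identifiers and the BusID are placeholders):

        Section "Device"
            Identifier "Card0-Head0"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"
            Screen     0              # first head of the card
        EndSection

        Section "Device"
            Identifier "Card0-Head1"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"
            Screen     1              # second head of the same card
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device     "Card0-Head0"
        EndSection

        Section "Screen"
            Identifier "Screen1"
            Device     "Card0-Head1"
        EndSection

        Section "ServerLayout"
            Identifier "Zaphod"
            Screen 0 "Screen0"
            Screen 1 "Screen1" RightOf "Screen0"
        EndSection

    Each X screen is driven independently; windows cannot be dragged between them, which is exactly what distinguishes Zaphod mode from a Xinerama or RandR unified desktop.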



  • #72
    Read on:

    "These specific issues are resolved in RandR 1.3."
    Zaphod mode works just fine. But I do not use a multi-screen setup, so maybe I'm not up to date.

    Seems like you want to use the same X session across several GPUs?

    I wasn't aware that this was a common use case.
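    On a single GPU, the RandR approach needs no xorg.conf surgery at all; outputs are arranged at runtime. A minimal sketch using xrandr 1.3 (output names like DVI-I-1 and HDMI-1 vary by driver; check "xrandr -q" for yours):

        xrandr --output DVI-I-1 --auto --primary \
               --output HDMI-1  --auto --right-of DVI-I-1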



  • #73
    Originally posted by pingufunkybeat
    Read on: Zaphod mode works just fine. But I do not use a multi-screen setup, so maybe I'm not up to date.
    Even in RandR 1.3, I doubt compositing will work out of the box across multiple GPUs unless RandR is doing something crazy (see below).

    Originally posted by pingufunkybeat
    Seems like you want to use the same X session across several GPUs?
    I want to, as it is kind of the "holy grail" of massive multimonitor setups, but cannot while keeping Composite functionality. A big contributing factor to this is that I run KDE4, and KDE4 will not use more than one X Screen. If KDE4 did, it would be a lot less painful.

    Originally posted by pingufunkybeat
    I wasn't aware that this was a common use case.
    Arguably it isn't. I'm running 5 screens; you cannot really do that on one GPU. Some recent ATI cards shipped with six DisplayPort outputs, but A) I'm on HDMI and B) you get higher performance by using multiple GPUs individually.

    What I do on one GPU has zero effect on the performance of the others; this makes the desktop feel much more responsive. I imagine RandR's solution would be to create a large framebuffer on one card and then copy the "offscreen" portions to the next GPU. That would mean one GPU doing all of the work while the others are just dummy displays, which isn't a real solution to the problem.
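    For anyone curious, the setup I'm describing is the classic Xinerama one: one Device/Screen pair per card, glued together in the server layout. A stripped-down sketch for two cards (BusIDs are placeholders):

        Section "Device"
            Identifier "GPU0"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"
        EndSection

        Section "Device"
            Identifier "GPU1"
            Driver     "nvidia"
            BusID      "PCI:2:0:0"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device     "GPU0"
        EndSection

        Section "Screen"
            Identifier "Screen1"
            Device     "GPU1"
        EndSection

        Section "ServerLayout"
            Identifier "Multihead"
            Screen 0 "Screen0"
            Screen 1 "Screen1" RightOf "Screen0"
            Option   "Xinerama" "on"
        EndSection

    The catch is the last Option line: turning Xinerama on disables the Composite extension, which is the whole trade-off being discussed here.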



  • #74
    Are the pretty effects really that necessary?

    I'd think 3+ screens were for the screen space, not for having three times more cubes spinning.



  • #75
    Originally posted by kazetsukai
    Even in RandR 1.3, I doubt compositing will work out of the box across multiple GPUs unless RandR is doing something crazy (see below).
    Unfortunately, compositing doesn't work across multiple GPUs at the moment.

    Originally posted by kazetsukai
    Originally posted by pingufunkybeat
    Seems like you want to use the same X session across several GPUs?
    I want to, as it is kind of the "holy grail" of massive multimonitor setups, but cannot while keeping Composite functionality. A big contributing factor to this is that I run KDE4, and KDE4 will not use more than one X Screen. If KDE4 did, it would be a lot less painful.
    I was in the same boat recently.

    Originally posted by kazetsukai
    Originally posted by pingufunkybeat
    I wasn't aware that this was a common use case.
    Arguably it isn't. I'm running 5 screens; you cannot really do that on one GPU. Some recent ATI cards shipped with six DisplayPort outputs, but A) I'm on HDMI and B) you get higher performance by using multiple GPUs individually.

    What I do on one GPU has zero effect on the performance of the others; this makes the desktop feel much more responsive. I imagine RandR's solution would be to create a large framebuffer on one card and then copy the "offscreen" portions to the next GPU. That would mean one GPU doing all of the work while the others are just dummy displays, which isn't a real solution to the problem.
    For anyone wanting more than two displays on a Linux desktop, the Xorg-across-two-GPUs issue is going to hit them on the head. Even with ATI, your maximum number of screens before going multi-GPU is six. As for the performance drop of having to shove multiple copies of assets into GPUs across the PCIe bus, I guess there's not much that can be done about that.
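    If anyone wants to experiment with the multi-card configs above, the BusID values come from lspci. A generic sketch, with one gotcha worth knowing:

        # List the display adapters the kernel sees; note that lspci prints
        # bus numbers in hex, while xorg.conf's BusID expects decimal
        # "PCI:bus:device:function" (e.g. PCI:1:0:0).
        lspci | grep -i vga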



  • #76
    Originally posted by curaga
    Are the pretty effects really that necessary?

    I'd think 3+ screens were for the screen space, not for having three times more cubes spinning.
    Compiz isn't just for the spinning cubes.

    Aside from any benefits to be gained from Compiz, the unified desktop across multiple GPUs that Xinerama provides isn't perfect and still has some bugs.



  • #77
    Basic Optimus support, as in at least being able to switch to the NVidia GPU for everything... I don't mind not having dynamic adaptive switching depending on which application is open.

    This should be a top-priority issue... NVidia is completely USELESS on Linux on all new laptops powered by this Optimus thing.
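    For comparison, muxed hybrid laptops can already be switched by hand through the kernel's vga_switcheroo debugfs interface; whether a given Optimus machine exposes it is another question entirely. A minimal sketch (needs debugfs mounted and kernel support):

        # Show which GPUs the kernel knows about and which one is active
        cat /sys/kernel/debug/vgaswitcheroo/switch

        # Queue a deferred switch to the discrete GPU; it takes effect
        # the next time X restarts
        echo DDIS > /sys/kernel/debug/vgaswitcheroo/switch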

