What Do You Want From NVIDIA's Next Driver?


  • gamesfan
    replied
    Basic Optimus support, as in just being able to switch to the NVIDIA GPU for everything... I don't mind not having dynamic switching depending on which application is open.

    This should be a top-priority issue... NVIDIA is completely USELESS on Linux on all new laptops powered by this Optimus thing.



  • mugginz
    replied
    Originally posted by curaga View Post
    Are the pretty effects really that necessary?

    I'd think 3+ screens were for the screen space, not for having three times as many cubes spinning.
    Compiz isn't just for the spinning cubes.

    Aside from any benefits to be gained from Compiz, the unified desktop that Xinerama provides across multiple GPUs isn't perfect and still has some bugs.
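
    To make that concrete: under Xinerama (or a driver faking it, like TwinView), clients see one big logical screen plus a per-monitor geometry list. A minimal sketch of that client view, assuming libXinerama is installed and building with something like cc xin.c -lX11 -lXinerama:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xinerama.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        if (!XineramaIsActive(dpy)) {
            printf("Xinerama is not active on this display\n");
            XCloseDisplay(dpy);
            return 0;
        }
        int n = 0;
        XineramaScreenInfo *si = XineramaQueryScreens(dpy, &n);
        /* One entry per monitor stitched into the single logical
         * screen, regardless of which GPU actually drives it. */
        for (int i = 0; i < n; i++)
            printf("head %d: %dx%d at +%d+%d\n", si[i].screen_number,
                   si[i].width, si[i].height, si[i].x_org, si[i].y_org);
        XFree(si);
        XCloseDisplay(dpy);
        return 0;
    }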



  • mugginz
    replied
    Originally posted by kazetsukai View Post
    Even in RandR 1.3, I doubt compositing will work out of the box across multiple GPUs unless RandR is doing something crazy (see below).
    Unfortunately, compositing doesn't work across multiple GPUs at the moment.

    Originally posted by kazetsukai View Post
    Originally posted by pingufunkybeat
    Seems like you want to use the same X session across several GPUs?
    I want to, as it is kind of the "holy grail" of massive multi-monitor setups, but cannot while keeping Composite functionality. A big contributing factor to this is that I run KDE4, and KDE4 will not use more than one X Screen. If KDE4 did, it would be a lot less painful.
    I was in the same boat recently.

    Originally posted by kazetsukai View Post
    Originally posted by pingufunkybeat
    I wasn't aware that this was a common use case.
    Arguably it isn't. I'm running 5 screens; you cannot really do that on one GPU. Some recent ATI cards shipped with six DisplayPort outputs, but A) I'm on HDMI and B) you get higher performance by using multiple GPUs individually.

    What I do on one GPU has zero effect on the performance of the others, which makes the desktop feel much more responsive. I imagine RandR's solution will be creating a large framebuffer on one card and then copying the "offscreen" portions to the next GPU. That would mean one GPU doing all of the work while the others are just dummy displays. This isn't a real solution to the problem.
    Anyone wanting more than two displays on a Linux desktop is going to run head-first into this Xorg-across-two-GPUs issue. Even with ATI, the maximum number of screens before going multi-GPU is six. As for the performance cost of having to shove multiple copies of assets into each GPU across the PCIe bus, I guess there's not much that can be done about that.



  • curaga
    replied
    Are the pretty effects really that necessary?

    I'd think 3+ screens were for the screen space, not for having three times as many cubes spinning.



  • kazetsukai
    replied
    Originally posted by pingufunkybeat View Post
    Read on: Zaphod mode works just fine. But I do not use a multi-screen setup, so maybe I'm not up to date.
    Even in RandR 1.3, I doubt compositing will work out of the box across multiple GPUs unless RandR is doing something crazy (see below).

    Originally posted by pingufunkybeat View Post
    Seems like you want to use the same X session across several GPUs?
    I want to, as it is kind of the "holy grail" of massive multi-monitor setups, but cannot while keeping Composite functionality. A big contributing factor to this is that I run KDE4, and KDE4 will not use more than one X Screen. If KDE4 did, it would be a lot less painful.

    Originally posted by pingufunkybeat View Post
    I wasn't aware that this was a common use case.
    Arguably it isn't. I'm running 5 screens; you cannot really do that on one GPU. Some recent ATI cards shipped with six DisplayPort outputs, but A) I'm on HDMI and B) you get higher performance by using multiple GPUs individually.

    What I do on one GPU has zero effect on the performance of the others, which makes the desktop feel much more responsive. I imagine RandR's solution will be creating a large framebuffer on one card and then copying the "offscreen" portions to the next GPU. That would mean one GPU doing all of the work while the others are just dummy displays. This isn't a real solution to the problem.
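
    To illustrate the cost: here's a toy, purely client-side sketch of that kind of copy (the real thing would live in the server/driver, and copy_region here is a made-up helper; it assumes two X Screens with matching depths). Every frame, the whole region crosses the bus twice:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    /* Hypothetical helper: read a w x h region back from the GPU driving
     * src_screen, then upload it to a drawable on another screen. */
    static void copy_region(Display *dpy, int src_screen, Drawable dst,
                            GC gc, int x, int y, unsigned w, unsigned h)
    {
        XImage *img = XGetImage(dpy, RootWindow(dpy, src_screen),
                                x, y, w, h, AllPlanes, ZPixmap);
        if (!img)
            return;
        XPutImage(dpy, dst, gc, img, 0, 0, 0, 0, w, h); /* second transfer */
        XDestroyImage(img);
    }

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy || ScreenCount(dpy) < 2) {
            fprintf(stderr, "need a display with two X Screens\n");
            return 1;
        }
        /* Toy destination window on the second screen (:0.1). */
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, 1), 0, 0,
                                         640, 480, 0, 0, BlackPixel(dpy, 1));
        XMapWindow(dpy, win);
        GC gc = XCreateGC(dpy, win, 0, NULL);
        XSync(dpy, False);
        /* Mirror one corner of screen 0 onto screen 1, once. */
        copy_region(dpy, 0, win, gc, 0, 0, 640, 480);
        XSync(dpy, False);
        XCloseDisplay(dpy);
        return 0;
    }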



  • pingufunkybeat
    replied
    Read on:

    These specific issues are resolved in RandR 1.3
    Zaphod mode works just fine. But I do not use a multi-screen setup, so maybe I'm not up to date.
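
    For reference, this is what Zaphod mode looks like from the client side: one display connection, several independent X Screens (typically one per GPU, addressed as :0.0, :0.1, ...), and windows can't move between them. A minimal sketch, built with something like cc zaphod.c -lX11:

    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        /* In a Zaphod layout this is > 1; each screen has its own
         * root window, and clients address them as :0.0, :0.1, ... */
        int n = ScreenCount(dpy);
        for (int s = 0; s < n; s++)
            printf("screen %d: %dx%d, root window 0x%lx\n", s,
                   DisplayWidth(dpy, s), DisplayHeight(dpy, s),
                   RootWindow(dpy, s));
        XCloseDisplay(dpy);
        return 0;
    }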

    Seems like you want to use the same X session across several GPUs?

    I wasn't aware that this was a common use case.



  • kazetsukai
    replied
    Originally posted by pingufunkybeat View Post
    No. This is a problem in the NVIDIA driver, as it uses Xinerama (which is deprecated) instead of the solution which replaced it, namely RandR.
    RandR doesn't obsolete Xinerama; that's a very common misconception.

    Originally posted by pingufunkybeat View Post
    ATi doesn't have this problem, as it supports RandR. As do the open drivers for all cards.
    Per Wikipedia:
    "RandR 1.2 only permits one virtual screen per display device. It is not possible to assign each monitor on a device to a different screen (sometimes called "Zaphod" mode), or to combine monitors from multiple devices into a single screen."

    All devices experience this limitation regardless of make. If you can show me a multi-card setup with Compositing without using Xgl, that's a different story.
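
    You can see the per-device model right in the RandR 1.2 client API: resources are queried per X Screen, i.e. per display device, so outputs on a second card simply never appear. A small sketch, assuming libXrandr and building with something like cc rr.c -lX11 -lXrandr:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        /* All outputs below belong to the device backing this one
         * X Screen; a second card's outputs won't be listed. */
        XRRScreenResources *res =
            XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
        for (int i = 0; res && i < res->noutput; i++) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            printf("%s: %s\n", out->name,
                   out->connection == RR_Connected ? "connected"
                                                   : "disconnected");
            XRRFreeOutputInfo(out);
        }
        if (res)
            XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }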



  • mugginz
    replied
    ATI obviates the need for Xinerama by having a single card that does three screens.

    I believe nVidia on Windows does some funky stuff so that when running two cards it presents a unified frame store to games.

    Xorg itself isn't going to be able to do a unified desktop across multiple cards without Xinerama any time soon, so nVidia's only hope is to present two cards as one virtual card to Xorg while still publishing the relevant per-monitor information, like TwinView does.



  • pingufunkybeat
    replied
    No.

    This is a problem in the NVIDIA driver, as it uses Xinerama (which is deprecated) instead of the solution which replaced it, namely RandR.

    ATi doesn't have this problem, as it supports RandR. As do the open drivers for all cards.

    The correct solution is to provide support for the existing X technology instead of using a deprecated one which will never be fixed.



  • kazetsukai
    replied
    Originally posted by mugginz View Post
    If not the full 3D vision, at least the component in their Windows driver that makes their triple head stuff across two cards work well.
    Hah. These aren't problems in the NVIDIA driver; they're problems in Xorg. The Xorg Composite extension does not work with Xinerama.

    There are a few solutions to this:
    - Get Composite working within Xinerama setups by contributing to Xorg
    - Sell a card with more RAMDACs and outputs (up to the hardware vendor really, not NVIDIA)
    - Enable SLI Mosaic mode on consumer cards without SLI motherboard hardware, to share the framebuffer between GPUs

    I think contributing to Xorg would be the best solution, as currently everyone is SOL on compositing in Xinerama without a deprecated solution like Xgl.
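
    For what it's worth, the extension checks themselves are trivial; the problem is that a server can advertise both and they still don't cooperate. A minimal sketch, assuming libXcomposite and libXinerama (build with something like cc chk.c -lX11 -lXcomposite -lXinerama):

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xcomposite.h>
    #include <X11/extensions/Xinerama.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        int ev, err, major = 0, minor = 0;
        /* This is the same check a compositor makes at startup. */
        if (XCompositeQueryExtension(dpy, &ev, &err) &&
            XCompositeQueryVersion(dpy, &major, &minor))
            printf("Composite %d.%d present\n", major, minor);
        else
            printf("Composite not available\n");
        /* Both can report "yes" and compositing still breaks across
         * the Xinerama-spanned desktop. */
        printf("Xinerama active: %s\n",
               XineramaIsActive(dpy) ? "yes" : "no");
        XCloseDisplay(dpy);
        return 0;
    }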

