Basic Optimus support, as in just being able to at least switch to the NVidia GPU for everything. I don't mind not having dynamic adaptive switching depending on the application opened.
This should be a top priority issue. NVidia is completely USELESS on Linux on all new laptops powered by this Optimus thing.
What Do You Want From NVIDIA's Next Driver?
-
Originally posted by curaga: Are the pretty effects really that necessary?
I'd think 3+ screens were for the screen space, not having three times more cubes spinning.
Aside from any benefits to be gained from Compiz, the unified-desktop-across-multiple-GPUs aspect of Xinerama isn't perfect and still has some bugs.
-
Originally posted by kazetsukai: Even in RandR 1.3, I doubt compositing will work out-of-box across multiple GPUs unless RandR is doing something crazy (see below).
Originally posted by pingufunkybeat: Seems like you want to use the same X session across several GPUs? I wasn't aware that this was a common use case.
What I do on one GPU has zero effect on the performance of the others- this makes the desktop feel much more responsive. I imagine RandR's solution will be creating a large framebuffer on one card and then copy the "offscreen" portions to the next GPU. This would mean that one GPU is doing all of the work while the others are just dummy displays. This isn't a real solution to the problem.
-
Are the pretty effects really that necessary?
I'd think 3+ screens were for the screen space, not having three times more cubes spinning.
-
Originally posted by pingufunkybeat: Read on: Zaphod mode works just fine. But I do not use a multi-screen setup, so maybe I'm not up to date.
Originally posted by pingufunkybeat: Seems like you want to use the same X session across several GPUs? I wasn't aware that this was a common use case.
What I do on one GPU has zero effect on the performance of the others- this makes the desktop feel much more responsive. I imagine RandR's solution will be creating a large framebuffer on one card and then copy the "offscreen" portions to the next GPU. This would mean that one GPU is doing all of the work while the others are just dummy displays. This isn't a real solution to the problem.
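For reference, the independent-screens-per-GPU ("Zaphod") arrangement described above is typically configured along these lines in xorg.conf. This is an illustrative sketch only; the BusID values and identifiers are assumptions, not taken from anyone's actual setup:

```
# Hypothetical xorg.conf fragment: one X screen per GPU, no Xinerama,
# so each card renders (and can composite) independently as :0.0 and :0.1.
Section "Device"
    Identifier "GPU0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"    # illustrative BusID
EndSection

Section "Device"
    Identifier "GPU1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"    # illustrative BusID
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "GPU0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "GPU1"
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
    # Xinerama off: screens stay separate, so one GPU's load
    # never affects the other, but windows can't span screens.
    Option "Xinerama" "off"
EndSection
```

The trade-off is exactly the one discussed in this thread: per-GPU isolation and responsiveness, at the cost of a unified desktop.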
-
Read on:
These specific issues are resolved in RandR 1.3
Seems like you want to use the same X session across several GPUs?
I wasn't aware that this was a common use case.
-
Originally posted by pingufunkybeat: No. This is a problem in the Nvidia driver, as it uses Xinerama (which is deprecated) instead of the solution which replaced it, namely RandR.
Originally posted by pingufunkybeat: ATi doesn't have this problem, as it supports RandR. As do the open drivers for all cards.
"RandR 1.2 only permits one virtual screen per display device. It is not possible to assign each monitor on a device to a different screen (sometimes called "Zaphod" mode), or to combine monitors from multiple devices into a single screen."
All devices experience this limitation regardless of make. If you can show me a multi-card setup with Compositing without using Xgl, that's a different story.
-
ATI obviates the need for Xinerama by having a single card that does three screens.
I believe nVidia on Windows does some funky stuff so that, when running two cards, it presents a unified frame store to games.
Xorg itself isn't going to be able to do a unified desktop on multiple cards without Xinerama any time soon, so nVidia's only hope is to present two cards as one virtual card to Xorg while still publishing the relevant per-monitor information, the way TwinView does.
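For comparison, TwinView is the single-card version of that "one virtual card" idea: both heads share one framebuffer and one X screen, while per-monitor geometry is still exposed to clients. A rough sketch of the relevant nvidia driver options in xorg.conf (the resolutions here are illustrative assumptions):

```
# Hypothetical xorg.conf Screen section using TwinView: one framebuffer,
# one X screen, two monitors. Compositing works because X sees one screen;
# per-monitor layout is still advertised to window managers.
Section "Screen"
    Identifier "Screen0"
    Device     "GPU0"
    Option "TwinView"  "true"
    Option "MetaModes" "1920x1080, 1920x1080"   # illustrative mode pair
EndSection
```

Doing the same across two physical cards is precisely what the stock driver could not offer at the time, which is the gap this thread is asking NVIDIA to close.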
-
No.
This is a problem in the Nvidia driver, as it uses Xinerama (which is deprecated) instead of the solution which replaced it, namely RandR.
ATi doesn't have this problem, as it supports RandR. As do the open drivers for all cards.
The correct solution is to provide support for the existing X technology instead of using a deprecated one which will never be fixed.
-
Originally posted by mugginz: If not the full 3D vision, at least the component in their Windows driver that makes their triple head stuff across two cards work well.
There are a few solutions to this:
- Get Composite working within Xinerama setups by contributing to Xorg
- Sell a card with more RAMDACs and outputs (up to the hardware vendor really, not NVIDIA)
- Enable SLI Mosaic mode on consumer cards without SLI motherboard hardware to share the framebuffer between GPUs
I think contributing to Xorg would be the best solution, as currently everyone is SOL on compositing in Xinerama without a deprecated solution like Xgl.
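For context on the third option above: where the driver supports SLI Mosaic (at the time, typically Quadro hardware), it is enabled through a driver option roughly like this. A hedged sketch only, not a claim that this works on consumer cards:

```
# Hypothetical xorg.conf Screen section enabling SLI Mosaic. All GPUs'
# outputs are presented as a single framebuffer and X screen, so
# Composite works across every monitor without Xinerama.
Section "Screen"
    Identifier "Screen0"
    Device     "GPU0"
    Option "SLI" "Mosaic"   # restricted by the driver to supported (Quadro/SLI) configs
EndSection
```

Lifting that restriction on consumer cards would give multi-card users the unified, composited desktop this thread keeps circling back to.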