NVIDIA Optimus Unofficially Comes To Linux
NVIDIA's Optimus multi-GPU technology now works under Linux, at least for some notebooks. An open-source developer has hacked together support that uses both the Intel and NVIDIA graphics processors simultaneously, each with its respective driver. It's the best Linux implementation we've seen yet, as NVIDIA still hasn't announced any plans to officially support this technology under non-Microsoft operating systems.
Back in early 2010 we learned that NVIDIA had no plans to bring Optimus support to Linux. Optimus provides seamless switching on notebooks between the Intel and NVIDIA graphics, depending upon workload and power requirements, to deliver the best overall experience while hopefully extending your laptop's battery life.
There was then a hacked-up proof-of-concept for multi-GPU PRIME rendering written by Red Hat's David Airlie, but to this day that work has remained experimental and is not actively maintained.
As last year progressed, hybrid graphics support under Linux remained poor. There is the kernel's VGA Switcheroo for providing some GPU switching support, but it's far from seamless, requires manually restarting the X.Org Server, and doesn't work with all hardware.
Near the end of April we began to wonder whether NVIDIA Optimus would inevitably come to Linux, with NVIDIA expected to announce its Synergy technology at Computex Taipei in June. This upcoming technology is expected to be like Optimus for notebooks, but on desktops instead. Once Synergy is deployed, NVIDIA may have to deliver on Linux support if its Linux sales volumes ramp up and the technology becomes much more widespread.
However, an open-source community developer has taken things into his own hands and begun hacking on some "open-source Optimus" code. Over the weekend he released that code as prime-ng 2.0.
The prime-ng 2.0 code isn't about graphics switching, but rather offloading work between the Intel and NVIDIA GPUs. It even works on NVIDIA Optimus laptops that lack the graphics multiplexer found in other hybrid laptops for switching between the graphics processors. This prime-ng code enables 3D acceleration to be used on the Intel and NVIDIA graphics simultaneously.
With prime-ng 2.0, the Intel GPU powers the display and basic desktop, while any application the user wishes to run on the NVIDIA GPU can simply be launched as optirun64 [the-application-name] to direct its rendering through the NVIDIA hardware. This implementation still isn't seamless, nor does it use application profiles or anything like what's found in NVIDIA's Optimus implementation on Windows, but it's a big step forward.
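As a quick example of what that looks like in practice (glxgears and "mygame" below are just stand-in application names for illustration, not anything shipped with prime-ng):

    glxgears                  # runs as usual on the Intel GPU driving the desktop
    optirun64 glxgears        # the same application, rendered by the NVIDIA GPU instead
    optirun64 mygame          # any other OpenGL application can be wrapped the same way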
At this point, the code also doesn't power down the NVIDIA GPU when it's not in use, so it will continue burning through your battery. So far this prime-ng code has been confirmed to work on several ASUS, Alienware, Acer, and Dell notebooks.
On the plus side, it appears this code works when using the NVIDIA binary driver and not just the open-source Nouveau driver. While the Nouveau driver is nice for some, it's usually lacking in power management and other features, its ASIC support can be hit-or-miss between Mesa and Linux kernel releases, and it doesn't provide serious 3D performance for most users. This also means, though, that the implementation isn't sharing GEM/TTM buffers directly or doing anything where the two drivers actually communicate and interact with one another.
Another limitation of the current implementation is that NVIDIA's VDPAU video acceleration doesn't work when the Intel IGP is powering the display, though there may be ways to work around it.
This prime-ng support was written by a Danish developer, Martin Juhl, and the implementation involves running multiple X.Org Servers (one for each GPU) along with VirtualGL to make everything work. It's far from a polished solution, but hopefully it will serve as an interim option until NVIDIA decides to officially support Optimus/Synergy in all its glory under Linux. Even an official implementation may not be as pleasant as the Microsoft Windows experience due to shortcomings in X.Org, though those should eventually be addressed by Wayland.
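To give a rough idea of that arrangement, here is a conceptual sketch; the display number, config file path, and exact commands are assumptions for illustration, not the actual prime-ng scripts:

    # start a second X.Org Server on an unused display, bound to the NVIDIA GPU
    # through the binary driver (config path and :8 are assumed for this example)
    X :8 -config /etc/X11/xorg.conf.nvidia &

    # use VirtualGL to render an application against that hidden :8 display and
    # composite the result back onto the Intel-driven desktop
    vglrun -d :8 glxgears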
The project's code is found on GitHub and more information is available via the developer's blog.