NVIDIA Has Major New Linux Driver: Optimus, RandR 1.4


  • linuxforall
    replied
    Has anyone done any power and heat testing of nvidia 319 with Optimus via nvidia-prime versus the nvidia driver plus Bumblebee?



  • linuxforall
    replied
    It seems Ubuntu 12.04.3 LTS has added support for Optimus via the nvidia-prime package and a patched 3.8 kernel. I installed it on my i7 laptop with an NVIDIA 630M. Incidentally, it installs the nvidia driver by default, so when you boot you are already using the 319-series driver.

    I don't know whether Optimus is working, as I don't play any games, but laptop battery life has gone up and temperatures have fallen. I still need to test HDMI out. This wiki also indicates hybrid-graphics support for ATI, and I have enabled that on my small AMD C-50 netbook as well.





  • Thaodan
    replied
    Originally posted by jaylittle View Post
    I've got a long running issue with the Nvidia drivers in Linux that I was hoping somebody here could help me resolve. The situation doesn't appear to have changed with the release of 319.12 despite promises of initial xrandr 1.4 support (the lack of which I believe is the root of the issue). I've got a System76 BonX6 laptop with an Nvidia GTX 670M chipset. My LCD has a native resolution of 1920x1080. However I prefer to run with 1600x900 if possible (or any lower resolution - which one doesn't really matter here). In any event when I switch the resolution using either nvidia-settings or xrandr on the command line - my desktop environment never seems to resize to match it. So even though my new resolution is something smaller, the DE panels and what not are now oversized as if I was still running with 1920x1080 and hence cut off from view in the lower resolution. It's as if the DE never receives the notification that the resolution has changed.

    I have experienced this in both Gnome 3 and Cinnamon among other DEs. Is there a work around for this? I'm running Arch Linux so I'm running all of the absolute most cutting edge versions of everything for the most part. Any pointers or feedback would be most appreciated.
    In KDE this is a known bug; the KDE bug tracker has an entry for it. You could avoid running fullscreen programs at lower resolutions and switch to windowed mode instead. If you use Wine, you could use a virtual desktop.
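
    To check whether the X server is even delivering the RandR notification the desktop should react to, a small listener like this sketch would tell you (assuming the libX11/libXrandr development headers are installed; rrwatch.c is just an example name). If it prints the new geometry when you switch modes but the panels still don't resize, the problem is in the DE rather than the driver.

    Code:
    /* rrwatch.c - print RRScreenChangeNotify events to verify the server
     * announces mode changes. Build: gcc rrwatch.c -o rrwatch -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int ev_base, err_base;
        if (!XRRQueryExtension(dpy, &ev_base, &err_base)) {
            fprintf(stderr, "RandR extension not available\n");
            return 1;
        }

        /* Ask for screen-change notifications on the root window. */
        XRRSelectInput(dpy, DefaultRootWindow(dpy), RRScreenChangeNotifyMask);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == ev_base + RRScreenChangeNotify) {
                /* Keep Xlib's cached screen geometry in sync. */
                XRRUpdateConfiguration(&ev);
                XRRScreenChangeNotifyEvent *sce = (XRRScreenChangeNotifyEvent *)&ev;
                printf("screen changed: %dx%d\n", sce->width, sce->height);
            }
        }
    }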



  • Naib
    replied
    anyone got profiles working?



  • Banu22elos
    replied
    Most people who fawn over Linus giving the finger didn't even watch the video.



  • jaylittle
    replied
    Nvidia & Resolution Switching

    I've got a long-running issue with the NVIDIA drivers on Linux that I was hoping somebody here could help me resolve. The situation doesn't appear to have changed with the release of 319.12, despite promises of initial xrandr 1.4 support (the lack of which I believe is the root of the issue). I've got a System76 BonX6 laptop with an NVIDIA GTX 670M chipset. My LCD has a native resolution of 1920x1080, but I prefer to run at 1600x900 if possible (or any lower resolution - which one doesn't really matter here). In any event, when I switch the resolution using either nvidia-settings or xrandr on the command line, my desktop environment never seems to resize to match it. So even though the new resolution is smaller, the DE panels and whatnot stay sized as if I were still running at 1920x1080 and hence get cut off from view at the lower resolution. It's as if the DE never receives the notification that the resolution has changed.

    I have experienced this in both GNOME 3 and Cinnamon, among other DEs. Is there a workaround for this? I'm running Arch Linux, so I'm on the absolute most cutting-edge versions of everything for the most part. Any pointers or feedback would be most appreciated.



  • cchandel
    replied
    Newer Kernel / xrandr 1.4 / intel driver

    To get this working, you need the following:

    a) kernel 3.9
    b) RandR 1.4
    c) the intel driver set in the xorg config file

    I've got this working - a kernel older than 3.9 is a no-go, as is an X server older than 1.14.

    You don't need to set the Intel GPU to the modesetting driver - that doesn't work.

    I have a writeup at http://www.barunisystems.com/index.p...page?view=blog with more details.
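
    If it helps to verify the RandR 1.4 side, the provider objects the new protocol adds can be listed with a few libXrandr calls - roughly the C equivalent of xrandr --listproviders. This is just a sketch; providers.c is an example name and it assumes the libXrandr 1.4 headers.

    Code:
    /* providers.c - list RandR 1.4 providers and their capabilities.
     * Build: gcc providers.c -o providers -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        Window root = DefaultRootWindow(dpy);
        XRRScreenResources *res = XRRGetScreenResources(dpy, root);
        XRRProviderResources *pr = XRRGetProviderResources(dpy, root);
        if (!res || !pr) { fprintf(stderr, "RandR 1.4 providers not available\n"); return 1; }

        for (int i = 0; i < pr->nproviders; i++) {
            XRRProviderInfo *info = XRRGetProviderInfo(dpy, res, pr->providers[i]);
            /* The capability bits show which PRIME roles each GPU can take. */
            printf("provider %d: %s caps=0x%x%s%s\n", i, info->name, info->capabilities,
                   (info->capabilities & RR_Capability_SourceOutput) ? " SourceOutput" : "",
                   (info->capabilities & RR_Capability_SinkOutput) ? " SinkOutput" : "");
            XRRFreeProviderInfo(info);
        }

        XRRFreeProviderResources(pr);
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }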



  • tuke81
    replied
    Originally posted by dh04000 View Post
    http://www.phoronix.com/scan.php?pag...tem&px=MTI1MTM

    Does this mean 13.04 will have this feature? Does intel support it yet?
    Hmm, Ubuntu 13.04 seems to have Compiz 0.9.9, and this is only just being merged into Compiz 0.9.10...

    Dunno about Intel; Mesa has EGL_EXT_buffer_age, so I presume Intel has that, but not GLX_EXT_buffer_age.
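
    For the compositor side, using the extension is basically one query per frame: ask how many swaps old the back buffer is, then repaint only that much accumulated damage. A sketch, assuming a current GLX context and that GLX_EXT_buffer_age was found in the extension string (back_buffer_age is just an example name):

    Code:
    /* Compositor-side query for GLX_EXT_buffer_age. */
    #include <GL/glx.h>
    #include <GL/glxext.h>

    /* Returns how many swaps old the back buffer is. 0 means its contents are
     * undefined and the whole screen must be repainted; N >= 1 means only the
     * damage of the last N frames needs repainting before the next swap. */
    unsigned int back_buffer_age(Display *dpy, GLXDrawable win)
    {
        unsigned int age = 0;
        glXQueryDrawable(dpy, win, GLX_BACK_BUFFER_AGE_EXT, &age);
        return age;
    }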



  • dh04000
    replied
    Originally posted by tuke81 View Post
    GLX_EXT_buffer_age should fix those problems. It's compositing window managers which have to take advantage of it.


    Does this mean 13.04 will have this feature? Does intel support it yet?



  • elanthis
    replied
    Originally posted by Ericg View Post
    Proper Optimus: Seamless transition back and forth as needed
    Nope. That isn't even possible with OpenGL or D3D. The cards have different capabilities, so you must recreate your device/context and requery extensions/caps to switch. There's no signal in GL to tell an app to do this, and while you could do it in D3D, apps don't.

    This is an issue for web games, since either all pages render on Intel or all render on NVIDIA. There's no API to ask for the low-power or the high-speed device, so even a multi-process architecture like Chrome is stuck.

    Optimus is a stop-gap until GL/D3D offer a real solution in their APIs.

    Optimus is per-app. There's a system list of apps that should use the NVIDIA GPU, which needs updates so new games work. It also takes extra steps to make your own projects use it, since by default any binary you build will only use the Intel GPU.
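
    For reference, the "extra step" for your own builds is the NvOptimusEnablement export that NVIDIA documents for Windows Optimus systems; on Linux at this point the per-app choice goes through Bumblebee's optirun instead. A minimal sketch:

    Code:
    /* Exporting this symbol from the executable asks the Windows Optimus driver
     * to run the app on the discrete NVIDIA GPU by default (per NVIDIA's
     * "Optimus Rendering Policies" document). */
    #ifdef _WIN32
    #include <windows.h>

    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    #endif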

