NVIDIA Has Major New Linux Driver: Optimus, RandR 1.4

  • #41
    Originally posted by LLStarks View Post
    I'm pretty close to getting "Optimus" working. Unfortunately, my Nvidia card isn't wired to HDMI or any output and the LVDS doesn't want to play nice.

    I'm getting a blank, backlit screen. X will start without complaining, run programs invisibly and glxinfo will report Nvidia, but there's nothing on the LVDS or HDMI.


    Providers for Nvidia and modesetting (or Intel, I've tested both) report properly.
    Same here. If you make any progress, it would be great if you could send me a PM to point me in the right direction. Btw, I wonder if this error in Xorg.0.log could be related:
    [ 2791.538] (EE) Screen 1 deleted because of no matching config section.
    [ 2791.538] (II) UnloadModule: "intel"
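
    For anyone stuck at the same blank-but-backlit screen: per NVIDIA's RandR 1.4 instructions, after X starts you still have to route the NVIDIA-rendered screen out through the integrated GPU's provider. A rough command sequence, to be run inside the affected X session (the provider names `modesetting` and `NVIDIA-0` are typical examples, not guaranteed; use whatever `--listproviders` actually reports on your machine):

    ```shell
    # List render/display providers; both the NVIDIA provider and the
    # Intel/modesetting provider should show up here.
    xrandr --listproviders

    # Scan out the NVIDIA-rendered screen through the integrated GPU's
    # outputs, then (re)enable the outputs.
    xrandr --setprovideroutputsource modesetting NVIDIA-0
    xrandr --auto
    ```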



    • #42
      Originally posted by zzippy View Post
      Same here. If you make any progress, it would be great if you could send me a PM to point me in the right direction. Btw, I wonder if this error in Xorg.0.log could be related:
      [ 2791.538] (EE) Screen 1 deleted because of no matching config section.
      [ 2791.538] (II) UnloadModule: "intel"
      Did you configure your X as suggested in the readme:
      http://us.download.nvidia.com/XFree8...E/randr14.html


      You will need an xorg modesetting driver for KMS (in Ubuntu it's named xserver-xorg-video-modesetting).
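
      For reference, the xorg.conf layout that readme describes looks roughly like this (the BusID is machine-specific, so this exact value is only a placeholder; find yours with lspci):

      ```
      Section "ServerLayout"
          Identifier "layout"
          Screen 0 "nvidia"
          Inactive "intel"
      EndSection

      Section "Device"
          Identifier "nvidia"
          Driver "nvidia"
          BusID "PCI:1:0:0"
      EndSection

      Section "Screen"
          Identifier "nvidia"
          Device "nvidia"
      EndSection

      Section "Device"
          Identifier "intel"
          Driver "modesetting"
      EndSection

      Section "Screen"
          Identifier "intel"
          Device "intel"
      EndSection
      ```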



      • #43
        It works - but with some tearing

        Hi,
        I managed to get the NVidia GPU to render through the Intel GPU to the laptop screen, as well as over HDMI to an external monitor.

        Process detailed at http://www.barunisystems.com/index.p...page?view=blog
        Regards,
        Chaitanya



        • #44
          Originally posted by tuke81 View Post
          Did you configure your X as suggested in the readme:
          http://us.download.nvidia.com/XFree8...E/randr14.html
          Yep, if you mean the xorg.conf file. If you mean the kernel CONFIG_DRM parameters (I admit that's beyond my knowledge): I don't know. Using Ubuntu mainline 3.9 rc6. Could this be the point?
          Originally posted by tuke81 View Post
          You will need an xorg modesetting driver for KMS (in Ubuntu it's named xserver-xorg-video-modesetting).
          I thought Intel >2.21.5 should also work? I started with the modesetting driver; same result.



          • #45
            Is the tearing issue fixable by Nvidia or Intel via driver changes or is this something that the technology just has to live with?



            • #46
              Originally posted by dh04000 View Post
              Is the tearing issue fixable by Nvidia or Intel via driver changes or is this something that the technology just has to live with?
              GLX_EXT_buffer_age should fix those problems. It's the compositing window managers that have to take advantage of it.
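
              For the curious, the extension is simple to use: after each swap the compositor asks how many swaps old the back buffer's contents are, and repaints only the regions damaged since then instead of the whole frame. A sketch in C (the repaint helpers are stand-in stubs, setup and the extension-string check are omitted):

              ```c
              #include <stdio.h>
              #include <GL/glx.h>

              /* Token from the GLX_EXT_buffer_age extension. */
              #ifndef GLX_BACK_BUFFER_AGE_EXT
              #define GLX_BACK_BUFFER_AGE_EXT 0x20F4
              #endif

              /* Hypothetical stand-ins for a compositor's repaint paths. */
              static void repaint_everything(void) { puts("full repaint"); }
              static void repaint_damage(unsigned int age)
              {
                  printf("repaint damage accumulated over %u swaps\n", age);
              }

              void compositor_repaint(Display *dpy, GLXDrawable win)
              {
                  unsigned int age = 0;
                  /* Age of the back buffer in swaps; 0 means its contents
                   * are undefined and everything must be redrawn. */
                  glXQueryDrawable(dpy, win, GLX_BACK_BUFFER_AGE_EXT, &age);
                  if (age == 0)
                      repaint_everything();
                  else
                      repaint_damage(age);
                  glXSwapBuffers(dpy, win);
              }
              ```

              Since only the changed region is redrawn each frame, the full-frame copies that cause tearing largely go away.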



              • #47
                Originally posted by Ericg View Post
                Proper Optimus: Seamless transition back and forth as needed
                Nope. That isn't even possible with OpenGL or D3D. The cards have different capabilities; you must recreate your device/context and requery extensions/caps to switch. There's no signal in GL to tell an app to do this, and while you could do it in D3D, they don't.

                This is an issue for web games, since either all pages render on Intel or all render on NVIDIA. There's no API to ask for the low-power or high-speed device, so even a multi-process architecture like Chrome's is stuck.

                Optimus is a stop-gap until GL/D3D offers a real solution in its API.

                Optimus is per-app. There's a system list of apps which use the NVIDIA GPU, requiring updates so new games work. It also requires extra steps to make your own projects use it, as by default any binary you build will only use Intel's GPU.



                • #48
                  Originally posted by tuke81 View Post
                  GLX_EXT_buffer_age should fix those problems. It's the compositing window managers that have to take advantage of it.
                  http://www.phoronix.com/scan.php?pag...tem&px=MTI1MTM


                  Does this mean 13.04 will have this feature? Does intel support it yet?



                  • #49
                    Originally posted by dh04000 View Post
                    http://www.phoronix.com/scan.php?pag...tem&px=MTI1MTM

                    Does this mean 13.04 will have this feature? Does intel support it yet?
                    Hmm, Ubuntu 13.04 seems to have Compiz 0.9.9, and this is only being merged into Compiz 0.9.10...

                    Dunno about Intel; Mesa has EGL_EXT_buffer_age, so I presume Intel has it, but not GLX_EXT_buffer_age.



                    • #50
                      Newer Kernel / xrandr 1.4 / intel driver

                      To get this working you need the following:

                      a) Kernel 3.9
                      b) RandR 1.4
                      c) The intel driver set in the xorg.conf file

                      I've got this working, and using a kernel < 3.9 is a no-go, as is an X server < 1.14.

                      You don't need to set the Intel GPU to modesetting; that doesn't work.

                      I have a writeup at http://www.barunisystems.com/index.p...page?view=blog for more details.
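
                      A quick sanity check for requirement (a); the 3.9 threshold comes from the list above, and the comparison leans on `sort -V`. For (b) and (c) you'd confirm on the live session with `xrandr --listproviders` and `Xorg -version`:

                      ```shell
                      # Compare the running kernel against the minimum 3.9.
                      kernel="$(uname -r)"
                      if [ "$(printf '%s\n' 3.9 "$kernel" | sort -V | head -n1)" = "3.9" ]; then
                          echo "kernel ok: $kernel"
                      else
                          echo "kernel too old: $kernel (need >= 3.9)"
                      fi
                      ```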

