NVIDIA Has Major New Linux Driver: Optimus, RandR 1.4

  • #16
    Originally posted by tuke81
    Yep, it sure does, among other things:
    http://us.download.nvidia.com/XFree8...Supportedd38df
    Wouldn't doubt that this was a request put in for Valve. It might also mean better SLI support on Linux in the future, now that the first steps of per-application optimization have been implemented.



    • #17
      FYI, it looks like there may be a VDPAU bug with XBMC in these drivers (at least with a GT-520 card).



      • #18
        Originally posted by deanjo
        FYI, it looks like there may be a VDPAU bug with XBMC in these drivers (at least with a GT-520 card).
        What problem are you having?



        • #19
          Originally posted by johnc
          What problem are you having?
          Green screen on VDPAU-decoded material. I've got to investigate a bit more, but I have a feeling it's an HQ scaler issue.



          • #20
            Originally posted by DanL
            The Intel GPU will still be active (it has to be, because of the nature of Optimus), so I would expect slightly less battery life. However, this is definitely a great option when you're hooked to AC.
            It's not 'slightly less', it's 'a lot less'.

            In some notebooks, Bumblebee can successfully turn off the Nvidia chip while the Intel core does all the graphics rendering, so the discrete core never actually starts up until explicitly told to do so with optirun.

            With Nvidia's driver, both are running at the same time, and I think you underestimate the power drain of the Nvidia chip.



            • #21
              So I've got it mostly working with my Optimus card. I can get output through HDMI (wired into the Nvidia card), but I'm getting this error when running "xrandr --setprovideroutputsource modesetting NVIDIA-0":
              Code:
              X Error of failed request:  BadValue (integer parameter out of range for operation)
                Major opcode of failed request:  140 (RANDR)
                Minor opcode of failed request:  35 ()
                Value in failed request:  0x2c3
                Serial number of failed request:  16
                Current serial number in output stream:  17
              So I can't use the laptop screen or the VGA port.
              My "xrandr --listproviders" output looks like:
              Code:
              Providers: number : 2
              Provider 0: id: 0x2c3 cap: 0x0 crtcs: 2 outputs: 1 associated providers: 0 name:NVIDIA-0
              Provider 1: id: 0x45 cap: 0x2, Sink Output crtcs: 2 outputs: 4 associated providers: 0 name:modesetting
              Any ideas?
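For context, the usual RandR 1.4 sequence for this kind of setup is sketched below. This is only an illustration: the provider names `modesetting` and `NVIDIA-0` are taken from the poster's `xrandr --listproviders` output and vary between machines and driver versions, and a running X server is required, so the calls are guarded.

```shell
# Sketch of the RandR 1.4 "output source" setup for Optimus.
# Provider names come from `xrandr --listproviders` and vary per machine;
# an X display is required, so the calls are guarded.
if command -v xrandr >/dev/null 2>&1 && [ -n "$DISPLAY" ]; then
    xrandr --listproviders
    # Route the NVIDIA GPU's rendering to the modesetting provider's outputs:
    xrandr --setprovideroutputsource modesetting NVIDIA-0
    xrandr --auto   # enable the newly available outputs
else
    echo "no X display available; commands shown for reference only"
fi
```

Note the argument order: the sink provider that owns the outputs (`modesetting`) comes first, and the render source (`NVIDIA-0`) second.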



              • #22
                Will nVidia enable overclocking on Linux? Any news on this?



                • #23
                  Anybody notice how much smaller this thing is?

                  Index of /XFree86/Linux-x86_64/313.30/

                  Name Size Date Modified
                  [parent directory]
                  NVIDIA-Linux-x86_64-313.30-no-compat32.run 37.9 MB 3/28/13 11:35:00 PM
                  NVIDIA-Linux-x86_64-313.30.run 65.2 MB 3/28/13 11:35:00 PM
                  Index of /XFree86/Linux-x86_64/319.12/

                  Name Size Date Modified
                  [parent directory]
                  NVIDIA-Linux-x86_64-319.12-no-compat32.run 26.0 MB 4/4/13 10:31:00 PM
                  NVIDIA-Linux-x86_64-319.12.run 46.3 MB 4/4/13 10:31:00 PM



                  • #24
                    Originally posted by ArchLinux
                    Anybody notice how much smaller this thing is?
                    Well, this answers my question. Nobody read the changelog.

                    Among other cool bits listed in it (you can finally add new resolutions with the xrandr CLI, nvidia-settings lists non-native resolutions, ...), there's an item saying they've switched from gzip to xz for compressing the installer.



                    • #25
                      Originally posted by ArchLinux
                      Anybody notice how much smaller this thing is?
                      From here
                      Switched .run package compression from gzip to xz. This provides a higher level of compression.
                      This seems to be the reason.
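The size drop is consistent with that changelog item: xz generally compresses the same input considerably tighter than gzip. A quick illustration (the scratch file below is hypothetical throwaway data, not an NVIDIA package):

```shell
# Quick illustration: xz typically achieves a higher compression ratio than
# gzip on the same input, consistent with the smaller .run installers.
# sample.txt is a throwaway scratch file, not an NVIDIA package.
seq 1 200000 > sample.txt
gzip -9 < sample.txt > sample.txt.gz
xz   -9 < sample.txt > sample.txt.xz
ls -l sample.txt sample.txt.gz sample.txt.xz
```

On compressible data like this, the .xz file comes out noticeably smaller than the .gz file, at the cost of more CPU time during compression.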



                      • #26
                        Originally posted by schmidtbag
                        While optimus is a feature advertised by nvidia hardware (and therefore should be available to all customers), it's more of a privilege than anything.
                        Optimus has been advertised as a feature for Windows 7 specifically: http://www.nvidia.com/object/optimus_technology.html
                        "* Optimus requires Windows 7 or later"
                        XP/Vista users don't get that feature either (because the OS lacks support), so I'm not sure why Linux users felt they were entitled to the functionality in the first place. Nvidia tried to offer some kernel patches to enable Optimus in the past, but these were not accepted. Now they are using an alternative approach based on a feature in RandR 1.4 (which wasn't available at the time Optimus was introduced).



                        • #27
                          Optimus support makes Nvidia more appealing to Linux laptop gamers.

                          I've avoided Nvidia laptops due to Optimus not working on Linux.
                          So Optimus coming to Linux is great.
                          Though, when I buy my next laptop, it won't have an Nvidia card anyway, because it draws more power than an Intel or AMD CPU with an integrated GPU.

                          I would like to see some open source commitment from Nvidia.
                          I would also like to see EGL, OpenGL ES, and Wayland support.



                          • #28
                            Originally posted by leonmaxx
                            Will nVidia enable overclocking on Linux? Any news on this?
                            Linux has been able to overclock Nvidia cards for a while now... IIRC, even the open-source drivers can do it at this point. I think you're also able to change fan speeds. I'm not sure if all GPUs are supported for either ability. I'm pretty sure most ATI and AMD cards can be overclocked too. Intel was the only one that lacked overclocking support until very recently.



                            • #29
                              Originally posted by schmidtbag
                              Linux has been able to overclock Nvidia cards for a while now...
                              You mean only cards older than GTX4xx, right?



                              • #30
                                Has anyone here tried it? Setting up the new driver with the open-source Intel one and seeing what does and does not work?

