NVIDIA 435.17 Linux Beta Driver Adds Vulkan + OpenGL PRIME Render Offload

  • Vash63
    Senior Member
    • May 2008
    • 206

    #11
    Originally posted by LinAGKar View Post
    It's about damn time. It's been seven years since Linus gave them the finger.
    Wasn't that in the context of Tegra though, which has actually been in good shape for a long time now?

    Comment

    • Floturcocantsee
      Junior Member
      • Jun 2018
      • 13

      #12
      Just got it up and running on my XPS 15 9570 running Pop!_OS 19.04. Works flawlessly; the driver is even smart enough to control the power state, so you don't need to fiddle around with bbswitch to manually unload the card when it's not in use.
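      If anyone wants to verify that the dGPU really gets powered down when idle, the kernel's runtime PM status can be read from sysfs (a sketch; the PCI address 0000:01:00.0 is an assumption, check yours with lspci):

      Code:
      # Find the NVIDIA GPU's PCI address first (assumed to be 01:00.0 below).
      lspci | grep -i nvidia
      # "suspended" means runtime PM has powered the card down; "active" means it's on.
      cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status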

      Comment

      • darinmiller
        Junior Member
        • Mar 2009
        • 23

        #13
        Has anyone run a native Steam game with this driver?

        Both of the examples run great on 19.10 using the NVidia On-Demand profile:

        Code:
        __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only vkcube
        __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
        But adding "__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia" to the launch params of CS:Source only runs on the Intel GPU (30 fps = Intel, 100 fps = NVIDIA 960 at 4K).

        Comment

        • DebianLinuxero
          Senior Member
          • Mar 2012
          • 207

          #14
          This PRIME offloading is about using one GPU for display but having the actual rendering be done on a secondary GPU, as is common with many of today's high-end notebooks that have Intel integrated graphics paired with a discrete NVIDIA GPU.
          I wonder if this will work too on my laptop with AMD Vega 8 (Ryzen 2500U) and Geforce GTX 1050.
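          It's probably worth checking whether both GPUs show up as providers first (a sketch; the provider names printed depend on the drivers in use):

          Code:
          # On a hybrid laptop you'd expect two entries here, e.g. the iGPU
          # ("modesetting" or the AMD driver) plus the NVIDIA GPU.
          xrandr --listproviders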

          Comment

          • Britoid
            Senior Member
            • Jul 2013
            • 2175

            #15
            Originally posted by LinAGKar View Post
            It's about damn time. It's been seven years since Linus gave them the finger.

            In that case you'll have to run X on the Nvidia GPU, and use display offload onto the Intel GPU for the internal display (reverse PRIME).
            Didn't recent patches by Red Hat help automatically set this up?
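            For reference, the classic output/display-offload setup is driven through xrandr (a sketch; the provider names modesetting and NVIDIA-0 are typical but machine-dependent, see xrandr --listproviders):

            Code:
            # Render on the NVIDIA GPU and scan out through the iGPU's outputs.
            xrandr --setprovideroutputsource modesetting NVIDIA-0
            xrandr --auto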

            Comment

            • royce
              Senior Member
              • Aug 2018
              • 660

              #16
              Originally posted by Floturcocantsee View Post
              Just got it up and running on my XPS 15 9570 running Pop!_OS 19.04. Works flawlessly; the driver is even smart enough to control the power state, so you don't need to fiddle around with bbswitch to manually unload the card when it's not in use.
              Where did you grab the package from? And how about the patched xorg?

              Edit: found them
              This PPA contains versions of the X.Org X server built with the patches needed to use PRIME render offload on the NVIDIA driver. See https://people.freedesktop.org/~aplattner/nvidia-readme/primerenderoffload.html for more details.

              Fresh drivers from upstream, currently shipping Nvidia.
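              On Ubuntu-based distros the usual dance applies (a sketch; the PPA and package names below are assumptions, use the ones from the links above):

              Code:
              # Hypothetical names for illustration; check the PPAs linked above.
              sudo add-apt-repository ppa:graphics-drivers/ppa
              sudo apt update
              sudo apt install nvidia-driver-435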
              Last edited by royce; 14 August 2019, 06:07 AM.

              Comment

              • kmare
                Senior Member
                • Oct 2013
                • 188

                #17
                Originally posted by Floturcocantsee View Post
                Just got it up and running on my XPS 15 9570 running Pop!_OS 19.04. Works flawlessly; the driver is even smart enough to control the power state, so you don't need to fiddle around with bbswitch to manually unload the card when it's not in use.
                Thank you for reporting back! Are you sure the nvidia gfx is actually turned off when not in use? I just read a comment here:
                As per aplattner’s suggestion from https://devtalk.nvidia.com/default/topic/957814/linux/prime-and-prime-synchronization/post/4953175/#4953175, I’ve taken the liberty of opening a new thread for discussing the PRIME GPU render offload feature on Optimus-based hardware. As you may know, Nvidia’s current official support only allows GPU “Output” instead of GPU “Offload”, which may be unsatisfactory as it translates into higher power consumption and heat production in laptops. I did suggest in the PRIME and P...

                and apparently for pre-Turing GPUs it won't turn off, which basically makes it kind of pointless for such laptops. Can you confirm this is the case?

                PS: if anyone knows about a COPR for fedora, please share

                Thanks!

                Comment

                • Floturcocantsee
                  Junior Member
                  • Jun 2018
                  • 13

                  #18
                  Originally posted by kmare View Post

                  Thank you for reporting back! Are you sure the nvidia gfx is actually turned off when not in use? I just read a comment here:
                  As per aplattner’s suggestion from https://devtalk.nvidia.com/default/topic/957814/linux/prime-and-prime-synchronization/post/4953175/#4953175, I’ve taken the liberty of opening a new thread for discussing the PRIME GPU render offload feature on Optimus-based hardware. As you may know, Nvidia’s current official support only allows GPU “Output” instead of GPU “Offload”, which may be unsatisfactory as it translates into higher power consumption and heat production in laptops. I did suggest in the PRIME and P...

                  and apparently for pre-Turing GPUs it won't turn off, which basically makes it kind of pointless for such laptops. Can you confirm this is the case?

                  PS: if anyone knows about a COPR for fedora, please share

                  Thanks!
                  Currently it will put the GPU into P5 (2 W idle, with GPU decoding on demand) if it's not Turing. Still, on the 97 Wh XPS 15 that's the difference between 7-8 hours and 6-7 at worst.
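                  You can watch the performance state yourself with nvidia-smi (a sketch; the query fields exist in current drivers but the output format may vary):

                  Code:
                  # P-state (P0 = max performance, higher numbers = deeper idle) and current draw.
                  nvidia-smi --query-gpu=pstate,power.draw --format=csv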

                  Comment

                  • scorpionita
                    Junior Member
                    • Aug 2019
                    • 2

                    #19
                    Originally posted by kmare View Post

                    Thank you for reporting back! Are you sure the nvidia gfx is actually turned off when not in use? I just read a comment here:
                    As per aplattner’s suggestion from https://devtalk.nvidia.com/default/topic/957814/linux/prime-and-prime-synchronization/post/4953175/#4953175, I’ve taken the liberty of opening a new thread for discussing the PRIME GPU render offload feature on Optimus-based hardware. As you may know, Nvidia’s current official support only allows GPU “Output” instead of GPU “Offload”, which may be unsatisfactory as it translates into higher power consumption and heat production in laptops. I did suggest in the PRIME and P...

                    and apparently for pre-Turing GPUs it won't turn off, which basically makes it kind of pointless for such laptops. Can you confirm this is the case?

                    PS: if anyone knows about a COPR for fedora, please share

                    Thanks!
                    COPR work in progress: https://copr.fedorainfracloud.org/co..._nvidia_prime/
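                    Once it's up, enabling a COPR on Fedora is a one-liner (a sketch; <user>/<repo> is a placeholder, take the real slug from the URL above):

                    Code:
                    # Requires dnf-plugins-core; <user>/<repo> is a placeholder for the actual COPR slug.
                    sudo dnf copr enable <user>/<repo>
                    sudo dnf update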

                    Comment

                    • scorpionita
                      Junior Member
                      • Aug 2019
                      • 2

                      #20
                      Originally posted by darinmiller View Post
                      Has anyone run a native Steam game with this driver?

                      Both of the examples run great on 19.10 using the NVidia On-Demand profile:

                      Code:
                      __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only vkcube
                      __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
                      But adding "__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia" to the launch params of CS:Source only runs on the Intel GPU (30 fps = Intel, 100 fps = NVIDIA 960 at 4K).
                      Tested on Fedora 30 (COPR URL in my previous post, still in review) and it works well, Steam included.

                      PS: to make it work with Steam, you need to add "__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%" to the game's launch options, as shown below.
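                      That is, paste this into the launch options field in the game's Properties:

                      Code:
                      __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%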

                      Comment
