NVIDIA 450 Linux Beta Driver Quietly Rolls Out With New PRIME Option, Other Improvements


  • #11
    Originally posted by shmerl View Post

    Should we care how? Nvidia couldn't implement PRIME for ages, since they refused to upstream their driver. Their reasons are not something I really care about. A blob is a blob, and a lack of integration with upstream is entirely expected from it.
    Genuine question. Does AMD have PRIME-like functionality, since it is integrated and all?

    Comment


    • #12
      Originally posted by vb_linux View Post

      Genuine question. Does AMD have PRIME-like functionality, since it is integrated and all?
      AMD supports PRIME, same as Intel, because their drivers are upstreamed and PRIME is a standard kernel feature.

      Comment


      • #13
        Originally posted by spykes View Post
        Hopefully this driver series will also add the Vulkan bits needed for Valve to support async reprojection 🙏
        Oh yes, please. I _almost_ got an AMD card to play HL: Alyx the other week; luckily I got it working almost perfectly on my 980 Ti, even if at seriously reduced settings.

        Unlike many, my experiences with Nvidia have been mostly positive. I realise they don't play nice with the open-source community, but they provide solid support (for the things they decide they want to support). If there's a problem and they fix it, I just install a newer driver and that's that.

        Unlike AMD, where I have to upgrade half the system and hope that a newer kernel and/or Mesa and whatever else is the combination that'll work. I have an AMD-based machine from 7 years ago that's still unusable on Linux because of constant GPU crashes. And just last week I got a lappy with a Ryzen 3500U, installed Mint 19.3 on it, only to get almost 100% reproducible hangs on logout and shutdown. The Ryzen came out at the start of 2019, kernel 5.3 came out in September, and the support still wasn't there. I had to manually install 5.6, and for now it looks fixed. Anyway, that's AMD for you.

        Comment


        • #14
          Originally posted by shmerl View Post
          Nvidia catching up on PRIME support, decades late. The benefits of the blob.
          Wayland support will come when Wayland has been replaced with something else. At that point, NVIDIA fans will say: "Why use the new thing instead of Wayland? Wayland is great! It has network transparency, screen sharing, etc., while the new thing doesn't. The new thing will never take off. We NVIDIA users (a tiny minority) must prevent good things for everyone else."

          Comment


          • #15
            Originally posted by vb_linux View Post

            Genuine question. Does AMD have PRIME-like functionality, since it is integrated and all?
            Dunno, but I did run Warhammer 2 off the AMD GPU while the desktop was rendered on the Intel GPU, by using the Vulkan device-selection dropdown in Warhammer's launcher. I don't know whether it uses PRIME or not, but it worked great.

            Comment


            • #16
              Originally posted by dammarin View Post

              Oh yes, please. I _almost_ got an AMD card to play HL: Alyx the other week; luckily I got it working almost perfectly on my 980 Ti, even if at seriously reduced settings.

              Unlike many, my experiences with Nvidia have been mostly positive. I realise they don't play nice with the open-source community, but they provide solid support (for the things they decide they want to support). If there's a problem and they fix it, I just install a newer driver and that's that.

              Unlike AMD, where I have to upgrade half the system and hope that a newer kernel and/or Mesa and whatever else is the combination that'll work. I have an AMD-based machine from 7 years ago that's still unusable on Linux because of constant GPU crashes. And just last week I got a lappy with a Ryzen 3500U, installed Mint 19.3 on it, only to get almost 100% reproducible hangs on logout and shutdown. The Ryzen came out at the start of 2019, kernel 5.3 came out in September, and the support still wasn't there. I had to manually install 5.6, and for now it looks fixed. Anyway, that's AMD for you.
              Yeah, AMD has been problematic for me too. I'm looking forward to the Intel Xe dGPUs; hopefully they will be great and I can finally ditch my AMD GPU.

              Comment


              • #17
                Originally posted by abott View Post
                I've not tried to run Linux on it as Nvidia was always terrible when I demo'd it. Does their offloading work with DRI now? Is it good enough to actually make my laptop usable with Linux? I'd love to move off of Win10 on my laptop, but Nvidia power consumption was always the blocker.
                On my work laptop I don't really need a crappy NVIDIA "dedicated" GPU anyway (and I say crappy because it's a 930MX, which is indeed trash, barely better than the Intel iGPU), so I just disable it in the UEFI setup.

                Comment


                • #18
                  aaand unapproved post for abott above this

                  Comment


                  • #19
                    Does this finally allow using displays connected to the NVIDIA GPU while using Render Offload? This has been an issue for some time (https://forums.developer.nvidia.com/...ad-mode/107046). It's a deal-breaker for me, since I want my display outputs to, y'know, work 'n stuff.

                    Comment


                    • #20
                      Originally posted by mdedetrich View Post

                      I have been using NVIDIA PRIME Render Offload and it does work as advertised; there are some caveats, though:

                      1. NVIDIA PRIME Render Offload cannot turn off the GPU completely like it can on Windows; however, this is apparently due to a Linux kernel limitation. Note that even though PRIME Render Offload doesn't completely turn off the GPU, it still draws a lot less power when it's not being used (~5 watts), so it's not pointless (it achieves its goal of using a lot less power compared to having both GPUs constantly on).
                      2. You have to explicitly specify which applications should use the GPU, by setting an environment variable as you launch the application. In other words, it's manual, unlike on Windows (although Bumblebee was exactly the same). I think this may be more of a Linux problem than an NVIDIA problem, because AFAIK there is no way on Linux to figure out "is this application demanding a GPU". It might make sense for freedesktop to add a flag to `.desktop` files that specifies whether an application is "GPU-demanding" and handle it that way?

                      Also, GPU video acceleration (VDPAU) works fine; again, you just need to enable PRIME Render Offload when running applications like Chrome/Firefox/Brave/VLC etc.
                      Does this new "advanced muxless Optimus" thingy work, though? And the "dynamic power allocation" thingy, or whatever it's called?
                      I can't seem to find official statements from NVIDIA on this stuff. Also, do you have any source for the statement that full dGPU power-off is unavailable due to kernel limitations?
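                      As an aside, something close to the `.desktop` flag wished for above already exists: if I recall correctly, the freedesktop Desktop Entry Specification (since v1.4) defines a `PrefersNonDefaultGPU` key that desktops such as GNOME can use to launch an app on the discrete GPU. A sketch of a hypothetical entry (whether your desktop actually honours the key is another matter):

```ini
[Desktop Entry]
Type=Application
Name=Some Demanding Game
Exec=some-game
# Hint that this app should run on the non-default (discrete) GPU
PrefersNonDefaultGPU=true
```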
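                      For anyone wondering how the per-application selection in point 2 looks in practice, it's just two environment variables, per NVIDIA's PRIME render offload documentation (a minimal sketch; assumes driver 435+, a suitably configured X server, and that glxgears/vkcube are installed):

```shell
# Render this one GLX application on the NVIDIA dGPU;
# everything else stays on the iGPU.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears

# Vulkan applications only need the first variable.
__NV_PRIME_RENDER_OFFLOAD=1 vkcube
```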

                      Comment
