NVIDIA 450 Linux Beta Driver Quietly Rolls Out With New PRIME Option, Other Improvements


  • Envidioso
    replied
    Originally posted by Phiox View Post
    Does this finally allow using displays connected to the Nvidia GPU while using Render Offloading? This has been an issue for some time (https://forums.developer.nvidia.com/...ad-mode/107046). This is a deal breaker for me since I want my display outputs to, y'know, work 'n stuff.
    Yes, it does! Using it on Arch with optimus-manager in Hybrid mode and the latest Xorg shows the HDMI/DP-connected displays! It's a bit laggy for me when connecting a Thunderbolt display, but it seems to work fine with an HDMI one.

    For laptops that are condemned to use the Nvidia driver on Linux this could be a battery lifesaver: we can run on Intel and use Nvidia only when plugging in external screens on the go, without having to restart Xorg to fully render on Nvidia.



  • IreMinMon
    replied
    Originally posted by mdedetrich View Post
    Have no idea what this is
    https://www.anandtech.com/show/15692...vanced-optimus
    Nvm, apparently it's available on "Super Max-Q" GPUs only. It's basically Nvidia Optimus with a hardware switch that allows the display to be connected directly to either the iGPU or the dGPU, so the dGPU no longer copies its frames into the iGPU's buffer to be sent to the display. This reduces latency and allows G-Sync to be used.

    Originally posted by mdedetrich View Post
    Well, I checked the power usage of my laptop and it's lower when I am not running a heavy GPU application piped through PRIME. If I do run an application through PRIME, then power usage goes up. According to their docs at https://download.nvidia.com/XFree86/...eroffload.html it's meant to work.
    https://www.anandtech.com/show/15692...vanced-optimus
    Again, apparently Nvidia lets manufacturers set a thermal output limit in the BIOS that represents the laptop's cooling capacity. You can then pair a GPU and a CPU that the system couldn't cool when both run at full power, and this new technology apparently shifts power from one to the other to eliminate the bottleneck.

    Originally posted by mdedetrich View Post
    I remember reading this earlier, but it turns out this is not the case. Both the Nvidia docs and the Arch Wiki (see https://wiki.archlinux.org/index.php/NVIDIA_Optimus) say that it actually powers off the GPU. I also saw https://forums.developer.nvidia.com/...-dgpu/113054/2 .

    So I may have been wrong here, although I swear I read somewhere that PRIME on Linux couldn't completely turn off the GPU; that was some time ago.
    Okay, thanks



  • mdedetrich
    replied
    Originally posted by IreMinMon View Post

    Does this new "advanced muxless optimus" thingy work though?
    Have no idea what this is

    Originally posted by IreMinMon View Post
    Also "dynamic power allocation" thingy or whatever it's called?
    Well, I checked the power usage of my laptop and it's lower when I am not running a heavy GPU application piped through PRIME. If I do run an application through PRIME, then power usage goes up. According to their docs at https://download.nvidia.com/XFree86/...eroffload.html it's meant to work.

    PRIME render offload is the ability to have an X screen rendered by one GPU, but choose certain applications within that X screen to be rendered on a different GPU. This is particularly useful in combination with dynamic power management to leave an NVIDIA GPU powered off, except when it is needed to render select performance-sensitive applications.
    Originally posted by IreMinMon View Post
    I can't seem to find official statements from nvidia on this stuff. Also do you have any source on the statement that full dGPU poweroff is unavailable due to kernel limitations?
    I remember reading this earlier, but it turns out this is not the case. Both the Nvidia docs and the Arch Wiki (see https://wiki.archlinux.org/index.php/NVIDIA_Optimus) say that it actually powers off the GPU. I also saw https://forums.developer.nvidia.com/...-dgpu/113054/2 .

    So I may have been wrong here, although I swear I read somewhere that PRIME on Linux couldn't completely turn off the GPU; that was some time ago.
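    As an aside, one way to check whether the kernel has actually powered the dGPU down is the runtime power-management status exposed in sysfs (a sketch based on the Arch Wiki's runtime-PM notes; the sysfs layout is assumed to be the standard one):

```shell
# Print the runtime power-management status of each NVIDIA PCI device
# (PCI vendor ID 0x10de). Typically shows "suspended" when the GPU has
# been powered off and "active" while something is using it.
for dev in /sys/bus/pci/devices/*; do
    [ "$(cat "$dev/vendor" 2>/dev/null)" = "0x10de" ] || continue
    echo "$dev: $(cat "$dev/power/runtime_status")"
done
```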
    Last edited by mdedetrich; 06-08-2020, 10:56 AM.



  • IreMinMon
    replied
    Originally posted by mdedetrich View Post

    I have been using Nvidia PRIME render offload and it does work as advertised; there are some caveats, though:

    1. Nvidia PRIME render offload cannot turn the GPU off completely like it can on Windows; however, this is apparently due to a Linux kernel limitation. Note that even though PRIME render offload doesn't completely turn off the GPU, it still uses a lot less power when idle (~5 watts), so it's not pointless: it achieves its goal of using far less power than keeping both GPUs constantly on.
    2. You have to explicitly specify which applications should use the GPU by setting an environment variable when you launch them. In other words it's manual, unlike on Windows (although Bumblebee was exactly the same). I think this may be more of a Linux problem than an Nvidia problem, because afaik there is no way on Linux to ask "is this application demanding a GPU?". It might make sense for freedesktop to add a flag to `.desktop` files specifying whether an application is "GPU demanding" and handle it that way?

    Also, GPU video acceleration (VDPAU) works fine; again, you just need to enable PRIME render offload when running applications like Chrome/Firefox/Brave/VLC etc.
    Does this new "advanced muxless optimus" thingy work though? Also "dynamic power allocation" thingy or whatever it's called?
    I can't seem to find official statements from Nvidia on this stuff. Also, do you have any source on the statement that full dGPU power-off is unavailable due to kernel limitations?



  • Phiox
    replied
    Does this finally allow using displays connected to the Nvidia GPU while using Render Offloading? This has been an issue for some time (https://forums.developer.nvidia.com/...ad-mode/107046). This is a deal breaker for me since I want my display outputs to, y'know, work 'n stuff.



  • starshipeleven
    replied
    aaand there's an unapproved post from abott above this



  • starshipeleven
    replied
    Originally posted by abott View Post
    I've not tried to run Linux on it as Nvidia was always terrible when I demo'd it. Does their offloading work with DRI now? Is it good enough to actually make my laptop usable with Linux? I'd love to move off of Win10 on my laptop, but Nvidia power consumption was always the blocker.
    On my work laptop I don't really need the crappy Nvidia "dedicated" GPU anyway (and I say crappy because it's a 930MX, which is indeed trash, barely better than an Intel iGPU), so I just disable it in the UEFI setup.



  • sandy8925
    replied
    Originally posted by dammarin View Post

    Oh yes, please. I _almost_ got an AMD card to play HL: Alyx the other week; luckily I got it working almost perfectly on my 980 Ti, even if on seriously reduced settings.

    Unlike many, my experiences with Nvidia have been mostly positive. I realise they don't play nice with the open source community but they provide solid support (for things they decide they want to provide support for). If there's a problem and they fix it, I just install a newer driver and that's that.

    Unlike AMD, where I have to upgrade half the system and hope that a newer kernel and/or Mesa and whatever else is the combination that'll work. I have an AMD-based machine from 7 years ago that's still unusable on Linux because of constant GPU crashes. And just last week I got a lappy with a Ryzen 3500U, installed Mint 19.3 on it, only to get almost 100% reproducible hangs on logouts and shutdowns. The Ryzen came out at the start of 2019, kernel 5.3 came out in September, and the support still wasn't there. I had to manually install 5.6 and for now it looks fixed. Anyway, that's AMD for you.
    Yeah, AMD has been problematic for me too. I'm looking forward to the Intel Xe dGPU; hopefully it will be great and I can finally ditch AMD GPUs.



  • sandy8925
    replied
    Originally posted by vb_linux View Post

    Genuine question: does AMD have PRIME-like functionality, since it's integrated and all?
    Dunno, but I did run Warhammer 2 on the AMD GPU while the desktop was rendered by the Intel GPU, using the Vulkan device-selection dropdown in Warhammer's launcher. I don't know if it uses PRIME or not, but it worked great.
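    For what it's worth, Mesa's equivalent of this is the documented `DRI_PRIME` environment variable, which selects the non-default (usually discrete) GPU per application; a minimal wrapper (the `dri_prime_run` name is just a local convention) could look like:

```shell
# dri_prime_run: launch a program on the secondary GPU via Mesa's PRIME offload.
# DRI_PRIME=1 is Mesa's documented GPU-selection switch; no Xorg restart needed.
dri_prime_run() {
    DRI_PRIME=1 "$@"
}

# Example: compare renderers (glxinfo comes from mesa-utils):
# glxinfo | grep "OpenGL renderer"          # default GPU
# dri_prime_run glxinfo | grep "OpenGL renderer"   # offload GPU
```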



  • sandy8925
    replied
    Originally posted by shmerl View Post
    Nvidia catching up on PRIME support, decades late. The benefits of the blob.
    Wayland support will come when Wayland has been replaced with something else. At that point, Nvidia fans will say: "Why use the new thing instead of Wayland? Wayland is great! It has network transparency, screen sharing, etc., while the new thing doesn't. The new thing will never take off. We Nvidia users (a tiny minority) must prevent good things for everyone else."

