NVIDIA 450 Linux Beta Driver Quietly Rolls Out With New PRIME Option, Other Improvements

  • #21
    Originally posted by IreMinMon View Post

    Does this new "advanced muxless optimus" thingy work though?
    Have no idea what this is

    Originally posted by IreMinMon View Post
    Also "dynamic power allocation" thingy or whatever it's called?
    Well, I checked the power usage of my laptop and it's lower when I'm not running a heavy GPU application piped through PRIME. If I do run an application through PRIME, then power use goes up. According to their docs at https://download.nvidia.com/XFree86/...eroffload.html it's meant to work:

    PRIME render offload is the ability to have an X screen rendered by one GPU, but choose certain applications within that X screen to be rendered on a different GPU. This is particularly useful in combination with dynamic power management to leave an NVIDIA GPU powered off, except when it is needed to render select performance-sensitive applications.
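
    For anyone wanting to reproduce that comparison, a minimal sketch of how the offload is invoked per those docs (assuming the 450 driver is already configured for render offload and glxinfo from mesa-utils is installed; everything else keeps rendering on the iGPU):

        # Default: rendered by the iGPU
        glxinfo | grep "OpenGL renderer"

        # Offload just this one application to the NVIDIA dGPU
        __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"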
    Originally posted by IreMinMon View Post
    I can't seem to find official statements from nvidia on this stuff. Also do you have any source on the statement that full dGPU poweroff is unavailable due to kernel limitations?
    I remember reading this earlier, but it turns out this is not the case. Both the NVIDIA docs and the Arch Wiki (see https://wiki.archlinux.org/index.php/NVIDIA_Optimus) say that it actually powers off the GPU. I also saw https://forums.developer.nvidia.com/...-dgpu/113054/2.

    So I may have been wrong here, although I swear I read somewhere that PRIME on Linux couldn't completely turn off the GPU; that was some time ago though.
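
    If anyone wants to check it on their own machine, the Arch Wiki page above reads the runtime power management state straight from sysfs; roughly like this (the PCI address 01:00.0 is just an example, yours may differ):

        # Find the NVIDIA dGPU's PCI address (10de is NVIDIA's vendor ID)
        lspci -d 10de:

        # "suspended" here means the dGPU is currently powered down
        cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status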
    Last edited by mdedetrich; 08 June 2020, 10:56 AM.

    • #22
      Originally posted by mdedetrich View Post
      Have no idea what this is

      Never mind, apparently it's only available on "Super Max-Q" GPUs. It's basically NVIDIA Optimus with a hardware mux that lets the display be connected either directly to the iGPU or to the dGPU. The dGPU no longer has to copy its frames into the iGPU's buffer to be scanned out to the display, which reduces latency and allows G-SYNC to be used.

      Originally posted by mdedetrich View Post
      Well, I checked the power usage of my laptop and it's lower when I'm not running a heavy GPU application piped through PRIME. If I do run an application through PRIME, then power use goes up. According to their docs at https://download.nvidia.com/XFree86/...eroffload.html it's meant to work.

      Again, apparently NVIDIA lets manufacturers set a thermal output limit in the BIOS that represents the laptop's cooling capacity. You can then pair a GPU and a CPU that the system couldn't cool if both ran at full power, and this new technology apparently shifts the power budget from one to the other in order to avoid that bottleneck.
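
      If you are curious what limit your vendor actually configured, nvidia-smi can at least report the board's power limits (just a quick check, not an official way to inspect the boost behaviour; the dGPU has to be awake to answer):

          # Shows the default/enforced/min/max power limits for the dGPU
          nvidia-smi -q -d POWER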

      Originally posted by mdedetrich View Post
      I remember reading this earlier, but it turns out this is not the case. Both the NVIDIA docs and the Arch Wiki (see https://wiki.archlinux.org/index.php/NVIDIA_Optimus) say that it actually powers off the GPU. I also saw https://forums.developer.nvidia.com/...-dgpu/113054/2.

      So I may have been wrong here, although I swear I read somewhere that PRIME on Linux couldn't completely turn off the GPU; that was some time ago though.
      Okay, thanks

      • #23
        Originally posted by Phiox View Post
        Does this finally allow using displays connected to the Nvidia GPU while using Render Offloading? This has been an issue for some time (https://forums.developer.nvidia.com/...ad-mode/107046). This is a deal breaker for me since I want my display outputs to, y'know, work 'n stuff.
        Yes, it does! Using it on Arch with optimus-manager in hybrid mode and the latest Xorg shows the HDMI/DP-connected displays. It's a bit laggy for me when connecting a Thunderbolt display, but it seems to work fine with an HDMI one.

        For laptops that are condemned to use the NVIDIA driver on Linux this could be a battery lifesaver: we can render on Intel and keep NVIDIA only for driving external screens while on the go, without having to restart Xorg to fully render on NVIDIA.
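
        For anyone trying the same setup, a quick sanity check that the NVIDIA GPU screen and its outputs are actually exposed to Xorg (this assumes the AllowNVIDIAGPUScreens option from the render offload docs is enabled, or that a tool like optimus-manager has set up the equivalent for you):

            # The NVIDIA provider should be listed next to the Intel one
            xrandr --listproviders

            # Outputs wired to the dGPU (HDMI/DP in my case) should now appear as connected
            xrandr --query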
