Primus-VK: PRIME-Style GPU Offloading For Vulkan



    Phoronix: Primus-VK: PRIME-Style GPU Offloading For Vulkan

    For those with a PRIME style notebook or just making use of dual/multiple graphics processors in your system, Primus-VK allows for using a secondary/dedicated GPU for rendering while driving the display from the alternative (often integrated graphics) GPU. Primus-VK is implemented as a Vulkan layer as a clean approach for dealing with multiple GPUs in a Vulkan world...
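    For context: "implemented as a Vulkan layer" means the Vulkan loader picks the layer up from a JSON manifest in its implicit-layer search path (e.g. /usr/share/vulkan/implicit_layer.d/) and inserts it between the application and the driver. A rough sketch of what such a manifest looks like; the layer name, library path and environment variable below are illustrative, not Primus-VK's actual values:

    ```json
    {
        "file_format_version": "1.0.0",
        "layer": {
            "name": "VK_LAYER_PRIMUS_PrimusVK",
            "type": "GLOBAL",
            "library_path": "/usr/lib/libprimus_vk.so",
            "api_version": "1.1.0",
            "implementation_version": "1",
            "description": "Render-offload layer (illustrative manifest, not the real one)",
            "disable_environment": { "DISABLE_PRIMUS_VK": "1" }
        }
    }
    ```

    Because the loader handles discovery and dispatch, the layer can redirect rendering to a second physical device and copy the result back without the application knowing, which is what makes this a clean fit for Vulkan.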


  • #2
    Not sure I fully understand this news, but so far I've had the best experience under Manjaro with the Bumblebee driver: Intel is used by default, and the NVIDIA GPU for Blender rendering or some 3D game tests. This seems interesting, and something for distro maintainers to consider for future stock packages, I guess.

    • #3
      I've been offloading to my AMD GPU for years on my laptop, I'm guessing this is an Nvidia solution
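      For the Mesa-driven case mentioned here, render offload has long been a one-variable affair: the open drivers read DRI_PRIME and pick the secondary GPU themselves. A minimal wrapper sketch (the function name is mine):

      ```shell
      # Run a program on the secondary (e.g. AMD) GPU via Mesa's PRIME offload.
      # DRI_PRIME=1 selects the non-default render device.
      run_on_dgpu() {
          DRI_PRIME=1 "$@"
      }
      ```

      For example, `run_on_dgpu glxinfo | grep "OpenGL renderer"` should report the discrete GPU instead of the integrated one.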

      • #4
        Originally posted by FireBurn View Post
        I've been offloading to my AMD GPU for years on my laptop, I'm guessing this is an Nvidia solution
        Yeah, AFAIU this is for NVidia. I guess that for proper support, NVidia will have to implement some stuff (display server and drivers), and it's taking a while (for Xorg; no clue what their plans for Wayland are).
        As per aplattner’s suggestion from https://devtalk.nvidia.com/default/topic/957814/linux/prime-and-prime-synchronization/post/4953175/#4953175, I’ve taken the liberty to open a new thread for discussing the PRIME GPU render offload feature on Optimus-based hardware. As you may know, Nvidia’s current official support only allows GPU “Output”; the lack of GPU “Offload” is unsatisfactory, as it translates into higher power consumption and heat production in laptops. I did suggest in the PRIME and P...


        My next hardware is certainly going to be AMD. Not gonna go through all this mess again. I'm not really blaming the developers behind the nvidia drivers, but the management/legal staff is really doing a bad job.

        • #5
          but at least in nearly all instances should still be better off than being stuck with the compute power of integrated graphics.
          That was only a problem for Bumblebee though. Nvidia PRIME doesn't have that problem.

          From that sentence, I got the impression Michael still thinks Bumblebee is/was the only solution for Optimus laptops.
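          To make the contrast concrete: Bumblebee interposes its own copy path through optirun/primusrun, whereas NVIDIA's PRIME render offload (added in a driver series newer than what was current at the time of this thread) steers an application to the dGPU purely via environment variables read by the GLX/Vulkan stack. A sketch of both invocations; the wrapper names are mine:

          ```shell
          # Bumblebee style: optirun powers up the dGPU, runs the app through
          # Bumblebee's transport, and powers the dGPU down afterwards.
          run_bumblebee() {
              optirun "$@"
          }

          # PRIME render offload style: the NVIDIA driver picks the app up
          # via these well-known environment variables; no extra daemon.
          run_prime_offload() {
              __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"
          }
          ```

          The extra transport is where Bumblebee's overhead comes from, which is the point being made above.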

          • #6
            Originally posted by Leopard View Post

            That was only a problem for Bumblebee though. Nvidia PRIME doesn't have that problem.

            From that sentence , i got the impression Michael still thinks Bumblee is / was the only solution for Optimus laptops.
            If you want to use the blob and don't want to have the dGPU active all the time without restarting X11, then yes.

            • #7
              Originally posted by Thaodan View Post

              If you want to use the blob and don't want to have the dGPU active all the time without restarting X11, then yes.
              Though Bumblebee always had overhead compared to PRIME, and always will.

              Is it really worth it? I mean, restarting the session is not that hard, you know...

              • #8
                Originally posted by Leopard View Post

                That was only a problem for Bumblebee though. Nvidia PRIME doesn't have that problem.

                From that sentence, I got the impression Michael still thinks Bumblebee is/was the only solution for Optimus laptops.
                Please, someone enlighten me.

                I have a Haswell notebook that comes with an NVidia GT 730M. (Thinkpad T540p)

                The rig runs Fedora 29.

                How do I properly use Vulkan (and OpenGL) on the dedicated chip?

                It's a mess, from my experience.
                Hard to think of something more user-unfriendly than this.

                I basically gave up on the dedicated GPU for a long time.
                (On Win10 it's working fine by the way.)
                So many games don't run at all on the Intel Graphics 4000 unit,
                or it feels like the notebook will melt any minute.

                Muxed/Non-Muxed external chips? Nvidia?
                As of now: Never, ... NEVER(!) again.
                Next Notebook will be some AMD APU.
                Seems to be the best catch for the money and I bet it works far better.

                Nevertheless, if someone can point me to a howto(*) that works for Fedora 29+,
                I will try this one very last time.

                Thanks!

                Edit: (*) That doesn't require me to be a senior system engineer.

                • #9
                  Originally posted by entropy View Post

                  Please, someone enlighten me.

                  [...]

                  Nevertheless, if someone can point me to a howto(*) that works for Fedora 29+, I will try this one very last time.


                  With this.

                  I personally use it.
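                  For reference, howtos of this kind (proprietary driver, PRIME "output" mode) generally boil down to an xorg.conf along these lines; the BusID and identifiers below are machine-specific assumptions, so check yours with `lspci | grep -i nvidia`:

                  ```
                  Section "ServerLayout"
                      Identifier "layout"
                      Screen 0 "nvidia"
                      Inactive "intel"
                  EndSection

                  Section "Device"
                      Identifier "nvidia"
                      Driver "nvidia"
                      # Machine-specific; adjust to your lspci output.
                      BusID "PCI:1:0:0"
                  EndSection

                  Section "Screen"
                      Identifier "nvidia"
                      Device "nvidia"
                      Option "AllowEmptyInitialConfiguration"
                  EndSection

                  Section "Device"
                      Identifier "intel"
                      Driver "modesetting"
                  EndSection
                  ```

                  Plus, early in the session (e.g. in ~/.xinitrc or a display-manager setup script): `xrandr --setprovideroutputsource modesetting NVIDIA-0` followed by `xrandr --auto`, so the Intel GPU scans out what the NVIDIA GPU renders. The trade-off discussed above applies: in this mode the dGPU renders everything and stays powered on.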

                  • #10
                    Thank you, Leopard!

                    As much as I appreciate your detailed HowTo, do you consider this anywhere near user-friendly?
                    Don't take this as criticism of your post/work, but of how that technology, Optimus (etc.), is currently implemented on Linux.

                    Also, I guess I can't simply use this as a recipe on Fedora. :/
                    I would try it, but I have neither the time I used to have years ago, when it would've
                    been fun to "play" around with this, nor do I want to risk messing up my system.
