GNOME Mutter Lands Improved GPU Selection Logic For Laptops

  • phoronix
    Administrator
    • Jan 2007
    • 67377

    Phoronix: GNOME Mutter Lands Improved GPU Selection Logic For Laptops

    Merged today to GNOME's Mutter compositor is improved logic for selecting the graphics processor to treat as the primary one within multi-GPU laptops...

  • hf_139
    Senior Member
    • May 2023
    • 339

    #2
    I simply wrote a primus-run script that also starts/stops the discrete GPU by echoing crap into /sys/bus/pci/devices/... whenever an application is launched with it.

    Stopping is: unbind the driver, remove the device, set the corresponding PCI controller's power/control to "auto", and modprobe -r the nvidia modules.
    Starting is: just set the PCI controller's power/control back to "on" and echo 1 to rescan.
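
    A minimal sketch of what such a toggle script might look like, assuming an NVIDIA dGPU at 0000:01:00.0 — the PCI address, module list, and filenames are my assumptions, not the poster's actual script:

    ```sh
    #!/bin/sh
    # Hypothetical dGPU power toggle following the steps described above.
    # Run as root; adjust the PCI address for the machine at hand.
    ADDR=0000:01:00.0                          # assumed dGPU PCI address
    GPU=/sys/bus/pci/devices/$ADDR
    PORT=$(dirname "$(readlink -f "$GPU")")    # parent PCIe port of the dGPU

    gpu_stop() {
        echo "$ADDR" > /sys/bus/pci/drivers/nvidia/unbind  # unbind the driver
        echo 1 > "$GPU/remove"                             # hot-remove the device
        echo auto > "$PORT/power/control"                  # let the port runtime-suspend
        modprobe -r nvidia_drm nvidia_modeset nvidia_uvm nvidia
    }

    gpu_start() {
        echo on > "$PORT/power/control"                    # wake the PCIe port
        echo 1 > /sys/bus/pci/rescan                       # rediscover the dGPU
    }

    case "$1" in
        start) gpu_start ;;
        stop)  gpu_stop ;;
        *)     echo "usage: $0 start|stop" >&2; exit 1 ;;
    esac
    ```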

    Wayland made it possible for me to do this, because Xorg kept "using" the card, and unbinding the driver would not work without logging out.
    It's the only way for me to keep the battery alive. All those default "energy saving" features that are supposed to throttle the card when it's not in use are crap. Only when it is actually powered off and removed does it stop draining the battery.

    The downside is that I have to battle the default-installed power management services like "power-profiles-daemon" and "tuned" (both of them permanently running services written in Python by Red Hat); whenever a distribution sneaks those in, I have to remove them. Only TLP works great alongside this without messing things up.


    • uid313
      Senior Member
      • Dec 2011
      • 6922

      #3
      Is there a future for multi-GPU on laptops?

      Multi-GPU is a feature that has been used on x86 laptops with Intel and AMD processors, but now with ARM laptops maybe they won't have a dedicated GPU. I don't think MacBooks have an integrated and a dedicated GPU in the way many Intel and AMD laptops do, and now Qualcomm is making CPUs for laptops too; I don't think Snapdragon laptops have an integrated and a dedicated GPU either.


      • skerit
        Junior Member
        • Dec 2016
        • 32

        #4
        Originally posted by uid313:
        Is there a future for multi-GPU on laptops?

        Multi-GPU is a feature that has been used on x86 laptops with Intel and AMD processors, but now with ARM laptops maybe they won't have a dedicated GPU. I don't think MacBooks have an integrated and a dedicated GPU in the way many Intel and AMD laptops do, and now Qualcomm is making CPUs for laptops too; I don't think Snapdragon laptops have an integrated and a dedicated GPU either.
        I have a Framework 16 laptop with the AMD GPU bay installed *and* I use an eGPU at home, so I have three GPUs at a time in this thing.

        Getting GNOME to use the right one is quite annoying. I hope this fix will help!
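
        If the new logic still guesses wrong, Mutter can reportedly be steered toward a specific card with a udev tag. A minimal sketch, assuming the desired GPU is card1 (check /dev/dri/ for yours; the rule filename is my own choice):

        ```
        # /etc/udev/rules.d/61-mutter-primary-gpu.rules (hypothetical filename)
        # Tag the DRM device Mutter should treat as the primary GPU.
        ENV{DEVNAME}=="/dev/dri/card1", TAG+="mutter-device-preferred-primary"
        ```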


        • johnandmegh
          Junior Member
          • Mar 2023
          • 25

          #5
          Originally posted by uid313:
          Is there a future for multi-GPU on laptops?

          Multi-GPU is a feature that has been used on x86 laptops with Intel and AMD processors, but now with ARM laptops maybe they won't have a dedicated GPU. I don't think MacBooks have an integrated and a dedicated GPU in the way many Intel and AMD laptops do, and now Qualcomm is making CPUs for laptops too; I don't think Snapdragon laptops have an integrated and a dedicated GPU either.
          I suppose it'd be tied to whether one thinks there's a future for the amd64 architecture among the target market for GPU-intensive tasks like hardcore gaming? (I say hardcore because my daughter can play Minecraft, Roblox, etc. on our laptop just fine without even using the Nvidia dGPU, just the AMD Ryzen integrated graphics)

          Even if ARM-based systems start eating into some of the growth, I still imagine work like this helps a lot of newly produced devices, and a ton of existing devices out in the wild, work better.


          • QwertyChouskie
            Senior Member
            • Nov 2017
            • 638

            #6
            Originally posted by hf_139:
            I simply wrote a primus-run script that also starts/stops the discrete GPU by echoing crap into /sys/bus/pci/devices/... whenever an application is launched with it.

            Stopping is: unbind the driver, remove the device, set the corresponding PCI controller's power/control to "auto", and modprobe -r the nvidia modules.
            Starting is: just set the PCI controller's power/control back to "on" and echo 1 to rescan.

            Wayland made it possible for me to do this, because Xorg kept "using" the card, and unbinding the driver would not work without logging out.
            It's the only way for me to keep the battery alive. All those default "energy saving" features that are supposed to throttle the card when it's not in use are crap. Only when it is actually powered off and removed does it stop draining the battery.

            The downside is that I have to battle the default-installed power management services like "power-profiles-daemon" and "tuned" (both of them permanently running services written in Python by Red Hat); whenever a distribution sneaks those in, I have to remove them. Only TLP works great alongside this without messing things up.
            https://wiki.archlinux.org/title/ASU...r_optimization is the actual fix; with this set, powering the GPU up/down should be automatic. It might not work with some very old Optimus systems, but in my experience 2000-series cards probably work and 3000-series definitely should Just Work.
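
            The wiki page referenced above (URL truncated) generally comes down to enabling NVIDIA's runtime D3 power management. A sketch of the usual two pieces, assuming the proprietary driver; the filenames are my own choice:

            ```
            # /etc/modprobe.d/nvidia-pm.conf (hypothetical filename)
            # Enable fine-grained runtime power management of the dGPU.
            options nvidia "NVreg_DynamicPowerManagement=0x02"

            # /etc/udev/rules.d/80-nvidia-pm.rules (hypothetical filename)
            # Let the kernel runtime-suspend the NVIDIA VGA / 3D controller.
            ACTION=="bind", SUBSYSTEM=="pci", ATTR{vendor}=="0x10de", ATTR{class}=="0x030000", ATTR{power/control}="auto"
            ACTION=="bind", SUBSYSTEM=="pci", ATTR{vendor}=="0x10de", ATTR{class}=="0x030200", ATTR{power/control}="auto"
            ```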


            • emansom
              Senior Member
              • Jun 2015
              • 138

              #7
              And it still defaults to using the iGPU on desktop Ryzen 7000 series and above. Great job, GNOME team. /s


              • stormcrow
                Senior Member
                • Jul 2017
                • 1518

                #8
                Originally posted by uid313:
                Is there a future for multi-GPU on laptops?

                Multi-GPU is a feature that has been used on x86 laptops with Intel and AMD processors, but now with ARM laptops maybe they won't have a dedicated GPU. I don't think MacBooks have an integrated and a dedicated GPU in the way many Intel and AMD laptops do, and now Qualcomm is making CPUs for laptops too; I don't think Snapdragon laptops have an integrated and a dedicated GPU either.
                It won't be any different with Arm laptops than it is with x86. The reasons for having a dedicated GPU rather than an iGPU/APU are the same regardless of the system architecture. The only thing that changes is that sometimes the integrated chip becomes performant enough that a dedicated one isn't needed.


                Future systems will be a trade-off between power consumption and performance characteristics. So a lot of the "omg that sucks because X is 3% SLOWER!!!!111" crowd is ignoring that yeah, it might be slower, but it takes 50% less power to perform the same task while the difference in execution time is negligible. That's the real lesson Apple is still teaching people. Yeah, it'll never get through to the performance-at-any-and-all-costs boneheads, but most people aren't like that. Apple put the "lap" back into "laptop".

                Edit to add: I'm also willing to bet Apple's self-built data centers run substantially cooler than centers with Intel & AMD processors.
                Last edited by stormcrow; 12 November 2024, 07:32 PM.


                • Yndoendo
                  Phoronix Member
                  • Oct 2015
                  • 102

                  #9
                   dGPU vs iGPU selection is kind of broken with XDG, and the .desktop key is poorly named: PrefersNonDefaultGPU. Steam even has a bug about the incorrect GPU being selected to run their client; I had to remove the PrefersNonDefaultGPU setting from the .desktop file in order for the client to start up properly. I actually get better performance running in Hybrid vs MUX mode. https://github.com/ValveSoftware/ste...ux/issues/9940
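
                   For reference, that key goes in the [Desktop Entry] section of a launcher file. A minimal sketch; the Name and Exec values are illustrative placeholders, not Steam's actual entry:

                   ```
                   [Desktop Entry]
                   Type=Application
                   Name=Example Game
                   Exec=example-game
                   # Hint that this app should run on the non-default (usually discrete) GPU.
                   PrefersNonDefaultGPU=true
                   ```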


                  • mos87
                    Senior Member
                    • Sep 2016
                    • 463

                    #10
                     A vertically integrated stack, unusable by anyone who cannot stomach the defective-by-design abomination that is g-shell (and/or anyone who depends on X11, it seems).

