AMD Linux Graphics Driver To Better Handle Power Savings During Compute Workloads


  • #1

    Phoronix: AMD Linux Graphics Driver To Better Handle Power Savings During Compute Workloads

    Over the past week there have been two patch series working to enable BACO (Bus Active, Chip Off) support and, in turn, power-management capabilities when using AMDKFD (Kernel Fusion Driver) for compute workloads...

    http://www.phoronix.com/scan.php?pag...-Power-Savings

  • #2
    I'm still hoping someday they manage to support multiple monitors without clocking the VRAM to the highest level. On my 5700 XT that yields rising temperatures until it hits 60°C, then the fans kick in for about 10 seconds until it's back down to 50°C. Not that the Red Devil is too loud, but just knowing about this is kinda annoying ^^



    • #3
      Originally posted by Termy View Post
      I'm still hoping someday they manage to have multi monitors without clocking the vram to the highest level. On my 5700xt that yields rising temperatures until it hits 60°, then the fans kick in for 10 sec until it's down to 50. Not that the Red Devil is too loud, but the knowledge about this alone is kinda annoying ^^
      I believe it isn't that easy. Coincidentally, today I down-clocked an RX 570 to 300/300 MHz shader/memory clocks while playing a 1080p/60Hz YouTube video in Chrome, and GPU usage was about 60%. I suppose with a 1440p/60Hz video the video player would be forced to start skipping frames. Unfortunately, Chrome is unable to take advantage of the GPU's UVD (Unified Video Decoder) ASIC.

      Not every Linux user checks whether GPU utilization is too high before playing a YouTube video on a GPU in powersave mode.

      Note: It is impossible to keep dynamic memory clock selection enabled (on the RX 570 and earlier GPUs) with multiple monitors connected, because it causes screen flickering (with amdgpu.ppfeaturemask=0xffffffff). So the only option is to pin the memory clock to either 300 MHz or 1750 MHz on the RX 570, but if 300 MHz is selected the video player might start skipping frames when playing high-resolution videos.
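For reference, pinning the memory clock manually can be done through the amdgpu sysfs interface; a minimal sketch (the card index and state numbers are assumptions, and the available states differ per card -- the pp_dpm_mclk file lists them):

```shell
# Take manual control of the DPM states (needs root)
echo manual > /sys/class/drm/card0/device/power_dpm_force_performance_level

# List the available memory-clock states; the active one is marked with '*'
cat /sys/class/drm/card0/device/pp_dpm_mclk

# Force the lowest state, typically 300 MHz on an RX 570
echo 0 > /sys/class/drm/card0/device/pp_dpm_mclk

# Revert to automatic clock management
echo auto > /sys/class/drm/card0/device/power_dpm_force_performance_level
```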
      Last edited by atomsymbol; 02-03-2020, 04:29 AM. Reason: Add note



      • #4
        BACO (Bus Active, Chip Off)
        What about
        CABO (Chip Active, Bus Off)
        ?



        • #5
          Originally posted by atomsymbol View Post

          Note: It is impossible to have dynamic memory clock selection enabled (on RX 570 and previous GPU's) when having multiple monitors connected because it causes screen flickering (with amdgpu.ppfeaturemask=0xffffffff). So, the only option is to select either 300 MHz clock or the 1750 MHz clock on RX 570, but if 300 MHz is selected the video player might start skipping frames when playing high-resolution videos.
          Yeah, if I understand correctly that has something to do with synchronizing the clock switch with the refresh interval - but I'm not nearly knowledgeable enough to judge whether that is hard to implement or not ^^



          • #6
            Don't get your hopes up; they screwed up single-display 1440p 75Hz VRAM downclocking with Navi even on Windows.
            Last edited by aufkrawall; 02-03-2020, 07:25 AM. Reason: questionable idiom



            • #7
              Originally posted by atomsymbol View Post
              Note: It is impossible to have dynamic memory clock selection enabled (on RX 570 and previous GPU's) when having multiple monitors connected because it causes screen flickering (with amdgpu.ppfeaturemask=0xffffffff). So, the only option is to select either 300 MHz clock or the 1750 MHz clock on RX 570, but if 300 MHz is selected the video player might start skipping frames when playing high-resolution videos.
              Dammit if that doesn't explain my situation yesterday. I took an old 1440x900 20" monitor and plugged it into my RX 580, and my displays went wonky with green lines, refresh-rate issues, and KWin's compositor crashing. Running either display alone was just fine.

              That's when I ran Manjaro's 19.0 GNOME testing ISO and it worked perfectly, ditto with their 19.0 XFCE and Plasma ISOs, which left me thinking it was maybe related to Linux 5.4, since those ISOs run 5.5. If it's because of ppfeaturemask (I'm disabling it and rebooting to check after posting), I'm gonna be kind of pissed, because I use ppfeaturemask to undervolt my GPU to keep it from running hot and thermal throttling.

              EDIT: Yep, disabling ppfeaturemask fixed it. Well fsck me running. That's not a compromise I'm willing to accept. My MSI GPU runs too damn hot under gaming loads without ppfeaturemask and undervolting.
              Last edited by skeevy420; 02-03-2020, 10:18 AM.



              • #8
                Originally posted by skeevy420 View Post
                Dammit if that doesn't explain my situation yesterday. Took an old 1440x900 20" monitor and plugged it into my RX 580 and my displays went wonky with green lines, refresh rate issues, and KWin's compositor would crash. Running either alone and it was just fine.

                That's when I ran Manjaro's 19.0 Gnome testing and it worked perfectly, ditto with their 19.0 XFCE and Plasma ISOs and that left me thinking that maybe it was related to Linux 5.4 since those ISOs run 5.5. If it's because of ppfeaturemask, and I'm disabling it and rebooting to check after posting, I'm gonna be kind of pissed because I use ppfeaturemask to undervolt my GPU to counter it running hot and thermal throttling.
                I have a small script that is started from a udev rule when a monitor is plugged in or unplugged. The script sets the GPU's memory clock to a single fixed value when two monitors are connected, to prevent the flickering related to ppfeaturemask, and restores normal clock management when only one monitor remains connected.
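A sketch of that setup, assuming card0 and hypothetical file names (the udev rule fires on DRM change events, and the script counts connected outputs):

```shell
# /etc/udev/rules.d/99-monitor-mclk.rules (hypothetical path):
#   ACTION=="change", SUBSYSTEM=="drm", RUN+="/usr/local/bin/mclk-hotplug.sh"

#!/bin/sh
# /usr/local/bin/mclk-hotplug.sh -- pin the memory clock when more than
# one display is connected, otherwise return to automatic management.
dev=/sys/class/drm/card0/device
connected=$(cat /sys/class/drm/card0-*/status | grep -cx connected)
if [ "$connected" -gt 1 ]; then
    echo manual > "$dev"/power_dpm_force_performance_level
    echo 2 > "$dev"/pp_dpm_mclk   # one fixed state for all displays
else
    echo auto > "$dev"/power_dpm_force_performance_level
fi
```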



                • #9
                  Originally posted by skeevy420 View Post
                  My MSI GPU runs too damn hot under gaming loads without ppfeaturemask and undervolting it.
                  I have had a negative experience with undervolting while the GPU is overclocked at the same time: it caused weird triangles and invalid colors to be rendered on screen in Shadow of the Tomb Raider, for example.



                  • #10
                    Did you try amdgpu.ppfeaturemask=0xfffd7fff to just unlock overdrive?
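For what it's worth, that suggested mask differs from the all-features 0xffffffff mask in just two bits; a quick sketch to see which ones (what each bit enables is defined in the kernel's amd_shared.h and not repeated here):

```python
# Compare the all-features power-play mask against the suggested one
full = 0xffffffff
suggested = 0xfffd7fff

cleared = full & ~suggested  # bits present in full but not in suggested
bits = [i for i in range(32) if cleared >> i & 1]
print(bits)                          # → [15, 17]
print([hex(1 << i) for i in bits])   # → ['0x8000', '0x20000']
```

The mask the running kernel is actually using can be read from /sys/module/amdgpu/parameters/ppfeaturemask.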

