AMD Quietly Working On New Linux GPU Driver Support Block By Block


  • AMD Quietly Working On New Linux GPU Driver Support Block By Block

    Phoronix: AMD Quietly Working On New Linux GPU Driver Support Block By Block

    AMD's Linux graphics driver engineers have been working on the driver support for new graphics processors, and now the patches are at the earliest stages of publishing. However, due to driver handling changes, it's sharply different this time around: in the past they volleyed a big set of patches under some colorful fishy codename in an effort to conceal their hardware enablement work...

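    The gist, as the comments below also note, is that the driver can enumerate the GPU's IP blocks and their versions from the hardware itself and bring each block up by version, instead of keying a full ASIC configuration off a PCI ID. A minimal, hypothetical C sketch of that idea (the struct layout and all names here are made up for illustration, not the real amdgpu interfaces):

    /* Hypothetical sketch only: enumerate IP blocks reported by the hardware
     * and initialize each one by its own version, rather than hardcoding a
     * full ASIC configuration behind a single PCI ID. */
    #include <stdio.h>
    #include <stdint.h>

    struct ip_entry {             /* one row of an (assumed) discovery table */
        uint16_t hw_id;           /* which block: GFX, SDMA, display, ... */
        uint8_t  major, minor, rev;
    };

    /* Pretend these rows were read back from the GPU at probe time. */
    static const struct ip_entry table[] = {
        { .hw_id = 1, .major = 10, .minor = 3, .rev = 0 },
        { .hw_id = 2, .major = 5,  .minor = 2, .rev = 0 },
    };

    static void enable_block(const struct ip_entry *ip)
    {
        /* Dispatch on the block's reported version, not on the board's ID. */
        printf("bringing up hw_id %u as v%u.%u.%u\n",
               (unsigned)ip->hw_id, (unsigned)ip->major,
               (unsigned)ip->minor, (unsigned)ip->rev);
    }

    int main(void)
    {
        for (size_t i = 0; i < sizeof(table) / sizeof(table[0]); i++)
            enable_block(&table[i]);
        return 0;
    }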

  • #2
    anyway this won't stop idiot Linux users from buying NVIDIA-based laptops (I guess the laptop space is the only place where you can buy a discrete graphics card at a normal price) when there are perfectly fine AMD-based gaming/content-creation machines out there (finally! not that many, but they're that good).
    By the way, hi Nvidia users, please don't feel offended...

    PS: a totally OT rant, sorry
    Last edited by horizonbrave; 18 February 2022, 11:14 AM.



    • #3
      Very interesting approach, and in retrospect it seems like the obvious way to do driver development for unreleased products without giving away too much info about exact configurations and segmentation in your product lineup. You can't give away configuration details in code if they are read from the hardware!



      • #4
        Originally posted by zcansi View Post
        Very interesting approach, and in retrospect it seems like the obvious way to do driver development for unreleased products without giving away too much info about exact configurations and segmentation in your product lineup. You can't give away configuration details in code if they are read from the hardware!
        Not sure if this is the reason. Having multiple configurations of IP blocks combined and giving each a single PCI ID is a huge mess. The new approach will lead to much better/easier code reuse.
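        To make the code-reuse point concrete, here is a rough, hypothetical contrast (placeholder device IDs, not real amdgpu code): with per-PCI-ID handling every new board needs its own case even when its silicon blocks already exist, while per-IP-block handling lets every board that reports the same block version share one handler.

        #include <stdio.h>

        /* Old style (sketch): each board's PCI device ID selects a monolithic
         * setup path, so near-identical SKUs still need their own entries. */
        static int setup_by_pci_id(unsigned int device_id)
        {
            switch (device_id) {
            case 0x1234:              /* placeholder SKU A */
            case 0x1235:              /* placeholder SKU B, almost the same chip */
                return 0;             /* board-specific bring-up */
            default:
                return -1;            /* unknown board until the driver is updated */
            }
        }

        /* New style (sketch): one handler per IP block version, reused by any
         * board whose discovery table reports that version. */
        struct ip_ops { const char *name; int (*init)(void); };

        static int gfx_v10_3_init(void) { return 0; }

        static const struct ip_ops *lookup_gfx(unsigned int major, unsigned int minor)
        {
            static const struct ip_ops gfx_v10_3 = { "gfx v10.3", gfx_v10_3_init };
            return (major == 10 && minor == 3) ? &gfx_v10_3 : NULL;
        }

        int main(void)
        {
            const struct ip_ops *ops = lookup_gfx(10, 3);
            printf("pci path: %d, ip path: %s\n",
                   setup_by_pci_id(0x1234), ops ? ops->name : "unsupported");
            return ops ? ops->init() : -1;
        }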



        • #5
          Originally posted by horizonbrave View Post
          anyway this won't stop idiot Linux users from buying NVIDIA-based laptops (I guess the laptop space is the only place where you can buy a discrete graphics card at a normal price) when there are perfectly fine AMD-based gaming/content-creation machines out there (finally! not that many, but they're that good).
          By the way, hi Nvidia users, please don't feel offended...

          PS: a totally OT rant, sorry
          How's that CUDA support going? </devils_advocate>



          • #6
            Drivers may be in "good" shape in terms of stability and performance for the 6xxx series, but one glaring thing missing is power efficiency improvements like memory downclocking support. Yes, Windows didn't get this support until recently for the 6xxx series, but Linux still lacks it. My 6900 XT is glued to 1000MHz regardless of what it's doing when the monitor is on, only dropping when the monitor is turned off.
            Edit: it doesn't matter, it does not downclock. It works just fine on Windows since that one driver a year ago that brought frequency scaling for VRAM on the 6xxx series, including at 165Hz.
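            For anyone who wants to check what their own card is doing, amdgpu exposes the memory clock DPM levels through sysfs, with a '*' marking the level currently in use. A minimal sketch (the card0 index is an assumption about a typical single-GPU setup):

            #include <stdio.h>

            int main(void)
            {
                /* amdgpu lists the VRAM clock states here; the active one is
                 * marked with '*'.  Adjust card0 if you have more than one GPU. */
                const char *path = "/sys/class/drm/card0/device/pp_dpm_mclk";
                char line[128];
                FILE *f = fopen(path, "r");

                if (!f) {
                    perror(path);
                    return 1;
                }
                while (fgets(line, sizeof(line), f))
                    fputs(line, stdout);   /* e.g. "3: 1000Mhz *" when stuck at max */
                fclose(f);
                return 0;
            }

            Watching that output while the desktop is idle shows whether the card ever drops out of the top memory state.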
            Last edited by middy; 18 February 2022, 02:40 PM.



            • #7
              Originally posted by middy View Post
              Drivers may be in "good" shape in terms of stability and performance for the 6xxx series, but one glaring thing missing is power efficiency improvements like memory downclocking support. Yes, Windows didn't get this support until recently for the 6xxx series, but Linux still lacks it. My 6900 XT is glued to 1000MHz regardless of what it's doing when the monitor is on, only dropping when the monitor is turned off.
              VRAM downclocking is there for RDNA1, and I'm quite sure it's there for RDNA2 too (but don't quote me on that). Do you perhaps use a display with a niche refresh rate like 165Hz? Higher refresh rates besides 120 and 144 (and maybe 240Hz) have non-standard timings for the most part and cause drivers to push the memory clock to max in order to avoid a black screen and/or crash.

              EDIT: This isn't bound to happen only with 165Hz but with any other mode that also happens to have slightly too tight timings.
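              To see why tight timings matter, compare how long the vertical blanking window is per frame. The timings below are illustrative guesses in the ballpark of a standard 60Hz mode versus a reduced-blanking 165Hz 1440p mode, not measured values; the point is only that the high-refresh, reduced-blanking mode leaves a far shorter window for the memory to retrain.

              #include <stdio.h>

              struct mode {
                  const char *name;
                  double pclk_mhz;              /* pixel clock in MHz */
                  int htotal, vactive, vtotal;
              };

              /* Blanking time per frame = blanking lines * time per scanline. */
              static double vblank_us(const struct mode *m)
              {
                  double line_us = m->htotal / m->pclk_mhz;
                  return (m->vtotal - m->vactive) * line_us;
              }

              int main(void)
              {
                  const struct mode modes[] = {
                      { "1440p60, generous blanking", 312.25, 3504, 1440, 1485 },
                      { "1440p165, reduced blanking", 665.00, 2720, 1440, 1481 },
                  };
                  for (int i = 0; i < 2; i++)
                      printf("%-30s vblank ~ %.0f us per frame\n",
                             modes[i].name, vblank_us(&modes[i]));
                  return 0;
              }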
              Last edited by eyesore; 18 February 2022, 03:00 PM.



              • #8
                This just means that we older GPU owners, you know, most people who can't afford these insane GPU prices, won't be able to use the new drivers.



                • #9
                  Originally posted by eyesore View Post

                  VRAM downclocking is there for RDNA1, and I'm quite sure it's there for RDNA2 too (but don't quote me on that). Do you perhaps use a display with a niche refresh rate like 165Hz? Higher refresh rates besides 120 and 144 (and maybe 240Hz) have non-standard timings for the most part and cause drivers to push the memory clock to max in order to avoid a black screen and/or crash.
                  Much appreciated! I've had the same problem as Middy and your comment reminded me that I set the refresh rate to 144Hz at some point. Turned that down to 60 and power usage went down 30 watts. Awesome! It's idling around 6-7 watts now, instead of the 35-40 that were the norm before.

                  RX6900 XT, btw. So I can confirm it's implemented for RDNA2.



                  • #10
                    Originally posted by Aggedor View Post

                    Much appreciated! I've had the same problem as Middy and your comment reminded me that I set the refresh rate to 144Hz at some point. Turned that down to 60 and power usage went down 30 watts. Awesome! It's idling around 6-7 watts now, instead of the 35-40 that were the norm before.

                    RX6900 XT, btw. So I can confirm it's implemented for RDNA2.
                    The issue is that memory reclocking can only happen during display blanking periods (horizontal or vertical, depending on the ASIC). If the blanking periods are not long enough, you will see flickering when memory reclocks. You can try manually adjusting the modeline to extend the blanking periods if there is a particular mode that you want to use.
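                    If you want to try that, the arithmetic is simple: keep the active area and refresh rate, add extra blanking lines, and recompute the pixel clock so the refresh stays the same. A rough sketch with placeholder timings (no guarantee the display or the link will accept the higher pixel clock):

                    #include <stdio.h>

                    int main(void)
                    {
                        /* placeholder starting mode; substitute your own timings */
                        double refresh = 165.0;           /* Hz, kept constant */
                        int htotal = 2720;
                        int vactive = 1440, vtotal = 1481;
                        int extra_lines = 40;             /* additional vertical blanking */

                        int new_vtotal = vtotal + extra_lines;
                        double new_pclk_mhz = refresh * htotal * new_vtotal / 1e6;

                        printf("old: vtotal=%d (%d blanking lines)\n",
                               vtotal, vtotal - vactive);
                        printf("new: vtotal=%d (%d blanking lines), pclk ~ %.2f MHz\n",
                               new_vtotal, new_vtotal - vactive, new_pclk_mhz);
                        return 0;
                    }

                    Whether a given monitor and link actually accept the resulting custom mode is something you can only find out by testing it.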

