AMD Sends Out New Linux Patches As Part Of Their Next-Gen GPU Support

  • AMD Sends Out New Linux Patches As Part Of Their Next-Gen GPU Support

    Phoronix: AMD Sends Out New Linux Patches As Part Of Their Next-Gen GPU Support

    A few patch series were fired off yesterday for enabling new IP blocks on upcoming Radeon graphics processors...


  • #2
    Hopefully that is a step towards better 0-day support.



    • #3
Does AMD not care about proper hardware cursor gamma with GPU gamma ramps via the modern atomic API, or about the insufficiently aggressive default upclocking for partial loads on RDNA2? I still haven't received "fixed" email notifications after opening these tickets ages ago, and numerous users have confirmed them since then.
      agd5f
      Since I've swapped my RX 5700 XT with an RX 6800, some colors are very noticeably off with xf86-video-amdgpu. It is very noticeable with the top blue bar...

      Brief summary of the problem: The RX 6800 doesn't sustain GPU clock in a sufficient manner, causing Hitman 2 with DXVK...


      Would be nice if these were fixed before I might buy an RDNA4 graphics card in 2-4 years when Windows hopefully won't be necessary anymore for anything, as I definitely don't want to experience the AMD Windows driver ever again.



      • #4
        Originally posted by aufkrawall View Post
Does AMD not care about proper hardware cursor gamma with GPU gamma ramps via the modern atomic API, or about the insufficiently aggressive default upclocking for partial loads on RDNA2? I still haven't received "fixed" email notifications after opening these tickets ages ago, and numerous users have confirmed them since then.
        agd5f
        Since I've swapped my RX 5700 XT with an RX 6800, some colors are very noticeably off with xf86-video-amdgpu. It is very noticeable with the top blue bar...

        Brief summary of the problem: The RX 6800 doesn't sustain GPU clock in a sufficient manner, causing Hitman 2 with DXVK...


        Would be nice if these were fixed before I might buy an RDNA4 graphics card in 2-4 years when Windows hopefully won't be necessary anymore for anything, as I definitely don't want to experience the AMD Windows driver ever again.
        AFAIR, You own an Intel Rocket Lake CPU + nVidia 3060 GPU, right?
        If so, how is the auto power profile in nVidia's Linux driver with Ampere from Your experience these days?
        Better than the results You saw out of RDNA2 by default on Linux?

        I'm asking because I'm on the verge of buying a refurbished OEM-PC with the following specs:

        - Intel Core i7-11700F
        - nVidia RTX 3060 Ti
        - 16 GB DDR4-3200
        - 1 TB NVMe SSD

        All for the psychologically rather satisfying price of 1111 €.

I realize the 8 GB of vRAM on the 3060 Ti is unfortunately limiting, but other than that it seems like a pretty good deal.

        If You could share Your experience with the combination of Rocket Lake + Ampere on Linux, that would be great.

        Thanks in advance!



        • #5
Linuxxx I think Nvidia's power management with Ampere on Linux is fine out of the box. Upclocking seems to be a bit more aggressive than on Windows, which is good, as it's not aggressive enough there with at least my 3060 (at least you can force max clocks, e.g. on a per-game profile basis). Though there is still that big NVDEC caveat, which, as a CUDA workload, runs at boost clock all the time.
So while the clocking behavior is generally fine when it comes to 3D load, that unfortunately doesn't mean the desktop is responsive and fluid. Both the Xorg and Wayland experiences are deeply flawed when it comes to intermittent stutter at events like menus popping up etc. It's really super ugly, plus the other Nvidia-specific issues. Some users are OK with that, but I think expectations should be higher.
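For what it's worth, forcing maximum clocks is also possible on the proprietary Linux driver; a sketch assuming the driver's long-standing GPUPowerMizerMode attribute in nvidia-settings (requires a running X session):

```shell
# Prefer maximum performance on GPU 0
# (0 = adaptive, 1 = prefer maximum performance)
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"

# Query the current setting to verify
nvidia-settings -q "[gpu:0]/GPUPowerMizerMode"
```

Unlike Windows game profiles, this applies globally rather than per application.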

          Which is why I can't recommend Nvidia for Linux desktop/gaming usage. I'd recommend it for Windows as the least evil when good gaming experience is deemed most important. I have no experience with GPU connected to AMD/Intel graphics + Nvidia card used for game rendering. This might work well or not on Linux. But if it's only about Linux gaming, probably just go Radeon-only instead.

As for intel_pstate=powersave, I think it's hopeless. I tried it a few months back with Rocket Lake, and it was still causing stutter in the browser, in games with an fps limiter etc. And on my Gemini Lake notebook, it still burns more power than schedutil. Imho just set the performance governor for stationary systems; it should only cost a few more watts of total system power. When it comes to battery life on notebooks, I think acpi-cpufreq with the conservative governor is still the best choice when longer battery life is preferred over the snappiest performance in most (at least non-gaming) situations. I suspect intel_pstate=passive should help acpi-cpufreq vs. intel_pstate=disable, but I'm not aware of test results for this.
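To illustrate the governor choices above, a minimal sketch of inspecting and switching the cpufreq governor via the standard Linux sysfs interface (paths assumed; writes need root):

```shell
# Show which scaling driver and governor CPU 0 currently uses
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_driver
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

# Set the performance governor on all CPUs (run as root)
for g in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
    echo performance > "$g"
done

# Kernel command-line alternatives mentioned above:
#   intel_pstate=passive  - exposes generic governors (schedutil,
#                           conservative, ...) via the intel_cpufreq driver
#   intel_pstate=disable  - falls back to acpi-cpufreq where supported
```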
          Last edited by aufkrawall; 28 April 2022, 05:46 PM.



          • #6
            Originally posted by aufkrawall View Post
Linuxxx I think Nvidia's power management with Ampere on Linux is fine out of the box. Upclocking seems to be a bit more aggressive than on Windows, which is good, as it's not aggressive enough there with at least my 3060 (at least you can force max clocks, e.g. on a per-game profile basis). Though there is still that big NVDEC caveat, which, as a CUDA workload, runs at boost clock all the time.
So while the clocking behavior is generally fine when it comes to 3D load, that unfortunately doesn't mean the desktop is responsive and fluid. Both the Xorg and Wayland experiences are deeply flawed when it comes to intermittent stutter at events like menus popping up etc. It's really super ugly, plus the other Nvidia-specific issues. Some users are OK with that, but I think expectations should be higher.

            Which is why I can't recommend Nvidia for Linux desktop/gaming usage. I'd recommend it for Windows as the least evil when good gaming experience is deemed most important. I have no experience with GPU connected to AMD/Intel graphics + Nvidia card used for game rendering. This might work well or not on Linux. But if it's only about Linux gaming, probably just go Radeon-only instead.

As for intel_pstate=powersave, I think it's hopeless. I tried it a few months back with Rocket Lake, and it was still causing stutter in the browser, in games with an fps limiter etc. And on my Gemini Lake notebook, it still burns more power than schedutil. Imho just set the performance governor for stationary systems; it should only cost a few more watts of total system power. When it comes to battery life on notebooks, I think acpi-cpufreq with the conservative governor is still the best choice when longer battery life is preferred over the snappiest performance in most (at least non-gaming) situations. I suspect intel_pstate=passive should help acpi-cpufreq vs. intel_pstate=disable, but I'm not aware of test results for this.
            Thanks for the thorough response!

            I had a look around again and simply can't find a better deal with AMD hardware, therefore I just pulled the trigger.

            Plus I started to toy around with Blender in my spare time, where AMD's RDNA2 GPUs simply are no match for nVidia's Ampere architecture.

            The Intel Rocket Lake CPU on the other hand is going to be an interesting one:
            200+ Watts power draw under full load!

            Naturally I'm still going to stick with intel_cpufreq performance on this one too, but will probably experiment with turning the energy-performance bias to favour power savings all the way.

            And official support for AVX-512 should make this CPU particularly well suited for running RPCS3.

            Exciting times ahead!



            • #7
              Does AMD's open source driver support HDR yet?
              Is it possible to send movies with HDR metadata over HDMI or DisplayPort and have the attached monitor or TV display them accordingly?
              Last edited by Danny3; 29 April 2022, 02:18 PM.



              • #8
                Originally posted by Danny3 View Post
                Does AMD's open source driver support HDR yet?
                Is it possible to send movies with HDR metadata over HDMI or DisplayPort and have the attached monitor or TV display them accordingly?
                There is no standard KMS HDR API and no support for HDR yet in desktop environments. There are a number of ongoing RFCs and discussions about defining the APIs, etc.
