Radeon R600 Gallium3D NIR Backend Continues Advancing


    Phoronix: Radeon R600 Gallium3D NIR Backend Continues Advancing

    While the open-source Radeon Linux graphics driver allows pre-GCN AMD graphics card owners to continue making use of their hardware, there are diminishing returns: newer games require Vulkan, which is not supported by pre-HD 7000 series cards, and more recent generations offer far greater performance and efficiency. In any case, if you are still using a Radeon HD 2000 through HD 6000 series graphics card, some new life is being pushed into the open-source driver via the in-development NIR back-end...

    http://www.phoronix.com/scan.php?pag...NIR-More-Fixes

  • #2
    I still run a Radeon HD 3650 and welcome any improvements for these generations of chips. Unfortunately, fan control does not seem to work: the fan always spins at some low level (maybe 40%), so the card gets really hot under load. Anybody else having this problem? (dpm is enabled...)
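    For what it's worth, on kernels where the radeon driver exposes a hwmon interface, the fan can sometimes be driven manually from sysfs. A rough sketch, assuming the card is card0 and the hwmon index is hwmon0 (both vary per system, so check the paths first):

    ```shell
    # Assumption: the radeon driver exposes pwm1 for this card; verify with
    #   ls /sys/class/drm/card0/device/hwmon/
    HWMON=/sys/class/drm/card0/device/hwmon/hwmon0
    echo 1   | sudo tee "$HWMON/pwm1_enable"   # 1 = manual control, 2 = automatic
    echo 191 | sudo tee "$HWMON/pwm1"          # duty cycle 0-255; 191 is roughly 75%
    cat "$HWMON/temp1_input"                   # GPU temperature in millidegrees C
    ```

    If pwm1 isn't there at all, the kernel likely doesn't support fan control for that ASIC and the stuck-at-40% behaviour would be the firmware default.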

    • #3
      I still have a running HD5770 card with a slightly modified air-cooling solution (the original fan became noisy) that I use w/Linux.

      One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

      Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

      I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
      Last edited by ermo; 07-20-2020, 10:06 AM.

      • #4
        Does the NIR backend need to be switched on, or will it be on by default? If it needs to be switched on, how?

        • #5
          Originally posted by ermo View Post
          I still have a running HD5770 card with a slightly modified air-cooling solution (the original fan became noisy) that I use w/Linux.

          One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

          Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

          I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
          Unfortunately, that generation of cards is not that well supported on Linux, probably because AMD wasn't putting much effort into performance back then. On Linux there were constant issues with the SB shader optimizer, OpenCL support is still stuck at the 1.1 level with Clover, and we are missing out on all the AMDGPU, RadeonSI and Vulkan efforts targeting GCN and newer. AMD put much more effort into GCN on Linux, as that was when they wanted to enter the HPC GPU market and needed to invest more in their Linux software stack.

          The OpenGL gaming performance on Windows was originally better for my 6770M, but even on Windows there are now a lot of issues; e.g. for my Intel HD 3000 / 6770M laptop combination you need to disable the ultra-low-power mode for the AMD dGPU or you face minute-long lags on the desktop, which of course hurts battery life. And DirectX 11 is no longer usable with current versions of Windows 10 (possibly due to detection issues, as the Intel iGPU doesn't support it). So you either have to use an outdated OS (Windows 7), sacrifice DX11 (DX9 was fine) plus apply some hacks on Windows 10, or sacrifice 3D performance and features on Linux, where at least power savings work for laptops. I chose to use Linux on that laptop and relegated it to office use only. As it turns 9 years old next month, I am thankful that it still serves me well.

          • #6
            Originally posted by ermo View Post
            I still have a running HD5770 card with a slightly modified air-cooling solution (the original fan became noisy) that I use w/Linux.

            One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

            Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

            I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
            My first guess would be different shader compilers - we didn't move to the current LLVM shader compiler as default until GCN, and LLVM wasn't such a good fit for the VLIW hardware.

            You could try running at low and high resolution to see if the performance delta between Linux and OS X changed significantly - if it is the shader compiler then I would expect you to see more of a performance delta at high resolutions.

            There was a back-end optimizer written for the VLIW parts, but I don't remember if it was enabled by default - it did improve shader performance quite a bit IIRC.
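            The resolution comparison suggested above can be done with any benchmark that takes a window size; this sketch assumes glmark2 is installed, and the sizes are arbitrary examples:

            ```shell
            # If the score barely drops when going from ~0.5 Mpixels to ~2 Mpixels
            # (about 4x the pixels), the bottleneck is likely shader/CPU-side
            # rather than fill rate.
            lo=$((800 * 600))
            hi=$((1920 * 1080))
            echo "pixel ratio: $((hi / lo))x"
            glmark2 -s 800x600
            glmark2 -s 1920x1080
            ```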

            • #7
              Originally posted by ms178 View Post
              Unfortunately, that generation of cards is not that well supported on Linux, probably because AMD wasn't putting much effort into performance back then. On Linux there were constant issues with the SB shader optimizer, OpenCL support is still stuck at the 1.1 level with Clover, and we are missing out on all the AMDGPU, RadeonSI and Vulkan efforts targeting GCN and newer.
              Ahh, SB - that's the name I was trying to remember. Thanks!

              IIRC back then the main priority from our users was improving functionality, primarily higher GL levels in order to support new games.

              Originally posted by ms178 View Post
              AMD put much more effort into GCN on Linux, as that was when they wanted to enter the HPC GPU market and needed to invest more in their Linux software stack.
              It wasn't "much more effort"** as much as getting agreement to largely combine the open source and closed source driver efforts into a single all-open stack with a couple of optional closed-source components, giving us a big increase in the number of people working on the upstream driver code.

              ** although we had been building up the open source team incrementally since 2007 and are still expanding it even today
              Last edited by bridgman; 07-21-2020, 04:57 AM.

              • #8
                Originally posted by staggerlee View Post
                Does the NIR backend need to be switched on, or will it be on by default? If it needs to be switched on, how?
                R600_DEBUG=nir. Defaulting to the NIR backend is really not an option at this time: some features are missing, some things are buggy, and currently I can only test on an HD 5450 card. Dave Airlie contributed some patches to support some specifics of Cayman, but that will not be enabled by the above flag.
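                To spell out the flag usage (glxgears is just an example client; any GL application works the same way, assuming a Mesa build that includes the r600 NIR code):

                ```shell
                # One-off run with the experimental NIR backend:
                R600_DEBUG=nir glxgears

                # Or export it so every GL app in the session uses it:
                export R600_DEBUG=nir
                ```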

                • #9
                  Originally posted by bridgman View Post
                  There was a back-end optimizer written for the VLIW parts but i don't remember if it was enabled by default - it did improve shader performance quite a bit IIRC.
                  The optimizer *sb* is enabled by default for supported shaders, but it is disabled for tessellation, compute, and any shader that uses image I/O or atomics, and its code is in a state that makes it very difficult to fix things, let alone add the missing pieces.
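                  For anyone wanting to measure how much SB actually helps on their workload, it can be disabled per-run with the corresponding r600 debug flag (glxgears again only as an example client):

                  ```shell
                  # Run without the SB optimizer to compare against the default:
                  R600_DEBUG=nosb glxgears

                  # Mesa's debug variables also accept "help" to list recognized flags:
                  R600_DEBUG=help glxinfo >/dev/null
                  ```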

                  • #10
                    Still crawlin' along with a passively-modded HD 4650 AGP on an old P4 DAW, which is about a month away from retirement. Browsing overly bloated websites is a nightmare, but it works great as a guitar amp sim with guitarix.

                    A few weeks ago I installed Ubuntu 20.04 for an elderly person on an E-450 laptop with a 6470M GPU. It seems to be working okay with VA-API-accelerated Firefox on Wayland.
