Radeon R600 Gallium3D NIR Backend Continues Advancing

  • bridgman
    replied
    Originally posted by ermo View Post
    I still have a running HD5770 card with a slightly modified aircooling solution (the original fan became noisy) that I use w/Linux.

    One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

    Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

    I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
My first guess would be different shader compilers - we didn't move to the current LLVM shader compiler as the default until GCN, and LLVM wasn't such a good fit for the VLIW hardware.

    You could try running at low and high resolution to see if the performance delta between Linux and OS X changed significantly - if it is the shader compiler then I would expect you to see more of a performance delta at high resolutions.

There was a back-end optimizer written for the VLIW parts, but I don't remember if it was enabled by default - it did improve shader performance quite a bit IIRC.
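The resolution test suggested above can be sketched as two runs of the same benchmark at different window sizes (glmark2 is just one assumed example tool; any OpenGL benchmark with a configurable resolution works):

```shell
# Sketch of the suggested experiment, assuming glmark2 is installed.
# If the Linux-vs-OS X gap grows with resolution, per-pixel (shader)
# work is the likely bottleneck; if it stays roughly flat, look at
# driver/CPU overhead instead.
glmark2 --size 800x600      # low-resolution run
glmark2 --size 1920x1080    # high-resolution run
```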



  • ms178
    replied
    Originally posted by ermo View Post
    I still have a running HD5770 card with a slightly modified aircooling solution (the original fan became noisy) that I use w/Linux.

    One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

    Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

    I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
    Unfortunately, that generation of cards is not that well supported on Linux, probably because AMD wasn't putting much effort into performance back then. On Linux there were constant issues with the SB shader optimizer, OpenCL support with Clover is still stuck at the 1.1 level to this date, and we are missing out on all the AMDGPU, RadeonSI and Vulkan efforts targeting GCN and newer. AMD put much more effort into GCN on Linux, as that was when they wanted to enter the HPC GPU market and needed to invest more in their Linux software stack.

    The OpenGL gaming performance on Windows was originally better for my 6770M, but even on Windows there are now a lot of issues. For example, on my Intel HD 3000 / 6770M laptop combination you need to disable the ultra-low-power mode for the AMD dGPU, or else you face minute-long lags on the desktop - which of course hurts battery life. And DirectX 11 is not usable anymore with current versions of Windows 10 (possibly due to detection issues, as the Intel iGPU doesn't support it). So you either have to use an outdated OS (Windows 7), sacrifice DX11 (DX9 was fine) plus apply some hacks on Windows 10, or sacrifice 3D performance and features on Linux - but at least Linux works with power savings for laptops. I chose to use Linux on that laptop and relegated it to office use only. As it turns 9 years old next month, I am thankful that it still serves me well.



  • staggerlee
    replied
    Does the NIR back-end need to be switched on, or will it be enabled by default? If it needs to be switched on, how?



  • ermo
    replied
    I still have a running HD5770 card with a slightly modified aircooling solution (the original fan became noisy) that I use w/Linux.

    One thing I've noticed is that it performed vastly better (2x) on OS X Sierra when I used it in an old hackintosh.

    Does anyone know why the r600g OpenGL implementation is so relatively slow (and apparently buggy as well) compared to other OSes?

    I mean, sure, I know the card is well past its prime, but if it works and it draws relatively little power at idle I see no real reason to replace it...
    Last edited by ermo; 20 July 2020, 10:06 AM.



  • Lemmiwinks
    replied
    I still run a Radeon HD 3650 and welcome any improvements for these generations of chips. Unfortunately fan control does not seem to work. The fan is always spinning at some lower level (maybe 40%) so the card is getting really hot under load. Anybody else having this problem? (dpm is enabled...)



    Phoronix: Radeon R600 Gallium3D NIR Backend Continues Advancing

    While the open-source Radeon Linux graphics driver allows pre-GCN AMD graphics card owners to continue making use of their hardware, there are diminishing returns: newer games require Vulkan, which is not supported by pre-HD7000 series hardware, and more recent generations offer far greater performance and efficiency. In any case, if you are still using a Radeon HD 2000 through HD 6000 series graphics card, some new life is being breathed into the open-source driver via the in-development NIR back-end...
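    For readers wanting to experiment, Mesa's r600 driver gates experimental paths behind the R600_DEBUG environment variable; based on the development work at the time, the NIR back-end is reportedly opted into per-application as sketched below (the exact flag name is an assumption and may change as the work matures):

```shell
# Assumed flag: R600_DEBUG=nir opts the r600 Gallium3D driver into the
# in-development NIR shader path for a single application run.
R600_DEBUG=nir glxgears

# Ask the installed Mesa which debug options it actually understands
# (Mesa's debug-flag parser conventionally prints a list for "help"):
R600_DEBUG=help glxgears
```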
