AMD Gallium3D Shader Optimizations Yield Big Gains
With it looking like the R600 SB shader optimization back-end may be enabled by default for AMD's popular Radeon Gallium3D graphics driver, here are fresh benchmarks measuring the performance impact of this shader optimization code-path, which can dramatically increase the performance of GLSL-using OpenGL games.
Vadim Girlin is the developer who has spent months spearheading the R600 shader optimization back-end, and last month he proposed that it be enabled by default. R600 SB was merged in late April and is only found in Mesa 9.2 code and newer; for now it requires setting the R600_DEBUG=sb environment variable, but it can dramatically improve performance.
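Enabling the back-end is just a matter of setting that environment variable before launching an OpenGL application. A minimal sketch (the game binary name in the comment is purely illustrative, not from the article):

```shell
# Toggle the R600 SB shader optimization back-end (Mesa 9.2+).
# For a single program, prefix the command, e.g.:
#   R600_DEBUG=sb ./openarena
# Or export the variable for the whole shell session:
export R600_DEBUG=sb
echo "$R600_DEBUG"
```

Unsetting the variable (or simply not setting it) falls back to the default, unoptimized shader back-end, which is how the with/without comparisons below were produced.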
In this article are new benchmarks of AMD Radeon HD 4000/5000/6000 series graphics cards on the latest Mesa/Gallium3D R600 code as of last week, running the Linux 3.11 development kernel with Radeon DPM (dynamic power management) enabled. Swap-buffers-wait was of course disabled, and the only change between test runs for each graphics card was toggling the R600_DEBUG=sb environment variable to enable or disable this shader optimization back-end.
A range of Linux OpenGL games/benchmarks was used for testing; sadly, the Unigine and id Software tests didn't work due to Xubuntu 13.10 currently fumbling its ia32-libs package. The test bed was an Intel Core i7 4770K "Haswell" system, and the graphics cards tested were a Radeon HD 4890, HD 5830, HD 6450, HD 6570, HD 6770, HD 6870, and HD 6950.
If you're curious how the latest open-source Radeon Gallium3D performance compares to the AMD Catalyst Linux driver, see AMD Radeon HD 6000 Series Open-Source Driver Becomes More Competitive and Radeon HD 5000 Series Gallium3D Performance vs. Catalyst.