The Grinch That Stole The Radeon Gallium3D Performance?

Written by Michael Larabel in Display Drivers on 22 December 2011 at 10:08 AM EST.

There are some significant performance drops right now in Mesa master for the forthcoming 7.12/8.0 release affecting the Gallium3D driver for older ATI Radeon graphics processors. The performance of the R300g driver has regressed compared to earlier Mesa releases.

This week I was running some new benchmarks of the R300g Radeon Gallium3D driver, the open-source user-space driver that handles ATI/AMD GPUs from the R300 through R500 (Radeon X1000) series. When benchmarking the latest Gallium3D driver code and Mesa itself, along with the latest Linux kernel DRM and xf86-video-ati DDX, the OpenGL performance is now degraded compared to Mesa 7.11 and earlier. The expectation was that the performance would be close to (or better than) the earlier 7.11 release, since that was the case when last checked earlier in the 7.12-devel cycle, but it's a different story now for some OpenGL workloads.

With a Radeon X1950PRO 256MB graphics card, I used the Linux 3.2 development kernel and the xf86-video-ati 6.14.99 DDX while testing the latest Mesa 7.12-devel Git (9f8573b), then switched to the latest branches for 7.11 (7.11.2), 7.10 (7.10.3), 7.9 (7.9.2), and 7.8 (7.8.3) and tested those versions of the R300g driver. After discovering that the performance of Mesa 7.12-devel Git had fallen off a cliff in some of the common tests, I carried out similar tests on a completely different system (a ThinkPad with Mobility Radeon X1400 graphics) to confirm these findings.

Radeon R300g Mesa Comparison

On both systems, SwapbuffersWait was manually disabled, while color tiling is enabled by default for the R500 hardware. The first batch of OpenGL benchmarks comparing the Mesa versions while running on the Linux 3.2 kernel DRM is from the Radeon X1950PRO (RV570) system.
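For anyone wanting to replicate this setup, a minimal sketch of the relevant Device section in xorg.conf follows, assuming the standard xf86-video-ati option names; whether ColorTiling needs to be set explicitly depends upon the defaults of the driver version in use:

    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        # Disable waiting on buffer swaps so benchmark frame rates are not capped
        Option     "SwapbuffersWait" "off"
        # Color tiling; enabled by default on this R500-class hardware with recent drivers
        Option     "ColorTiling"     "on"
    EndSection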

