From the AMD A10-6800K "Richland" APU I've delivered OpenGL Linux benchmarks of the Radeon HD 8670D graphics and also compared the open-source Gallium3D performance to that of Catalyst. Catalyst still reigns supreme, but in this article are some benchmarks comparing the performance of Mesa 9.1 and Mesa 9.2 Git, and also the results when deploying the experimental R600 SB shader optimization back-end.
The open-source Radeon Gallium3D benchmarks previously run on the A10-6800K at Phoronix have been from Mesa 9.2 Git, so those numbers aren't new, but the Mesa 9.1 numbers provide a reference for those sticking to stable package versions. More interesting, though, is looking at the R600 shader optimization back-end.
The Radeon Gallium3D shader optimization code has shown success for games using GLSL and outperforms the default shader compiler back-end. AMD, though, doesn't enable this "SB" back-end by default, as they remain committed to their LLVM compiler architecture, which is needed for compute support. Anyhow, the SB back-end can be easily enabled on Mesa 9.2 by setting the R600_DEBUG=sb environment variable.
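As a minimal sketch of what that looks like in practice (assuming a Mesa 9.2 build with the R600 Gallium3D driver; glxgears is just an example OpenGL client):

```shell
# Enable the experimental SB shader optimization back-end for a single run:
#   R600_DEBUG=sb glxgears
# Or export it so everything launched from this shell session uses it:
export R600_DEBUG=sb
echo "$R600_DEBUG"
```

Since it is only an environment variable, no rebuild is needed; unsetting it (or launching from a clean shell) falls back to the default shader back-end.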
For today's testing, the A10-6800K was run on the Linux 3.10 kernel, but there will be Linux 3.11 tests once there's an actual release candidate (the dynamic power management code was only merged to mainline yesterday...). The Richland APU was also tested when overclocked to 4.70GHz and was running Ubuntu 13.10 with the Xfce desktop environment.
The Mesa 9.1.4 Git revision tested was cda92f5 and the Mesa Git master code was as of Git revision 9ef49cf.