AMD Radeon HD 8670D: Gallium3D vs. Catalyst
This morning I published RadeonSI Gallium3D vs. AMD Catalyst Linux benchmarks for the high-end Radeon HD 7850/7950 "Southern Islands" graphics cards. While the new Southern Islands GPUs understandably have a long way to go before their new open-source Linux Gallium3D driver catches up to Catalyst, how does the AMD Radeon HD 8670D "Richland" APU compare between the open and closed-source drivers? Here are some benchmarks.
Last week I began publishing AMD A10-6800K Richland APU benchmarks under Linux after having bought one of these new AMD APUs. I also delivered some early Radeon HD 8670D Linux benchmarks for the integrated graphics, which are only a modest upgrade over the previous-generation AMD Trinity APUs. While marketed as having Radeon HD 8670D graphics, the GPU is not from the HD 8000 "Sea Islands" family or even the HD 7000 "Southern Islands" family, but is in fact pre-GCN and relies on the R600 Gallium3D driver.
When using the R600 Gallium3D driver on the open-source side, graphics performance has historically been low due to the lack of dynamic power management support in the Radeon DRM driver for re-clocking the integrated graphics hardware. That support is set to be merged into the Linux 3.11 kernel. In this article the open-source driver benchmarks come only from the Linux 3.10 kernel, but there will be dynamic power management benchmarks for the Richland APU in a Phoronix article in the coming days. Likewise, there will also be HD 8670D benchmarks when testing the experimental R600g SB shader optimizations and other non-default performance optimizations.
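For readers wanting to experiment ahead of those follow-up articles, both features are opt-in. A minimal sketch of how they would be toggled, assuming a Linux 3.11 kernel with the new power management code and a Mesa 9.2-devel build with the SB back-end (the GRUB file path is the Ubuntu default; glxgears stands in for any OpenGL program):

```shell
# Dynamic power management is off by default in Linux 3.11;
# it is enabled with the radeon.dpm=1 kernel boot parameter,
# e.g. appended in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=1"
# followed by: sudo update-grub && reboot

# The experimental SB shader optimizer in R600g can be enabled
# per-process via the R600_DEBUG environment variable:
R600_DEBUG=sb glxgears
```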
With that said, this article is just a straightforward comparison between the open and closed-source drivers for the AMD A10-6800K with Radeon HD 8670D graphics. The APU, which was overclocked to 4.70GHz, was running an Ubuntu 13.10 development snapshot as of last week. The Catalyst driver was the 13.6 Beta with fglrx 13.10.10 and OpenGL 4.2.12337. The open-source stack consisted of the Linux 3.10 kernel, xf86-video-ati 7.1.99 Git, LLVM 3.3, and Mesa 9.2.0-devel as of last week. Aside from disabling the swap buffers wait, it was a stock performance comparison, though separate Phoronix articles will look at dynamic power management, the shader back-end optimizations, and other OpenGL performance tuning options. Xfce 4.10 was the desktop in use on Ubuntu 13.10.
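For reference, disabling the swap buffers wait on the open-source side is done through an xf86-video-ati driver option in the X.Org configuration. A minimal sketch (the file name and "Radeon" identifier are arbitrary examples, not the exact configuration used for this testing):

```
# /etc/X11/xorg.conf.d/20-radeon.conf (hypothetical file name)
Section "Device"
    Identifier "Radeon"                 # arbitrary identifier
    Driver     "radeon"
    Option     "SwapbuffersWait" "off"  # do not sync buffer swaps to vblank
EndSection
```

Turning this option off removes the vertical-blank synchronization on buffer swaps, which would otherwise cap frame rates at the display's refresh rate and skew driver-to-driver comparisons.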