R600g/RadeonSI Gallium3D Performance At The End Of 2017
One of the common test requests to come in for our end-of-year benchmarking has been a fresh look at Radeon GPU performance incorporating both old and new GPUs to see the current state of the open-source driver stack. Tests ranged from a Radeon HD 5830 on the Radeon+R600g driver stack to an RX Vega 64 on AMDGPU+RadeonSI, using the Linux 4.15-rc5 kernel paired with Mesa 17.4-dev.
Benchmarks were carried out on Ubuntu 16.04.3 LTS with Linux 4.15-rc5 and Mesa 17.4-dev built against LLVM 6.0 SVN via the Padoka PPA, across graphics cards spanning the Radeon HD 5800 series through Radeon RX Vega:
Radeon HD 5830
Radeon HD 6870
Radeon HD 6950
Radeon HD 7950
Radeon R7 260X
Radeon R9 270X
Radeon R9 290
Radeon RX 550
Radeon RX 560
Radeon RX 580
Radeon R9 Fury
Radeon RX Vega 56
Radeon RX Vega 64
The Radeon HD 5800/6900 series now exposes OpenGL 4.3 with the newest R600g Mesa code, while the Radeon HD 6800 series remains at OpenGL 3.3 due to being blocked on FP64 support. All RadeonSI cards currently remain at OpenGL 4.5 until the SPIR-V ingestion support is finished for handling OpenGL 4.6. For the GCN 1.0/1.1 graphics cards in this round of testing, I also opted to run them with the AMDGPU DRM driver rather than the Radeon DRM driver, given the recent Linux 4.15 AMDGPU vs. Radeon tests showing performance benefits when using the experimental AMDGPU support for Southern Islands and Sea Islands hardware.
In addition to running a range of benchmarks at 2560 x 1440 that could run across all of the graphics cards tested, the AC system power consumption was also monitored by the Phoronix Test Suite to provide performance-per-Watt metrics.