Previewing The Radeon Gallium3D Shader Optimizations

Written by Michael Larabel
Published on 7 May 2013
Page 1 of 3

With the AMD R600 Gallium3D shader optimizing back-end having been merged last week, new benchmarks were carried out at Phoronix to see the impact of the experimental shader optimizations on multiple AMD Radeon HD graphics cards.

The shader optimization back-end is the work of Vadim Girlin and was started months ago. He's made many significant optimizations that affect the open-source AMD Linux driver's performance. The upstream open-source AMD Linux developers haven't been particularly excited about it, since they're already planning to eventually use their LLVM GPU back-end rather than this code. However, the optimized shader back-end was merged last week and can be optionally turned on.

By default the R600 Gallium3D driver doesn't take advantage of Girlin's shader work; the optimizations must be enabled via the "R600_DEBUG=sb" environment variable. There's also the "R600_DEBUG=sbcl" option for optimizing compute shaders in addition to graphics shaders, and "R600_DEBUG=sbstat" for dumping shader optimization statistics to the output.
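Since R600_DEBUG is a per-process environment variable, the optimizer can be toggled per application without any system-wide change. A minimal sketch of how the options described above would be used from a terminal (the game binary names here are placeholders, not from the article):

```shell
# Enable the SB shader optimizer for a single OpenGL application
R600_DEBUG=sb ./openarena

# Also run the SB optimizer on compute shaders, not just graphics shaders
R600_DEBUG=sbcl ./my-compute-app

# Dump per-shader optimization statistics to the terminal while running
R600_DEBUG=sbstat glxgears
```

Because nothing is changed globally, this makes it easy to A/B benchmark the same title with and without the optimizations.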

While this R600 "SB" work is still actively being debugged and improved, yesterday evening I began running tests on different AMD Radeon graphics cards. On an Ubuntu 13.04 system with the Linux 3.9 kernel and Xfce 4.10, Mesa 9.2.0 was pulled from Git on 6 May at revision c9cf83b (following the most recent "r600/sb" commits). Swap buffers wait was disabled for the xf86-video-ati DDX during testing.
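For readers wanting to reproduce the setup, disabling the swap-buffers wait in the xf86-video-ati DDX is done through an Xorg driver option; a sketch of the relevant xorg.conf fragment, assuming the standard radeon driver option name:

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Don't stall SwapBuffers on vblank, so benchmarks aren't capped at the refresh rate
    Option     "SwapbuffersWait" "off"
EndSection
```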

The graphics cards used during testing were the Radeon HD 5830, HD 6570, and HD 6770, across a variety of Linux OpenGL games making use of GLSL. Some other graphics cards were also tried, but a few regressions were noted (it's possible the regressions aren't related to the shader optimizations themselves and were instead introduced in Mesa after the 15-way Linux GPU comparison, but time wasn't spent debugging the problems further). E.g.:
