
Previewing The Radeon Gallium3D Shader Optimizations

Published on 7 May 2013
Written by Michael Larabel

With the AMD R600 Gallium3D shader optimizing back-end having been merged last week, new benchmarks were carried out at Phoronix to see the impact of the experimental shader optimizations on multiple AMD Radeon HD graphics cards.

The shader optimization back-end is the work of Vadim Girlin and was started months ago. He has made many significant optimizations that affect the open-source AMD Linux driver's performance. The upstream open-source AMD Linux developers haven't been particularly excited about it, since they already plan to eventually use their LLVM GPU back-end rather than this code. The optimized shader back-end was nevertheless merged last week and can be optionally turned on.

By default the R600 Gallium3D driver does not take advantage of Girlin's shader work; the optimizations must be enabled via the "R600_DEBUG=sb" environment variable. The "R600_DEBUG=sbcl" option additionally optimizes compute shaders rather than just graphics shaders, while "R600_DEBUG=sbstat" dumps per-shader optimization statistics to the output.
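As a minimal sketch, enabling the back-end amounts to setting the environment variable before launching an OpenGL application; the "my_gl_app" binary below is a placeholder, not something from the article:

```shell
# Select the optional SB shader back-end for processes launched from this shell.
# Use "sbcl" to also optimize compute shaders, or "sbstat" to dump statistics.
export R600_DEBUG=sb
echo "R600_DEBUG=$R600_DEBUG"
# ./my_gl_app    # placeholder: launch any OpenGL application with the variable set
```

Setting the variable inline (e.g. `R600_DEBUG=sb ./my_gl_app`) limits it to a single run instead of the whole shell session.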

While this R600 "SB" work is still actively being debugged and worked on, yesterday evening I began running some tests on different AMD Radeon graphics cards. From an Ubuntu 13.04 system with the Linux 3.9 kernel and Xfce 4.10, Mesa 9.2.0 was pulled from Git on 6 May as of revision c9cf83b (following the most recent "r600/sb" commits). Swap buffers wait was disabled for the xf86-video-ati DDX during testing.
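The swap-buffers-wait behavior mentioned above is controlled through an xf86-video-ati driver option in xorg.conf; a sketch of the relevant device section, assuming a stock single-GPU configuration:

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Disable waiting on vertical refresh during swap buffers,
    # so benchmark results are not capped at the display refresh rate.
    Option     "SwapbuffersWait" "false"
EndSection
```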

The graphics cards used during testing were the Radeon HD 5830, HD 6570, and HD 6770, running a variety of Linux OpenGL games making use of GLSL. Some other graphics cards were also tested, but a few regressions were noted. It's possible the regressions are not related to the shader optimizations themselves and were simply introduced in Mesa since the 15-way Linux GPU comparison, but time wasn't spent debugging the problems further.
