Previewing The Radeon Gallium3D Shader Optimizations

Written by Michael Larabel
Published on 7 May 2013
Page 1 of 3

With the AMD R600 Gallium3D shader optimizing back-end having been merged last week, new benchmarks were carried out at Phoronix to see the impact of the experimental shader optimizations on multiple AMD Radeon HD graphics cards.

The shader optimization back-end is the work of Vadim Girlin and began months ago. He has made many significant optimizations that improve the open-source AMD Linux driver's performance. The upstream open-source AMD developers haven't been particularly excited about it, since they plan to eventually rely upon their LLVM GPU back-end rather than this code. However, the optimized shader back-end was merged last week and can be optionally enabled.

By default the R600 Gallium3D driver doesn't take advantage of Girlin's shader work; it must be enabled via the "R600_DEBUG=sb" environment variable. The "R600_DEBUG=sbcl" option additionally optimizes compute shaders rather than just graphics shaders, while "R600_DEBUG=sbstat" dumps shader optimization statistics to the output.
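As a quick sketch of how these options would be used (glxgears here is just a stand-in for any OpenGL application; combining flags with a comma follows the usual Mesa debug-variable convention):

```shell
# Enable the SB shader optimizer for graphics shaders only:
R600_DEBUG=sb glxgears

# Also run the optimizer on compute shaders:
R600_DEBUG=sbcl glxgears

# Enable the optimizer and dump per-shader statistics to the terminal
# (comma-separated flags, per Mesa's debug-variable convention):
R600_DEBUG=sb,sbstat glxgears
```

Since these are environment variables rather than persistent settings, they only affect the processes launched with them, which makes A/B benchmarking of the optimizer straightforward.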

While this R600 "SB" work is still being actively debugged and refined, yesterday evening I began running tests on different AMD Radeon graphics cards. From an Ubuntu 13.04 system with the Linux 3.9 kernel and Xfce 4.10, Mesa 9.2.0 was pulled from Git on 6 May as of revision c9cf83b (following the most recent "r600/sb" commits). Swap buffers wait was disabled for the xf86-video-ati DDX during testing.

The graphics cards used during testing were the Radeon HD 5830, HD 6570, and HD 6770, running a variety of Linux OpenGL games that make use of GLSL. Other graphics cards were also attempted, but a few regressions were noted; they may not be related to the shader optimizations themselves and could simply have been introduced in Mesa after the 15-way Linux GPU comparison, though time wasn't spent debugging the problems further.
