Previewing The Radeon Gallium3D Shader Optimizations

Michael Larabel

Published on 7 May 2013
Page 1 of 3

With the AMD R600 Gallium3D shader optimizing back-end having been merged last week, new benchmarks were carried out at Phoronix to see the impact of the experimental shader optimizations on multiple AMD Radeon HD graphics cards.

The shader optimization back-end is the work of Vadim Girlin and started months ago. He has made many significant optimizations that improve the open-source AMD Linux driver's performance. The upstream open-source AMD developers haven't been particularly excited about it, since they plan to eventually rely on their LLVM GPU back-end rather than this code. However, the optimized shader back-end was merged last week and can be optionally enabled.

By default the R600 Gallium3D driver doesn't take advantage of Girlin's shader work; it must be enabled via the "R600_DEBUG=sb" environment variable. The "R600_DEBUG=sbcl" option additionally optimizes compute shaders rather than just graphics shaders, while "R600_DEBUG=sbstat" dumps shader optimization statistics to the output.
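As a quick sketch of how these variables would be used in practice, the per-command form below sets the variable only for one program invocation rather than the whole session (the `env | grep` command merely stands in for launching an actual OpenGL game):

```shell
#!/bin/sh
# Enable the "SB" shader optimization back-end for a single run.
# Replace the `env | grep ...` placeholder with your OpenGL game binary.
R600_DEBUG=sb env | grep '^R600_DEBUG='      # prints R600_DEBUG=sb

# Variants described in the article:
#   R600_DEBUG=sbcl   — also run the optimizer on compute shaders
#   R600_DEBUG=sbstat — dump shader optimization statistics
```

Using the per-command form avoids leaving the debug option set for unrelated OpenGL applications started later in the same shell.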

While this R600 "SB" work is still being actively debugged and improved, yesterday evening I began running tests on different AMD Radeon graphics cards. The test system ran Ubuntu 13.04 with the Linux 3.9 kernel and Xfce 4.10; Mesa 9.2.0 was pulled from Git on 6 May at revision c9cf83b (following the most recent "r600/sb" commits). Swap-buffers wait was disabled for the xf86-video-ati DDX during testing.
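For reference, disabling swap-buffers wait for the xf86-video-ati DDX is done with a driver option in xorg.conf; the snippet below is a minimal sketch, and the "Radeon" identifier is only a placeholder:

```
# xorg.conf fragment — disable swap-buffers wait for xf86-video-ati
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    Option     "SwapbuffersWait" "false"
EndSection
```

Disabling this option unsynchronizes buffer swaps from the display's vertical refresh, which is common practice when benchmarking so frame rates aren't capped at the refresh rate.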

The graphics cards used during testing were the Radeon HD 5830, HD 6570, and HD 6770, tested with a variety of Linux OpenGL games that make use of GLSL. Some other graphics cards were also attempted, but a few regressions were noted. (It's possible the regressions aren't related to the shader optimizations themselves but were instead introduced in Mesa after the 15-way Linux GPU comparison; time wasn't spent debugging the problems further.) For example:
