Previewing The Radeon Gallium3D Shader Optimizations

Michael Larabel

Published on 7 May 2013

With the AMD R600 Gallium3D shader optimizing back-end having been merged last week, new benchmarks were carried out at Phoronix to see the impact of the experimental shader optimizations on multiple AMD Radeon HD graphics cards.

The shader optimization back-end is the work of Vadim Girlin and was started months ago. He has made many significant optimizations that improve the open-source AMD Linux driver's performance. The upstream open-source AMD Linux developers haven't been particularly excited about the code, since they're already planning to eventually use their LLVM GPU back-end instead. However, the optimized shader back-end was merged last week and can be optionally enabled.

By default the R600 Gallium3D driver doesn't take advantage of Girlin's shader work; it must be enabled via the "R600_DEBUG=sb" environment variable. There's also a "R600_DEBUG=sbcl" option for optimizing compute shaders rather than just graphics shaders, and "R600_DEBUG=sbstat" for dumping shader optimization statistics to the output.
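Since R600_DEBUG is a per-process environment variable, the optimizer can be toggled on a per-application basis. A minimal sketch (glxgears stands in for any OpenGL application; only the R600_DEBUG values are taken from the article):

```shell
# Run one GL application with the SB shader optimizer enabled:
R600_DEBUG=sb glxgears

# Also apply the optimizer to compute shaders:
R600_DEBUG=sbcl glxgears

# Print per-shader optimization statistics instead:
R600_DEBUG=sbstat glxgears

# Or enable it for everything launched from this shell:
export R600_DEBUG=sb
```

This makes A/B benchmarking straightforward: run the same game once with the variable set and once without, with no driver rebuild or reboot required.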

While this R600 "SB" work is still actively being debugged and worked on, yesterday evening I began running some tests on different AMD Radeon graphics cards. From an Ubuntu 13.04 system with the Linux 3.9 kernel and Xfce 4.10, Mesa 9.2.0 was pulled from Git on 6 May as of revision c9cf83b (following the most recent "r600/sb" commits). Swap buffers wait was disabled for the xf86-video-ati DDX during testing.
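Disabling the swap-buffers wait is done through the DDX configuration rather than an environment variable. A minimal xorg.conf sketch, assuming the standard "SwapbuffersWait" option name from the radeon driver:

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Don't wait for vblank before dispatching buffer swaps,
    # so frame rates aren't capped at the display refresh rate:
    Option     "SwapbuffersWait" "off"
EndSection
```

Turning this off is common practice for GPU benchmarking, since vsync would otherwise hide performance differences between cards on lighter workloads.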

The graphics cards used during testing were the Radeon HD 5830, HD 6570, and HD 6770, running a variety of Linux OpenGL games that make use of GLSL. Some other graphics cards were also attempted, but a few regressions were noted; it's possible they're unrelated to the shader optimizations themselves and were simply introduced in Mesa after the 15-way Linux GPU comparison, but time wasn't spent debugging the problems further.
