Greater Radeon Gallium3D Shader Optimization Tests


  • Azultra
    replied
    For the first time in a year, I've switched back to the FOSS drivers on my HD 5600 Mobility laptop, and I enabled the shader backend optimizations with a script in .kde/env:
    Code:
    $ env|grep R600
    R600_DEBUG=sb
    and WOW! Don't Starve works better than with fglrx (no cloud glitch and perfectly smooth*), Portal works perfectly, and Stacking works BETTER than it did on Windows when I played it a year ago. All of them are playable at the highest settings with FXAA or MSAA at a good framerate.
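
    (If anyone wants to replicate this, here's a minimal sketch of such a startup script; the filename is just an example, and scripts in ~/.kde/env are sourced when the KDE session starts:)
    Code:
    #!/bin/sh
    # ~/.kde/env/r600-sb.sh (example name), run at KDE login
    # Enable the r600g shader backend (sb) optimizations for the whole session
    export R600_DEBUG=sb
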
    Only Surgeon Simulator 2013, a Unity 4-based game (http://www.youtube.com/watch?v=ZO10bAn_8M8), is glitchy, but it's already almost as fast as it was on fglrx (though not as fast as on Windows).
    So, amazing job, guys! What's missing now is proper power management, but I can live with switching between the low and high power profiles for a while.
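
    (For anyone wondering, that switching goes through the radeon KMS sysfs interface; a sketch, assuming the GPU is card0:)
    Code:
    # Select the manual, profile-based power method (run as root)
    echo profile > /sys/class/drm/card0/device/power_method
    # Then pick a profile, e.g. "low" on battery or "high" for gaming
    echo low > /sys/class/drm/card0/device/power_profile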


    *: Actually, Don't Starve is smoother than with fglrx for the first 45-60 minutes, but then, just as with fglrx, it becomes choppy (usually when my character loses his sanity), and the slowdown/choppiness is much more noticeable than with fglrx. A game restart fixes it. BTW, it's not a 2D game: it's 3D with a not-so-low polygon count (not just simple sprites) and shader-based effects.
    But overall it feels so much better because it doesn't have those "micro-freezes" fglrx has at all times; it's really perfectly smooth.
    Last edited by Azultra; 11 May 2013, 11:14 AM.



  • vadimg
    replied
    Originally posted by brent
    Hmm, I'm not very familiar with Doom 3, but I was under the impression it used somewhat complex pixel shaders at least. AFAIK it was one of the first games to really push shader hardware, with high quality per-pixel lighting and shadows.
    Possibly Doom 3's shaders were somewhat complex for the hardware available in 2004 (IIRC that was the era of the Radeon X800 series, the R4xx chips), but for newer hardware they are pretty trivial compared to, e.g., the shaders used by the Unigine demos. They're not even written in GLSL; they're ARB programs.

    Originally posted by brent View Post
    Can you tell us more about the optimizations Catalyst uses? Does it convert to a more favorable texture format under the hood, or something like that?
    As I said, I didn't look into it, but I suspect it does something like that.
    There were also some shader tweaks that were probably included as app-specific optimizations in the proprietary drivers, e.g. using GPU math instead of a texture lookup to get precomputed values: http://forum.beyond3d.com/showthread.php?t=12732

    And now we have the Doom 3 sources, so it might be easier and more effective to optimize the game itself for modern hardware than to optimize the drivers for this one game.
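
    (For reference, the GPL source release lives on id Software's GitHub; a sketch, assuming the repository name is unchanged:)
    Code:
    # Grab the Doom 3 GPL source release
    git clone https://github.com/id-Software/DOOM-3.git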



  • brent
    replied
    Hmm, I'm not very familiar with Doom 3, but I was under the impression it used somewhat complex pixel shaders at least. AFAIK it was one of the first games to really push shader hardware, with high quality per-pixel lighting and shadows.

    Can you tell us more about the optimizations Catalyst uses? Does it convert to a more favorable texture format under the hood, or something like that?



  • vadimg
    replied
    Originally posted by brent
    Pretty sure there's something wrong with Michael's benchmarks, at least for the Unigine and Doom 3 tests. These are x86 binaries, and it very much looks like he didn't correctly compile/install the updated x86 Mesa libraries, since the results are identical. Yet again I have to wonder why Michael didn't investigate this at all.
    I don't think there will be any improvement with r600-sb for Doom 3 anyway; it uses simple shaders where it's hard to find anything to optimize. It's like trying to optimize a "Hello, world" program.

    AFAIK Catalyst uses optimizations related to texture formats ("Catalyst AI" or something like that; I didn't look into what exactly it does), and that option doubled Doom 3 performance with fglrx for me the last time I tested. r600g may need the same optimizations.



  • brent
    replied
    Pretty sure there's something wrong with Michael's benchmarks, at least for the Unigine and Doom 3 tests. These are x86 binaries, and it very much looks like he didn't correctly compile/install the updated x86 Mesa libraries, since the results are identical. Yet again I have to wonder why Michael didn't investigate this at all.
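
    (A quick sanity check, sketched here assuming glxinfo is installed, would be to ask the GL stack which Mesa build it actually loaded:)
    Code:
    # Which Mesa build and renderer does GLX report?
    glxinfo | grep -E "OpenGL (renderer|version) string"
    # Verbose loader output shows which DRI driver .so actually gets loaded
    LIBGL_DEBUG=verbose glxinfo 2>&1 | head -n 20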



  • darkbasic
    replied
    So, what does this mean? Did you do something wrong, or do the optimizations apply only to his card?



  • phoronix
    started a topic Greater Radeon Gallium3D Shader Optimization Tests


    Phoronix: Greater Radeon Gallium3D Shader Optimization Tests

    After delivering preview benchmarks of the AMD Radeon Gallium3D driver's new shader optimization back-end, Vadim Girlin, the back-end's author, has shared some complementary Linux OpenGL benchmark results...
