
Thread: Greater Radeon Gallium3D Shader Optimization Tests

  1. #1
    Join Date
    Jan 2007
    Posts
    14,622

    Default Greater Radeon Gallium3D Shader Optimization Tests

    Phoronix: Greater Radeon Gallium3D Shader Optimization Tests

    After delivering preview benchmarks of the AMD Radeon Gallium3D driver's new shader optimization back-end, Vadim Girlin, the back-end's author, has shared some complementary Linux OpenGL benchmark results...

    http://www.phoronix.com/vr.php?view=MTM2NzM

  2. #2
    Join Date
    Nov 2009
    Location
    Italy
    Posts
    918

    Default

    So, what does it mean? Did you do something wrong, or do the optimizations apply only to his card?

  3. #3
    Join Date
    Jan 2010
    Posts
    363

    Default

    Pretty sure there's something wrong with Michael's benchmarks, at least for the Unigine and Doom 3 tests. These are x86 binaries, and it very much looks like he didn't correctly compile/install the updated Mesa x86 libraries - the results are identical. Yet again I have to wonder why Michael didn't investigate this at all.
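
    A quick way to check what a 32-bit binary actually picks up (just my suggestion; exact paths depend on the distro):
    Code:
    # print the renderer/version strings Mesa reports; LIBGL_DEBUG makes
    # the libGL loader log which DRI driver it actually loads
    glxinfo | grep -E "OpenGL (renderer|version)"
    LIBGL_DEBUG=verbose glxgears 2>&1 | head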

  4. #4
    Join Date
    Apr 2013
    Posts
    39

    Default

    Quote Originally Posted by brent View Post
    Pretty sure there's something wrong with Michael's benchmarks, at least for the Unigine and Doom 3 tests. These are x86 binaries, and it very much looks like he didn't correctly compile/install the updated Mesa x86 libraries - the results are identical. Yet again I have to wonder why Michael didn't investigate this at all.
    I think there will be no improvements with r600-sb for Doom 3 anyway; it uses simple shaders where it's hard to find anything to optimize. It's like trying to optimize a "Hello world" program.

    AFAIK Catalyst uses optimizations related to texture formats ("Catalyst AI" or something like that; I didn't look into what exactly it does, though), and that option doubled Doom 3 performance with fglrx for me the last time I tested. r600g may need the same kind of optimizations.

  5. #5
    Join Date
    Jan 2010
    Posts
    363

    Default

    Hmm, I'm not very familiar with Doom 3, but I was under the impression that it used somewhat complex pixel shaders, at least. AFAIK it was one of the first games to really push shader hardware, with high-quality per-pixel lighting and shadows.

    Can you tell us more about the optimizations Catalyst uses? Does it convert textures to a more favorable format under the hood, or something like that?

  6. #6
    Join Date
    Apr 2013
    Posts
    39

    Default

    Quote Originally Posted by brent View Post
    Hmm, I'm not very familiar with Doom 3, but I was under the impression that it used somewhat complex pixel shaders, at least. AFAIK it was one of the first games to really push shader hardware, with high-quality per-pixel lighting and shadows.
    Possibly Doom 3's shaders were somewhat complex for the hardware available in 2004 (IIRC that was the era of the Radeon X800 series, i.e. R4xx chips), but for newer hardware they are pretty trivial compared to, e.g., the shaders used by the Unigine demos. They aren't even written in GLSL; they're ARB programs.

    Quote Originally Posted by brent View Post
    Can you tell us more about the optimizations Catalyst uses? Does it convert textures to a more favorable format under the hood, or something like that?
    As I said, I didn't look into it, but so far I think it does something like that.
    There were also some shader tweaks that were probably included as app-specific optimizations in the proprietary drivers, e.g. using GPU math instead of a texture lookup to fetch precomputed values: http://forum.beyond3d.com/showthread.php?t=12732

    And now we have the Doom 3 sources, so it might be easier and more effective to optimize the game itself for modern hardware than to optimize the drivers for this game.

  7. #7
    Join Date
    Feb 2009
    Location
    France
    Posts
    35

    Default

    For the first time in a year, I've switched back to the FOSS drivers on my Mobility Radeon HD 5600 laptop. I enabled the shader back-end optimizations with a script in .kde/env:
    Code:
    $ env|grep R600
    R600_DEBUG=sb
    and WOW! Don't Starve works better than with fglrx (no cloud glitch, and perfectly smooth*), Portal works perfectly, and Stacking works BETTER than it did on Windows when I played it a year ago. All of them are playable at the highest settings with FXAA or MSAA at a good framerate.
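
    For reference, the script is nothing fancy (the filename is my own choice; any executable script dropped in ~/.kde/env/ is sourced at KDE session startup):
    Code:
    # ~/.kde/env/r600-sb.sh
    export R600_DEBUG=sb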

    Only Surgeon Simulator 2013, a Unity 4-based game (http://www.youtube.com/watch?v=ZO10bAn_8M8), is glitchy, but it's already almost as fast as it was on fglrx (though not as fast as on Windows).

    So, amazing job guys! What's missing now is proper power management, but I can live with switching between the low and high profiles for a while.
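
    The switching I mean is the radeon KMS sysfs interface (assuming card0; needs root):
    Code:
    # profile-based power management on pre-DPM kernels
    echo profile > /sys/class/drm/card0/device/power_method
    echo low > /sys/class/drm/card0/device/power_profile   # or: mid, high, auto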


    *: Actually, Don't Starve is smoother than with fglrx for the first 45-60 minutes, but then, as happened with fglrx, it becomes choppy (usually when my character loses his sanity), and the slowdown/choppiness is much more noticeable than with fglrx. A game restart fixes it. BTW it's not a 2D game; it's 3D with a not-so-low polygon count (it's not just simple sprites) and shader-based effects.
    But overall it feels so much better because it doesn't have those "micro-freezes" fglrx has at all times; it's really perfectly smooth.
    Last edited by Azultra; 05-11-2013 at 11:14 AM.

  8. #8
    Join Date
    Jan 2013
    Posts
    975

    Default

    Quote Originally Posted by Azultra View Post
    For the first time in a year, I've switched back to the FOSS drivers on my Mobility Radeon HD 5600 laptop. I enabled the shader back-end optimizations with a script in .kde/env:
    Code:
    $ env|grep R600
    R600_DEBUG=sb
    and WOW!

    ....

    *: Actually, Don't Starve is smoother than with fglrx for the first 45-60 minutes, but then, as happened with fglrx, it becomes choppy (usually when my character loses his sanity), and the slowdown/choppiness is much more noticeable than with fglrx. A game restart fixes it. BTW it's not a 2D game; it's 3D with a not-so-low polygon count (it's not just simple sprites) and shader-based effects.
    But overall it feels so much better because it doesn't have those "micro-freezes" fglrx has at all times; it's really perfectly smooth.
    Paypal: vadimgirlin at gmail dot com
    I will be donating a bit later this month; the guy clearly needs a better GPU card xD.

    Vadim's patches are awesome; radeon is starting to match and even outperform fglrx!

    The choppy issue you are having could be related to memory fragmentation, but within a 3D game that's usually handled by the game engine logic alone. Maybe the Linux version is simply bugged? Some tests on a GeForce might help confirm that.

    Thanks for the heads up!
    Last edited by brosis; 05-11-2013 at 02:22 PM.

  9. #9
    Join Date
    Apr 2013
    Posts
    39

    Default

    Quote Originally Posted by Azultra View Post
    Only Surgeon Simulator 2013, a Unity 4-based game (http://www.youtube.com/watch?v=ZO10bAn_8M8), is glitchy
    As far as I can see it's a typical z-fighting issue. I can reproduce it with the Linux demo, and I don't think it's a bug in r600g; ideally the game developers should take care of it. By the way, it doesn't happen for me with the Windows version of the demo (under Wine).

  10. #10
    Join Date
    Feb 2009
    Location
    France
    Posts
    35

    Default

    Quote Originally Posted by brosis View Post
    Paypal: vadimgirlin at gmail dot com
    I will be donating a bit later this month, the guy clearly needs better GPU card xD.
    Could you point me to where Vadim mentioned his current config?

    Chipping in so that he can get himself a good graphics card is the least we could do for his awesome work!


    And out of curiosity, could you or Vadim point me to mails or blog posts that explain a bit how you managed to get such a leap in shader performance?


    Anyway, I wanted to see whether the drivers would stand the test of fire by doing some stuff I'd never have considered with fglrx, i.e. running Windows Steam games through Wine. I'd never done that before, but the Windows version of Steam actually works great with Wine. The games themselves are another matter.
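
    Roughly what I did, from memory (installer URL and paths may be off):
    Code:
    wget http://media.steampowered.com/client/installer/SteamSetup.exe
    wine SteamSetup.exe
    wine "$HOME/.wine/drive_c/Program Files/Steam/Steam.exe"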

    Oddworld: Stranger's Wrath was ported to PC with OpenGL, so it could potentially run perfectly on Linux, and it basically works... but at 2 FPS. Is it falling back to software Mesa because of unsupported extensions?
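
    One way to tell (my guess at a check; a full software fallback should show up in the renderer string):
    Code:
    # llvmpipe/softpipe here would mean software rendering instead of r600g
    glxinfo | grep "OpenGL renderer"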

    The second game was Endless Space, a great 4X game based on Unity 3, so it can run in OpenGL mode with the -force-opengl flag on Windows. But Unity 3 seems to do something with OpenGL contexts that WGL tolerates and GLX doesn't, so I couldn't get it to run that way at all. It does launch in the default D3D9 mode, but all I get is a black background with weirdly colored stuff in place of the nebulae. Other people managed to get it working in D3D9:

    http://appdb.winehq.org/objectManage...sion&iId=26773

    So is it due to the drivers? Anyway, the best thing would be if Wine could make it run in OpenGL mode, and the drivers have nothing to do with that.
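
    For the record, the OpenGL attempt looked roughly like this (the path and exe name are from memory and may be off):
    Code:
    # Unity 3 Windows builds accept -force-opengl
    cd "$HOME/.wine/drive_c/Program Files/Steam/steamapps/common/Endless Space"
    wine EndlessSpace.exe -force-opengl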
    Last edited by Azultra; 05-12-2013 at 02:05 PM.
