Running The RadeonSI NIR Back-End With Mesa 19.1 Git


  • phoronix
    started a topic Running The RadeonSI NIR Back-End With Mesa 19.1 Git

    Phoronix: Running The RadeonSI NIR Back-End With Mesa 19.1 Git

    It's been a number of months since we last tried the RadeonSI NIR back-end, which is being developed as part of the OpenGL 4.6 SPIR-V support for this AMD OpenGL driver; eventually RadeonSI may switch from TGSI to NIR by default. Given the time since our last look and the increasing popularity of NIR, this weekend I ran some fresh tests of the NIR back-end with a Radeon Vega graphics card.

    http://www.phoronix.com/vr.php?view=27529
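
    For anyone who wants to reproduce the article's setup, the NIR back-end is toggled through an environment variable. A minimal sketch (the commented-out `./superposition` binary name is a placeholder for whatever OpenGL application you test):

    ```shell
    # Minimal sketch: enable RadeonSI's NIR back-end for this shell session.
    export R600_DEBUG=nir            # route shader compilation through NIR instead of TGSI
    env | grep '^R600_DEBUG='        # sanity check that the variable is set

    # Or scope it to a single run only ("./superposition" is a placeholder name):
    # R600_DEBUG=nir ./superposition
    ```
    
    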

  • nuetzel
    replied
    Originally posted by Espionage724 View Post
    Does anyone experience longer shader compile times with NIR? I'm not entirely sure if that's the right description for the problem.

    With RuneScape's NXT (C++) client, if I use NIR, new shaders that need to be cached pause rendering noticeably longer (like 3-7 second pauses initially, without a cache, when teleporting to a new area) than without NIR. But once the shaders are cached, there's no problem. It does this on LLVM 7.0 and Mesa 18.1.3 (or something in the 18 series on Fedora 29), and on a git build of Mesa 19 and LLVM 9.0 (che-mesa and che-llvm on F29).
    Yes, it's there, but it's being worked on (by Timo). (Have a look at mesa-devel: 'nir - compiler optimization' / slow compile times...)

    Leave a comment:


  • Espionage724
    replied
    Does anyone experience longer shader compile times with NIR? I'm not entirely sure if that's the right description for the problem.

    With RuneScape's NXT (C++) client, if I use NIR, new shaders that need to be cached pause rendering noticeably longer (like 3-7 second pauses initially, without a cache, when teleporting to a new area) than without NIR. But once the shaders are cached, there's no problem. It does this on LLVM 7.0 and Mesa 18.1.3 (or something in the 18 series on Fedora 29), and on a git build of Mesa 19 and LLVM 9.0 (che-mesa and che-llvm on F29).
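
    One way to make that cold-cache stall reproducible is to point Mesa's on-disk shader cache at an empty directory before each run. A sketch, assuming the Mesa 18/19-era `MESA_GLSL_CACHE_DIR` variable (newer Mesa renamed the cache variables to `MESA_SHADER_CACHE_*`), with the client binary name left as a placeholder:

    ```shell
    # Sketch: force cold-cache shader compiles so the stall can be timed.
    # MESA_GLSL_CACHE_DIR is the Mesa 18/19-era variable name (assumption:
    # newer Mesa uses MESA_SHADER_CACHE_DIR instead).
    cache_dir=$(mktemp -d)                    # empty cache => every shader compiles fresh
    export MESA_GLSL_CACHE_DIR="$cache_dir"
    export R600_DEBUG=nir
    echo "shader cache at: $MESA_GLSL_CACHE_DIR"
    # ./rs2client                             # placeholder for the NXT client binary
    ```
    
    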

    Leave a comment:


  • nuetzel
    replied
    Originally posted by Strunkenbold View Post
    Can confirm. Those sdma patches immediately freeze the whole system. Even if it's just the desktop.
    It is FIXED with Marek's v2 version (Patchwork Mesa).
    https://lists.freedesktop.org/archiv...ry/215579.html

    Leave a comment:


  • Strunkenbold
    replied
    Can confirm. Those sdma patches immediately freeze the whole system. Even if it's just the desktop.

    Leave a comment:


  • nuetzel
    replied
    Originally posted by nuetzel View Post

    Note:
    It was tested with Vega, here.
    I see one little triangle corruption under UH with Polaris 20 (RX580). - Marek knows that.
    The above corruption is FIXED with current Mesa git.
    Yes, this is SOLVED with NIR, even on Polaris. - GREAT.
    (Have to bisect for the right commit.)

    Marek's latest SDMA patches:
    BUT sadly a reliable _whole_ system hang under UH and UV with my Polaris 20 (Nitro+) => VM faults (I'll have to dig.)
    Reverting the SDMA patches (1-4) on Polaris SOLVED this.

    Leave a comment:


  • clapbr
    replied
    Originally posted by cb88 View Post

    You should also log CPU usage during the run...
    This. You don't even know whether that test case is CPU-bottlenecked for you, and you assumed the wrong thing as usual.

    Leave a comment:


  • cb88
    replied
    Originally posted by debianxfce View Post
    I tested NIR with the Superposition 1080p extreme benchmark and RX570:
    Code:
                Original     R600_DEBUG=nir
    points      1867         1878
    avg fps     13.97        14.05
    No reason to use R600_DEBUG=nir.
    You should also log CPU usage during the run...
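
    Logging CPU usage alongside a run needs no extra tools; a sketch that samples the aggregate `cpu` line of /proc/stat before and after an interval (the 1-second sleep is a stand-in for the benchmark run):

    ```shell
    # Sketch: measure system-wide CPU busy time over an interval using only
    # /proc/stat. The 1-second sleep is a stand-in for the benchmark run.
    sample() {
      # print "idle total" jiffies from the aggregate "cpu " line
      awk '/^cpu /{ idle=$5; t=0; for (i=2; i<=NF; i++) t+=$i; print idle, t }' /proc/stat
    }
    set -- $(sample); idle1=$1; total1=$2
    sleep 1                                   # benchmark would run here
    set -- $(sample); idle2=$1; total2=$2
    span=$(( total2 - total1 ))               # elapsed jiffies across all CPUs
    busy=$(( span - (idle2 - idle1) ))        # jiffies not spent idle
    echo "CPU busy over interval: $busy / $span jiffies"
    ```

    A high busy/span ratio on a single benchmark thread's core would suggest the test case is CPU-bound rather than GPU-bound.
    
    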

    Leave a comment:


  • debianxfce
    replied
    I tested NIR with the Superposition 1080p extreme benchmark and RX570:
    Code:
                Original     R600_DEBUG=nir
    points      1867         1878
    avg fps     13.97        14.05
    No reason to use R600_DEBUG=nir.

    Leave a comment:


  • Dukenukemx
    replied
    Originally posted by Marc Driftmeyer View Post

    And yet no one cares in the Windows world as games use DirectX and not OpenGL.
    Emulators such as CEMU, YUZU, and Citra tend to use OpenGL, and they all run like crap on AMD in Windows due to poor OpenGL drivers. Run these emulators on Linux and the performance is far better.

    Leave a comment:
