RadeonSI OpenGL Compute Shader Patches Published


  • #21
    Originally posted by haagch View Post
    Yea, that's the complete backtrace.
    Try "thread apply all bt" instead, this could be separate rendering thread or corrupted stack.



  • #22
    Originally posted by Ancurio View Post
    Robustness is used exclusively for running WebGL; it's irrelevant for anything gaming or desktop related.
    Indeed, and they can't even accidentally use it, because it requires explicitly asking for a robust context:

    Robust buffer access can be enabled by creating a context with robust access enabled through the window system binding APIs. When enabled, any command unable to generate a GL error as described above, such as buffer object accesses from the active program, will not read or modify memory outside of the data store of the buffer object and will not result in GL interruption or termination.
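
    For illustration, here is a minimal sketch of what "explicitly asking for a robust context" looks like from the application side. It uses GLFW purely as an example windowing library (not something mentioned in this thread); under X11 the same request goes through GLX_ARB_create_context_robustness.

    Code:
    /* Sketch: requesting a robust OpenGL context with GLFW.
     * Without the GLFW_CONTEXT_ROBUSTNESS hint, the driver never
     * enables robust buffer access on its own. */
    #include <GLFW/glfw3.h>
    #include <stdio.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        /* Opt in to robust buffer access at context creation time. */
        glfwWindowHint(GLFW_CONTEXT_ROBUSTNESS, GLFW_LOSE_CONTEXT_ON_RESET);

        GLFWwindow *win = glfwCreateWindow(640, 480, "robust", NULL, NULL);
        if (!win) {
            /* Creation simply fails if the driver can't provide it. */
            glfwTerminate();
            return 1;
        }
        glfwMakeContextCurrent(win);

        if (glfwGetWindowAttrib(win, GLFW_CONTEXT_ROBUSTNESS) != GLFW_NO_ROBUSTNESS)
            printf("robust context created\n");

        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }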



  • #23
    Originally posted by CrystalGamma View Post
    Can you sum up what it actually does? I'm guessing process isolation has always been there, considering the security implications. So what does implementing this extension actually guarantee?
    It seems to add bounds checking to a bunch of different functions throughout the API.

    It replaces undefined behaviour with well-defined results: out-of-bounds reads return 0 and out-of-bounds writes are discarded (see the sketch below).
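
    To make that concrete, here is a toy C analogue of the guarantee - this is not driver or GL code, just a sketch of the semantics ARB_robust_buffer_access_behavior promises for out-of-bounds buffer accesses.

    Code:
    /* Toy model of ARB_robust_buffer_access_behavior semantics:
     * reads outside a buffer's data store return zero, writes outside
     * it are discarded, and the process is never terminated. */
    #include <stddef.h>
    #include <stdio.h>

    struct buffer { float *data; size_t len; };

    static float robust_read(const struct buffer *b, size_t i)
    {
        return (i < b->len) ? b->data[i] : 0.0f;  /* out of bounds -> 0 */
    }

    static void robust_write(struct buffer *b, size_t i, float v)
    {
        if (i < b->len)                           /* out of bounds -> dropped */
            b->data[i] = v;
    }

    int main(void)
    {
        float store[4] = {1.0f, 2.0f, 3.0f, 4.0f};
        struct buffer b = { store, 4 };

        robust_write(&b, 1000, 42.0f);            /* silently discarded */
        printf("%f\n", robust_read(&b, 1000));    /* prints 0.000000 */
        return 0;
    }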



  • #24
    Well, 50€...
    I first wondered whether to buy from https://www.bundlestars.com/en/bundl...-mordor-bundle but that would no doubt count as a Windows purchase...
    There is currently a sale for the same price here: https://www.indiegala.com/store/prod...-edition/51209 and they advertise Linux support... So if Feral doesn't get money from it, there's not much more I can do about that.
    Downloading now...



  • #25
    Originally posted by haagch View Post
    Well, 50€...
    Here is your 90% discount: https://www.g2a.com/middle-earth-sha...-global-1.html

    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)



  • #26
    So, Shadow of Mordor... Simply starting it with OpenGL 4.2 works, but then it looks like this:

    I guess it's still this: https://bugs.freedesktop.org/show_bug.cgi?id=92059

    Running it with MESA_GL_VERSION_OVERRIDE=4.3fc MESA_GLSL_VERSION_OVERRIDE=430fc produces a more complete image. Here is some gameplay:

    I don't know whether everything is rendering or whether e.g. rain is still missing; I've only played a few minutes.
    To be honest, with that kind of performance (i7-3632QM, HD 7970M) it's not really playable.

    And a bit off topic: Intel's driver is STILL very buggy with PRIME. Here is an example of fun corruption in Spotify after playing Shadow of Mordor with PRIME:
    https://www.youtube.com/watch?v=sIhqLRp5VF8
    I had to record it with a camera, because on a screencast no corruption was visible. People keep saying Intel has the best Linux driver, yet bugs like this keep showing up again and again...



  • #27
    Originally posted by haagch View Post
    People keep saying Intel has the best Linux driver, yet bugs like this keep showing up again and again...
    Intel's driver is the buggiest ever, IMHO.

    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)



  • #28
    Originally posted by zanny View Post
    Nobody has GL_ARB_robust_buffer_access_behavior yet, and it seems like it might be a PITA to implement rigorous bounds checking on buffers like that throughout the driver stack. RadeonSI might only need four extensions to reach 4.3, but there is always a reason they aren't already done - they are usually the hardest, or at least most annoying, ones to implement.
    I think GL_ARB_robust_buffer_access_behavior is one of the easiest, because our hardware does bounds checking automatically. Bounds checking is required anyway so that we don't get VM faults in dmesg, which proves that it works.



  • #29
    Originally posted by leonmaxx View Post
    Try "thread apply all bt" instead; this could be a separate rendering thread or a corrupted stack.
    Well, gdb does switch to the thread where the segfault happens, so I'm not sure what more there is to see: https://gist.github.com/ChristophHaa...9f6346858fe995. I think gdb would display a warning message if the stack were corrupted...

    Anyway, I experimented a bit, and as far as I can see, leaving CFLAGS at the default (i.e. not setting the variable at all) makes it segfault after the initial game loading screen, but setting CFLAGS to -ggdb makes it work - even when the driver libraries are stripped. That will make this difficult to debug. Oh well, at least it shouldn't slow down Mesa significantly...

    Originally posted by atomsymbol
    For the sake of completeness: what's the in-game graphics quality setting?
    I think I open the settings in the video at some point, but they are hard to read because of the Gallium HUD.
    Relatively high settings; I can try the lowest settings later. But I did notice that the GPU usage is relatively high - that is at least good.
    Some games like Saints Row IV run with very bad performance while having 15-30% CPU usage, and that shows that there are huge bottlenecks somewhere.



  • #30
    Originally posted by haagch View Post
    People keep saying Intel has the best Linux driver, yet bugs like this keep showing up again and again...
    And what do you think, since you also use the open AMD drivers?

