AMDGPU/RadeonSI Linux 4.10 + Mesa 17.1-dev vs. NVIDIA 378.09 Performance

  • #61
    I've stopped caring about just the performance numbers, which is why I will never buy Nvidia hardware again. I just picked up a dirt cheap used R9 285 and it plays all my games fluidly at 1080p on high settings. Good enough for me! It's also damn nice running in Wayland and not having to worry about the Nvidia blob breaking with a kernel update.

  • #62
    Originally posted by geearf:
        Hmmm, what about on-demand shader compilation?
        That seemed somewhat common with UE4.
    Shader compilation is done explicitly by the programmer.
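    A minimal sketch of what "explicitly" means here, in C with desktop OpenGL; it assumes a current GL context and a loader such as glad, and the helper name build_program is illustrative:

    Code:
        /* Explicit GLSL compilation: the application hands source strings
         * to the driver and triggers the compile and link steps itself. */
        #include <glad/glad.h>

        GLuint build_program(const char *vs_src, const char *fs_src)
        {
            GLuint vs = glCreateShader(GL_VERTEX_SHADER);
            glShaderSource(vs, 1, &vs_src, NULL);
            glCompileShader(vs);                  /* explicit compile */

            GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
            glShaderSource(fs, 1, &fs_src, NULL);
            glCompileShader(fs);

            GLuint prog = glCreateProgram();
            glAttachShader(prog, vs);
            glAttachShader(prog, fs);
            glLinkProgram(prog);                  /* explicit link */

            glDeleteShader(vs);                   /* program keeps the code */
            glDeleteShader(fs);
            return prog;
        }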

  • #63
    Originally posted by efikkan:
        Shader compilation is done explicitly by the programmer.
    Yes, it's not done by magic when there's a rainbow under the golden tree, I get that.

  • #64
    Well, all I can say to Nvidia fanboys is this: enjoy your corrupted game graphics. As an open source AMD driver user, mine are flawless. Thank you, AMD! I mean, really, what's an extra 20 fps when it's absolutely guaranteed to have corruption in it somewhere? Nvidia fanboys, compare the OSS driver's anti-aliasing vs. the Nvidia driver's anti-aliasing. Go ahead, I dare you. A hint: it's highly corrupted on Nvidia's driver. It's been badly broken for years, since at least the GeForce 7xxx series.

  • #65
    Originally posted by geearf:
        Is there any other downside than loading time for that? Maybe hogging resources or something like that?
    Not really.

    I'm not an expert in these game engines, but I think they take a mix-and-match approach to shaders, which can lead to a combinatorial blowup in the number of possible shaders that have to be compiled, and it can get out of hand.

    For example, perhaps every single material is associated with some shader code, and the engine provides 128 materials. Every shader then has 128 variants that need to be compiled. Combine that with 50 different parameters that can each modify a shader and the number of combinations runs into the millions, which is far too many to compile in a reasonable amount of time. So the engine just compiles them on demand, based on what the game actually needs to show rather than all possible options (see the sketch below).

    Of course, there are likely better ways of handling all that. I think it's primarily something the game engines did because it was easy to do that way and they didn't have to architect a better system.
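    A hedged sketch of that on-demand scheme in C with OpenGL; the feature bitmask, the shader_cache table, and the uber-shader sources are illustrative assumptions, not any particular engine's code (build_program is the helper from #62):

    Code:
        /* On-demand shader variants: each combination of features is a
         * separate program, compiled the first time it is actually needed
         * by prepending #define lines to a single uber-shader source. */
        #include <glad/glad.h>
        #include <stdio.h>

        #define MAX_VARIANTS 1024                  /* illustrative cap */

        typedef struct { unsigned key; GLuint prog; } CacheEntry;
        static CacheEntry shader_cache[MAX_VARIANTS];
        static int cache_count = 0;

        extern GLuint build_program(const char *vs, const char *fs); /* see #62 */
        extern const char *uber_vs_src, *uber_fs_src;   /* hypothetical assets */

        GLuint get_variant(unsigned feature_bits)
        {
            for (int i = 0; i < cache_count; i++)   /* cache hit: no compile */
                if (shader_cache[i].key == feature_bits)
                    return shader_cache[i].prog;

            /* Cache miss: splice feature #defines in front of the source and
             * compile right now -- this hitch is the stutter under discussion. */
            static char fs[65536];
            snprintf(fs, sizeof fs,
                     "#version 330 core\n"
                     "#define USE_NORMAL_MAP %u\n"
                     "#define USE_FOG %u\n"
                     "%s",
                     (feature_bits >> 0) & 1u,
                     (feature_bits >> 1) & 1u,
                     uber_fs_src);

            GLuint prog = build_program(uber_vs_src, fs);
            if (cache_count < MAX_VARIANTS)
                shader_cache[cache_count++] = (CacheEntry){ feature_bits, prog };
            return prog;
        }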

  • #66
    Originally posted by smitty3268:
        Not really.

        I'm not an expert in these game engines, but I think they take a mix-and-match approach to shaders, which can lead to a combinatorial blowup in the number of possible shaders that have to be compiled, and it can get out of hand.

        For example, perhaps every single material is associated with some shader code, and the engine provides 128 materials. Every shader then has 128 variants that need to be compiled. Combine that with 50 different parameters that can each modify a shader and the number of combinations runs into the millions, which is far too many to compile in a reasonable amount of time. So the engine just compiles them on demand, based on what the game actually needs to show rather than all possible options.

        Of course, there are likely better ways of handling all that. I think it's primarily something the game engines did because it was easy to do that way and they didn't have to architect a better system.
    That seems a fair answer, thank you!

  • #67
    Originally posted by smitty3268:
        I'm not an expert in these game engines, but I think they take a mix-and-match approach to shaders, which can lead to a combinatorial blowup in the number of possible shaders that have to be compiled, and it can get out of hand.
    Why are you trying to explain to us something you clearly don't comprehend yourself?

    A shader cache only speeds up the compilation step of shader loading; the program still has to be linked and hooked up to its uniforms, which is also slow. Even the shader cache itself has significant latency. So anyone who does this during rendering, even with a shader cache, will get stutter.

    Originally posted by smitty3268:
        For example, perhaps every single material is associated with some shader code, and the engine provides 128 materials. Every shader then has 128 variants that need to be compiled. Combine that with 50 different parameters that can each modify a shader and the number of combinations runs into the millions, which is far too many to compile in a reasonable amount of time. So the engine just compiles them on demand, based on what the game actually needs to show rather than all possible options.
    And there you gave me definite evidence that you don't know anything about what shader programs really are.

    Shader programs are never recompiled based on "parameters", never!
    Shaders have something we call "uniforms", whose locations the developer looks up once the program is linked, but whose data can be updated at any time without any recompilation or reactivation of the shader program. E.g. if I have a matrix representing the position and orientation of an object, I can update it between draw calls to draw the next object, or the same one at a different location (see the sketch below).
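    A minimal illustration of that uniform update in C with OpenGL, assuming a current GL context and an already linked program whose vertex shader declares mat4 u_model (an illustrative name):

    Code:
        /* Same compiled program, same mesh, two placements: only the
         * uniform data changes between draw calls -- no recompilation. */
        #include <glad/glad.h>

        void draw_twice(GLuint prog, GLuint vao,
                        const float *xform_a,   /* 4x4 model matrices, */
                        const float *xform_b)   /* column-major        */
        {
            glUseProgram(prog);
            GLint loc = glGetUniformLocation(prog, "u_model");
            glBindVertexArray(vao);

            glUniformMatrix4fv(loc, 1, GL_FALSE, xform_a);  /* first placement  */
            glDrawArrays(GL_TRIANGLES, 0, 36);
            glUniformMatrix4fv(loc, 1, GL_FALSE, xform_b);  /* second placement */
            glDrawArrays(GL_TRIANGLES, 0, 36);
        }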

  • #68
    If it wasn't for the gallium9 patches for Wine, I'd probably use Nvidia even though I don't like Nvidia as a company. The 270X I own works well enough without any problems. Open source drivers just seem to be more stable on Linux than the proprietary ones, go figure.

  • #69
    Originally posted by efikkan:
        Why are you trying to explain to us something you clearly don't comprehend yourself?

        A shader cache only speeds up the compilation step of shader loading; the program still has to be linked and hooked up to its uniforms, which is also slow. Even the shader cache itself has significant latency. So anyone who does this during rendering, even with a shader cache, will get stutter.
    No one is claiming it's a magic bullet which will solve all problems. Well, maybe some are, but not me. But even if it only cuts 50% of the time spent, that significantly reduces the stutter.

    Originally posted by efikkan:
        And there you gave me definite evidence that you don't know anything about what shader programs really are.

        Shader programs are never recompiled based on "parameters", never!
        Shaders have something we call "uniforms", whose locations the developer looks up once the program is linked, but whose data can be updated at any time without any recompilation or reactivation of the shader program. E.g. if I have a matrix representing the position and orientation of an object, I can update it between draw calls to draw the next object, or the same one at a different location.
    I'm not sure if you're intentionally misunderstanding me, or if I just suck at explaining.

    I'm not claiming the shader is recompiled just because new parameters are passed in.

    I'm saying the engine dynamically builds a GLSL source string, and that results in completely different shaders being created left and right. If they were smart about it, they would use uniforms and have the shader handle it properly; instead they just generate new GLSL code and compile a new shader (see the sketch below).

    If you have experience with Unreal Engine 3 and can better explain what that engine is doing, then by all means, we're listening. But so far it seems that it's you who doesn't really understand what's actually happening in these games. And just to clarify: I'm specifically speaking about certain games, like Borderlands, not about how a proper OpenGL game "should" work.
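    For contrast, a hedged sketch of the uniform-based alternative described above: one compiled fragment shader whose behavior is switched by a uniform instead of by regenerating GLSL (the uniform names are illustrative):

    Code:
        /* One shader, many behaviors: the feature switch is a uniform,
         * so no new GLSL is generated and nothing recompiles at runtime. */
        #include <glad/glad.h>

        /* Fragment shader source; assume it was compiled and linked once
         * into 'prog' with a helper like build_program from #62. */
        static const char *fs_src =
            "#version 330 core\n"
            "uniform bool u_use_fog;\n"
            "uniform vec4 u_fog_color;\n"
            "in vec4 v_color;\n"
            "out vec4 frag;\n"
            "void main() {\n"
            "    frag = u_use_fog ? mix(v_color, u_fog_color, 0.5) : v_color;\n"
            "}\n";

        void draw_object(GLuint prog, GLuint vao, int use_fog)
        {
            glUseProgram(prog);
            /* Flip the switch per object: a uniform write, not a compile. */
            glUniform1i(glGetUniformLocation(prog, "u_use_fog"), use_fog);
            glBindVertexArray(vao);
            glDrawArrays(GL_TRIANGLES, 0, 36);
        }

    The trade-off is real, though: a uniform branch costs a little GPU work on every pixel, while the compile-time #define variants the engines use produce specialized code at the price of the compilation stutter discussed above.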
    Last edited by smitty3268; 04 February 2017, 01:55 PM.
