Radeon DRM GPU Driver Performance: Linux 3.4 To 3.8


  • #11
    Originally posted by Rexilion View Post
    To the fullest extent, since Unity was used exactly the way it is normally used, all the way through the entire benchmark? (duh...)
    So you know exactly how the problems with Unity are caused, and that the driver version (and therefore its functionality and its particular set of bugs) has no influence on Unity's performance regressions?
    If so, please share your insight with the Unity developers, so that it can be fixed.



    • #12
      Originally posted by Vim_User View Post
      So you know exactly how the problems with Unity are caused, and that the driver version (and therefore its functionality and its particular set of bugs) has no influence on Unity's performance regressions?
      If so, please share your insight with the Unity developers, so that it can be fixed.
      No, I don't, and I never claimed to.

      My point is that the use of Unity is not an issue, since it is used consistently across all the benchmarks. So the (relative) numbers are accurate. If a benchmark is faster with Unity, it will very likely also be faster without it. I was stating that arguing about it is kind of nonsense. Unity is not the only factor influencing graphics performance.

      Furthermore, if it did, that would indicate a buggy driver, and the hypothetical inconsistent results could not be blamed on Unity's wrongdoing.



      • #13
        Originally posted by Rexilion View Post
        If a benchmark is faster with Unity, it will very likely also be faster without it.
        Sorry, I prefer accurate benchmarks, not benchmarks that are likely to be accurate.

        Unity is not the only factor influencing graphics performance.
        The point is not that Unity influences the benchmark; any WM will have some influence on a graphics benchmark. The point is that Unity influences the benchmark in an unpredictable way, leading to results that may or may not be accurate, making the whole benchmark useless.

        Furthermore, if it did, that would indicate a buggy driver, and the hypothetical inconsistent results could not be blamed on Unity's wrongdoing.
        You are aware that drivers change beyond just bug fixing? Changes that may have an impact on Unity's performance regressions?



        • #14
          Originally posted by Vim_User View Post
          Sorry, I prefer accurate benchmarks, not benchmarks that are likely to be accurate.
          Then you are in the wrong place. Phoronix tests distributions as-is, not some highly tweaked, exotic high-performance configuration, which would have much less practical use for assessing real-world performance. These setups include 'wildcards' such as Unity, and that is a perfectly fine decision.

          Yes, we could just test the game in an empty X server with all other TTYs disabled. But who plays games like that?
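          (Just to illustrate what such a sterile setup would look like: a minimal Python sketch that starts a benchmark as the only client of a bare X server on a spare display. The benchmark path is a placeholder and xinit is assumed to be installed; this is not how Phoronix actually runs its tests.)

          ```python
          #!/usr/bin/env python3
          # Minimal sketch: run a benchmark as the only client of a bare
          # X server, with no window manager involved at all.
          # Assumptions: xinit is installed; the benchmark path is made up.
          import subprocess

          BENCHMARK = "/usr/bin/my_benchmark"  # hypothetical benchmark binary
          DISPLAY = ":1"                       # spare display number

          # xinit starts an X server on :1 with the benchmark as its sole
          # client; the server exits again once the benchmark terminates.
          subprocess.run(["xinit", BENCHMARK, "--", DISPLAY], check=True)
          ```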

          Originally posted by Vim_User View Post
          The point is not that Unity influences the benchmark; any WM will have some influence on a graphics benchmark. The point is that Unity influences the benchmark in an unpredictable way, leading to results that may or may not be accurate, making the whole benchmark useless.
          Not really; it indicates game performance when using Unity, and for that purpose it is valid. If you want benchmarks without Unity, why not benchmark yourself? By the way, we all know that Unity and some other compositing window managers negatively affect game performance. That is kind of a shame, since people are doing their utmost to fix driver issues, only to have their work negated by compositing window managers.

          Originally posted by Vim_User View Post
          You are aware that drivers change beyond just bug fixing? Changes that may have an impact on Unity's performance regressions?
          You might want to (re?)read post #10 of this very same thread.



          • #15
            Originally posted by Rexilion View Post
            Then you are in the wrong place. Phoronix tests distributions as-is, not some highly tweaked, exotic high-performance configuration.
            Unity is pretty exotic to me. It's not a WM tuned for high-performance graphics and gaming. It's a good WM for everything else, though.



            • #16
              Originally posted by marek View Post
              Unity is pretty exotic to me. It's not a WM tuned for high-performance graphics and gaming. It's a good WM for everything else, though.
              I have never used it either. But it is the default DE for a very commonly used distro, a distro which is now also used by game developers as a 'base' development platform.



              • #17
                Originally posted by Rexilion View Post
                Then you are in the wrong place. Phoronix tests distributions as-is, not some highly tweaked, exotic high-performance configuration, which would have much less practical use for assessing real-world performance. These setups include 'wildcards' such as Unity, and that is a perfectly fine decision.

                Yes, we could just test the game in an empty X server with all other TTYs disabled. But who plays games like that?



                Not really; it indicates game performance when using Unity, and for that purpose it is valid. If you want benchmarks without Unity, why not benchmark yourself? By the way, we all know that Unity and some other compositing window managers negatively affect game performance. That is kind of a shame, since people are doing their utmost to fix driver issues, only to have their work negated by compositing window managers.



                You might want to (re?)read post #10 of this very same thread.
                The title of this article is "Radeon DRM GPU Driver Performance: Linux 3.4 To 3.8". It is meant to give an overview of performance improvements and regressions between the different driver versions. How can this be of any use when a WM is used that possibly distorts the results, but nobody knows when and how this happens? When somebody wants to benchmark with regard to a specific variable, here the driver version, he has to make sure that everything else that influences the benchmark stays exactly the same, but you cannot do that when using Unity.
                Hence my conclusion that this benchmark is useless.
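                To make that concrete, a small sketch (assuming glxinfo from the mesa-utils package is installed; nothing here is from the article) that records the variables that must stay constant between runs, so a result can be discarded if anything besides the kernel differed:

                ```python
                #!/usr/bin/env python3
                # Sketch: fingerprint a benchmark run so that, between runs,
                # only the kernel is supposed to differ.
                # Assumption: glxinfo (mesa-utils) is installed.
                import platform
                import subprocess

                def run_fingerprint() -> dict:
                    """Collect the variables that must stay constant across kernels."""
                    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
                    lines = out.splitlines()
                    renderer = next((l for l in lines if "OpenGL renderer" in l), "unknown")
                    gl_ver = next((l for l in lines if "OpenGL version" in l), "unknown")
                    return {
                        "kernel": platform.release(),  # the one variable under test
                        "renderer": renderer.strip(),  # GPU / driver in use
                        "gl_version": gl_ver.strip(),  # Mesa stack must not change
                    }

                if __name__ == "__main__":
                    for key, value in run_fingerprint().items():
                        print(f"{key}: {value}")
                ```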



                • #18
                  Originally posted by Vim_User View Post
                  The title of this article is "Radeon DRM GPU Driver Performance: Linux 3.4 To 3.8". It is meant to give an overview of performance improvements and regressions between the different driver versions. How can this be of any use when a WM is used that possibly distorts the results, but nobody knows when and how this happens? When somebody wants to benchmark with regard to a specific variable, here the driver version, he has to make sure that everything else that influences the benchmark stays exactly the same, but you cannot do that when using Unity.
                  Hence my conclusion that this benchmark is useless.
                  You are attributing magical abilities to Unity. How can a program that is bit-for-bit the same in each benchmark 'randomly' influence a benchmark that uses different kernels?

                  I know what you are getting at. I understand that Unity is kind of a wildcard. But that does not make the entire benchmark useless.
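                  (Whether the compositor really is bit-for-bit the same between runs is itself checkable. A tiny sketch; the binary path is an assumption, since Unity ran as a compiz plugin at the time and the path depends on the Ubuntu release:)

                  ```python
                  #!/usr/bin/env python3
                  # Tiny sketch: verify the compositor binary is bit-for-bit
                  # identical between test runs. The path is an assumption.
                  import hashlib

                  def sha256_of(path: str) -> str:
                      """Return the SHA-256 hex digest of a file."""
                      with open(path, "rb") as f:
                          return hashlib.sha256(f.read()).hexdigest()

                  print(sha256_of("/usr/bin/compiz"))  # hypothetical path
                  ```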



                  • #19
                    Originally posted by Rexilion View Post
                    You are attributing magical abilities to Unity. How can a program that is bit-for-bit the same in each benchmark 'randomly' influence a benchmark that uses different kernels?
                    Because the different kernels use different driver versions, it is possible that the functions Unity uses act differently on different kernels, possibly triggering Unity's performance regression, possibly not.
                    So if we see a performance gain on a newer kernel, the question is: is it caused by driver improvements? Is it caused by a rewritten function that no longer triggers the Unity regression, or triggers it less often than on an older driver?

                    All this can simply be avoided by using a WM without those performance problems.
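                    For example, a rough sketch of how a run under a plain, non-compositing WM could be scripted (openbox, xinit and the benchmark path are all my assumptions, not anything Phoronix does):

                    ```python
                    #!/usr/bin/env python3
                    # Rough sketch: run the benchmark under a plain,
                    # non-compositing WM instead of Unity.
                    # Assumptions: xinit and openbox are installed; the
                    # benchmark path is a placeholder.
                    import os
                    import subprocess
                    import time

                    BENCHMARK = "/usr/bin/my_benchmark"  # hypothetical
                    DISPLAY = ":2"                       # spare display number

                    # Start a bare X server whose only session client is openbox ...
                    session = subprocess.Popen(["xinit", "/usr/bin/openbox", "--", DISPLAY])
                    time.sleep(2)  # crude wait until the server and the WM are up

                    # ... then run the benchmark on that display.
                    env = dict(os.environ, DISPLAY=DISPLAY)
                    subprocess.run([BENCHMARK], env=env, check=True)

                    session.terminate()
                    ```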



                    • #20
                      Originally posted by Vim_User View Post
                      Because the different kernels use different driver versions, it is possible that the functions Unity uses act differently on different kernels, possibly triggering Unity's performance regression, possibly not.
                      So if we see a performance gain on a newer kernel, the question is: is it caused by driver improvements? Is it caused by a rewritten function that no longer triggers the Unity regression, or triggers it less often than on an older driver?

                      All this can simply be avoided by using a WM without those performance problems.
                      Ah, so there is my misunderstanding of your argument. You would want to upgrade the kernel ceteris paribus, with radeon still advertising Mesa's OpenGL 2.1.

                      Then you would only measure raw performance. But, in my opinion, good performance without features is no 'performance' at all if the missing features are a dealbreaker for other 3D apps. And I guess that OpenGL 3.0 is going to be pretty much the baseline in the future...

