Radeon Gallium3D Moved Closer To Performance Parity With AMD's Catalyst In 2014

  • #31
    Originally posted by duby229 View Post
    Please try to explain one more time what the point is in optimizing for hundreds of frames when nobody will ever use them.
    Several reasons:
    1. It shows when the GPU is not running at its peak potential or efficiency. When you don't put a frame cap on, you expose the weaknesses of the GPU.
    2. Assuming the performance scales to more demanding games, you could end up well below 60FPS in titles where you actually want the extra headroom. The same goes for those who have weaker GPUs.
    3. Some people have monitors that run at 120Hz or higher.


    One thing I don't quite understand is that you ridicule this website for showing results over 60FPS (which is done for a good reason), yet I can't think of a single Windows hardware review site that puts a frame cap on its tests. I understand the desire for more "relevant" games - I see no problem in that. But like I said before, reviewing new titles to see how they play is not the same thing as reviewing drivers.

    Think of it like this - what you want is to taste different types of pies and see which pie tastes the best, so you know which brand (AMD/nvidia) to go for and which chef (GPU core) you think is the best. What Michael is doing is comparing two different apple pie recipes (drivers) and determining which recipe is better. Sure, both pies are going to taste fine to most people, but maybe one bakes faster or is healthier to eat. It doesn't matter to everyone, but it matters to some, and it is important.

    That's not to say that what you want is wrong, irrelevant, or stupid, just not what these articles are all about.
    Last edited by schmidtbag; 30 December 2014, 12:57 AM.

    Comment


    • #32
      Originally posted by schmidtbag View Post
      Several reasons:
      1. It shows when the GPU is not running at its peak potential or efficiency. When you don't put a frame cap on, you expose the weaknesses of the GPU.
      2. Assuming the performance scales to more demanding games, you could end up well below 60FPS in titles where you actually want the extra headroom. The same goes for those who have weaker GPUs.
      3. Some people have monitors that run at 120Hz or higher.
      Sorry for breaking up your post; I just wanted to reply to each point.

      It's not at all about a frame cap. Instead it's about adjusting resolution and quality settings to get the best compromise between framerate and image quality. The fact is that GPUs do a lot more than just spit out frames. When you get to high framerates the GPU is not the bottleneck (if it were, it wouldn't be running at such high framerates), so those numbers do NOT show the peak potential or the efficiency of the GPU. At those framerates the bottleneck is almost always the framebuffer or some code running on the CPU. Extrapolating to lower-end hardware or to higher quality settings doesn't work, because in those cases the bottleneck is going to be something else entirely. (There's a quick way to check which side is the bottleneck; see the sketch at the end of this post.)

      One thing I don't quite understand is that you ridicule this website for showing results over 60FPS (which is done for a good reason), yet I can't think of a single Windows hardware review site that puts a frame cap on its tests. I understand the desire for more "relevant" games - I see no problem in that. But like I said before, reviewing new titles to see how they play is not the same thing as reviewing drivers.

      Think of it like this - what you want is to taste different types of pies and see which pie tastes the best, so you know which brand (AMD/nvidia) to go for and which chef (GPU core) you think is the best. What Michael is doing is comparing two different apple pie recipes (drivers) and determining which recipe is better. Sure, both pies are going to taste fine to most people, but maybe one bakes faster or is healthier to eat. It doesn't matter to everyone, but it matters to some, and it is important.

      That's not to say that what you want is wrong, irrelevant, or stupid, just not what these articles are all about.
      You do have a good point about 120Hz and higher monitors. Your pie analogy makes no sense to me, though; I guess I'm just not connecting the dots between the analogy and reality.

      A website that does really good game benchmarks for gamers (on Windows) is HardOCP.
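
      To show what I mean about the bottleneck: a quick sanity check is to run the same scene at two resolutions and compare the frame times. A rough sketch of that comparison (just an example on my part; it assumes plain-text logs with one frame time in milliseconds per line, and the file names are made up):

      Code:
      # Rough bottleneck check: if the average frame time barely changes when the
      # resolution (and therefore the GPU load) changes, the game is likely
      # CPU/driver-bound rather than GPU-bound at that setting.
      # Assumes two plain-text logs with one frame time (ms) per line;
      # the file names below are only examples.

      def load_frame_times(path):
          with open(path) as f:
              return [float(line) for line in f if line.strip()]

      def average(values):
          return sum(values) / len(values)

      low_res = load_frame_times("frametimes_1280x720.txt")
      high_res = load_frame_times("frametimes_2560x1440.txt")

      low_avg, high_avg = average(low_res), average(high_res)

      print("720p  average: {:.2f} ms (~{:.0f} FPS)".format(low_avg, 1000 / low_avg))
      print("1440p average: {:.2f} ms (~{:.0f} FPS)".format(high_avg, 1000 / high_avg))

      # 1440p pushes roughly 4x the pixels of 720p; if the frame time hardly
      # moves, the GPU was not the limiting factor at the lower resolution.
      if high_avg / low_avg < 1.2:
          print("Frame time barely changed -> likely CPU/driver-bound")
      else:
          print("Frame time scaled with resolution -> likely GPU-bound")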
      Last edited by duby229; 30 December 2014, 01:35 AM.

      Comment


      • #33
        As far as which games get benchmarked... even that. Let's face it, Xonotic just isn't that demanding. It just doesn't implement that much functionality, so if that is your reason for trying to use it as a driver test or whatever, it just isn't good for that.

        I'm not bashing OSS games; they are fun to kill time on. But as a way to stress the drivers, they can't do that, and as a way to stress hardware, they can't do that either.
        Last edited by duby229; 30 December 2014, 01:48 AM.

        Comment


        • #34
          Originally posted by duby229 View Post
          As far as which games get benchmarked... even that. Let's face it, Xonotic just isn't that demanding. It just doesn't implement that much functionality, so if that is your reason for trying to use it as a driver test or whatever, it just isn't good for that.

          I'm not bashing OSS games; they are fun to kill time on. But as a way to stress the drivers, they can't do that, and as a way to stress hardware, they can't do that either.
          Xonotic is fine; Unigine is fine; everything is perfectly fine. These benchmarks are not there to demonstrate performance in the X and Y games that X and Y PC gamers on Windows play. You miss the point of Phoronix entirely. You can cry that FOSS FPS games don't stress GL functionality that much, yet you completely ignore the fact that the Unigine benchmarks do stress modern GL functionality from GL2 to GL4. FOSS drivers don't support GL 4.x yet, so wanting those demanding GL4 games is pointless anyway. You can also cry that framerates higher than 100 are not useful, but in reality they still are. Frame latency is more important than framerate.

          Comment


          • #35
            Originally posted by mmstick View Post
            Xonotic is fine; Unigine is fine; everything is perfectly fine. These benchmarks are not there to demonstrate performance in the X and Y games that X and Y PC gamers on Windows play. You miss the point of Phoronix entirely. You can cry that FOSS FPS games don't stress GL functionality that much, yet you completely ignore the fact that the Unigine benchmarks do stress modern GL functionality from GL2 to GL4. FOSS drivers don't support GL 4.x yet, so wanting those demanding GL4 games is pointless anyway. You can also cry that framerates higher than 100 are not useful, but in reality they still are. Frame latency is more important than framerate.
            And you completely skipped the fact that I never said anything about Unigine or Windows games. I do like Unigine and appreciate its inclusion on the occasions that it is included. And there are plenty of top-tier Linux games that do run well on the OSS drivers. They aren't as bad as you make them out to be.

            Must be nice to be able to put words in other people's mouths.

            Frame latency can still be measured without stupidly high framerates.
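
            To be concrete, a per-frame time log from a normal 60-120FPS run is all that's needed; something like this little sketch (the log file name is only an example, one frame time in milliseconds per line) already shows the latency spikes:

            Code:
            # Frame latency says more than average FPS: two runs can both average
            # 60 FPS while one of them stutters. This just summarizes a per-frame
            # time log (one frame time in milliseconds per line; the file name is
            # only an example).
            import statistics

            with open("frametimes.txt") as f:
                frame_times = sorted(float(line) for line in f if line.strip())

            def percentile(sorted_values, pct):
                # simple nearest-rank percentile, good enough for a forum example
                index = min(len(sorted_values) - 1, int(pct / 100.0 * len(sorted_values)))
                return sorted_values[index]

            avg = statistics.mean(frame_times)
            print("average frame time: {:.2f} ms (~{:.0f} FPS)".format(avg, 1000 / avg))
            print("99th percentile:    {:.2f} ms".format(percentile(frame_times, 99)))
            print("worst frame:        {:.2f} ms".format(frame_times[-1]))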

            Comment


            • #36
              Originally posted by duby229 View Post
              And you completely skipped the fact that I never said anything about Unigine or Windows games. I do like Unigine and appreciate its inclusion on the occasions that it is included. And there are plenty of top-tier Linux games that do run well on the OSS drivers. They aren't as bad as you make them out to be.

              Must be nice to be able to put words in other people's mouths.

              Frame latency can still be measured without stupidly high framerates.
              Apparently you are oblivious to the entire stream of comments left by you and others insinuating all of the above. There are no benchmarks in this article with stupidly high framerates. Stupidly high framerates would be Half-Life 1 on Steam, which gets over 3000 FPS on my graphics card. None of the benchmarks presented here have framerates anywhere near that level. If you want to claim that I'm putting words into other people's mouths, perhaps you should stop doing the same. I never stated that FOSS drivers are 'bad'. I only stated that FOSS drivers do not support OGL 4.x, and therefore the demanding games you are asking for that use 'modern GL functionality' cannot run on FOSS drivers to begin with -- that's a fact. I play Empire: Total War on Ultra at 4K resolution with the FOSS drivers for my 7950. I know full well what they are capable of.

              Comment


              • #37
                Kudos to AMD and the radeon and fglrx driver teams for a successful year.

                People should not miss one thing - the 14.12 aka Omega driver is used here, which specifically sped up current GCN 1.1+ hardware. That is what made the last-minute difference for the fglrx driver this year; keep it in mind especially when you look at the fglrx Hawaii R9 290 results.

                Comment


                • #38
                  Originally posted by Michael View Post
                  The tests used were ones that (0) can be automated well, (1) work on both the open and closed Linux drivers, and (2) also work fine under Windows at similar quality settings and can be widely reproduced.

                  With regard to the frame rate, I could easily test at 4K (as is often done in articles) or even just 2560x1600, but then I receive complaints that those resolutions are too uncommon and that I should test at more normal resolutions (e.g. 1080p). When I do tests at multiple resolutions, I get complaints that the articles are too long....
                  For what it's worth, I appreciate your current tests/resolution (as that's the one I play at).
                  You don't run the games that I play, but these days I play mostly through Wine and emulators anyway...


                  One thing that would be cool, if possible, would be to add some quality check on the rendering.
                  With your current setup we know how far apart (or not) the drivers are performance-wise, but maybe there is a quality cost for that difference. Or maybe fglrx is faster and renders better... but we cannot tell that currently.
                  Doing so would most likely affect the performance tests, so maybe it should be a separate pass?
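
                  For example (just a rough sketch on my part; the screenshot file names are placeholders), even a simple per-pixel diff of the same captured frame from both drivers would show whether they actually render the same image:

                  Code:
                  # Rough per-pixel comparison of the same frame captured on two drivers.
                  # Assumes two screenshots of identical size; the file names are placeholders.
                  from PIL import Image, ImageChops

                  radeonsi = Image.open("frame_radeonsi.png").convert("RGB")
                  fglrx = Image.open("frame_fglrx.png").convert("RGB")

                  diff = ImageChops.difference(radeonsi, fglrx)

                  # Mean absolute per-channel difference: 0 means identical output,
                  # larger values mean the drivers render the frame differently.
                  pixels = list(diff.getdata())
                  mean_diff = sum(sum(px) for px in pixels) / (len(pixels) * 3.0)
                  print("mean per-channel difference: {:.2f} (out of 255)".format(mean_diff))

                  # Save an amplified difference image to see where the drivers disagree.
                  diff.point(lambda v: min(255, v * 8)).save("driver_diff.png")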

                  Also, I have a question: since you are using LLVM and Mesa from Git, would it not make sense to also use linux-drm-next?

                  Thank you!

                  Comment


                  • #39
                    What I understand is that SI cards have a Constant Engine to get shader parameters (constants and configuration bits) directly into L2 when the shader is about to run and needs them.

                    Mesa doesn't use the Constant Engine at all at the moment, thus the shader takes longer to load, since the data is not in L2.
                    That probably explains the difference in GPUtest results.

                    Comment


                    • #40
                      These benchmarks don't tell the whole story.

                      I've been testing my 290X with Mesa 10.5 Git and the experience is much better than with fglrx.

                      With Ubuntu 14.10 everything is smooth: vsync works properly, window resizing is fast, and the Chrome and Steam client windows don't get shifted to the right (a nasty bug). Also, Source engine games run much better because there's no stutter and they load much faster.

                      Other games for some reason run smoother than on fglrx, perhaps because triple buffering works with the open-source driver and there's no triple buffering with vsync on fglrx.

                      Of course Mesa doesn't put out the same FPS, and there are visual issues in some games (Trine 1 and 2), but the progress has been amazing.

                      Comment
