Radeon R300g Morphological Anti-Aliasing Performance


  • #21
    Originally posted by Sidicas View Post
    That's not true at all. When you're talking 1080p + MSAA on an old card, the only thing you're testing is the memory bandwidth. The CPU, GPU and GPU drivers are sitting there practically idle, waiting for data to flow between the GPU and GPU memory.
    In order for anti-aliasing to work, geometry gets scaled up to double its resolution in GPU memory. An old card running at 1080p with anti-aliasing is absurd. It was never done then, and no gamer is going to do it now on an old card, because the framerates produced are too low to be playable and it has nothing to do with the GPU chip or the driver.
    What are you on about? Oh really? Then why were these cards tested with games like Quake 4 on ultra with 4xAA and 16xAF @ 2048x1536, and still managed to get playable framerates? http://www.firingsquad.com/hardware/...0xtx/page7.asp You always max out everything you can when benchmarking, since you are again maxing out the card's and driver's abilities. In the case of this old Radeon hardware we KNOW what the hardware is capable of, so what we're seeing with these tests is what the Gallium3D drivers are capable of making the card do, since we are still far off from what the hardware is known to be capable of.

    The vast majority will run the game at their LCD's native resolution with all the settings maxed that they can get away with, before looking at what they can do with the AA settings. Your goal with AA is to reduce jagged edges; if you are lowering the resolution and using lower detail levels on the textures, then even with AA turned up it's going to look like crap.

    Performance tuning, though, would be playing with the settings that usually have the biggest performance impact, like soft shadows, dynamic lighting and the AA settings, since AF is almost free on GPUs from the last 5-6 years.



    • #22
      Originally posted by Kivada View Post
      You always max out everything you can when benchmarking, since you are again maxing out the card's and driver's abilities.
      Wrong, wrong, wrong. You do that if you're trying to compare architectures, or drivers, etc., and want to max out to the limits of what can be done.

      You don't do that if you are testing out enabling a feature and the framerates before you start aren't even playable. At that point, you are just running into limitations that no one in real life will ever hit. Or will they? I have no idea, because these tests can't tell me whether I'll run into the same problem when I'm running games at a more reasonable framerate or not.

      When non-AA is giving you 7 fps, anything within 2-3 fps is within the margin of error, so you can't even tell if anything is happening or not.
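      As a rough illustration of that margin-of-error point, here is a minimal sketch with made-up numbers (none of these values come from the article): at a ~7 fps baseline, the run-to-run spread can be of the same order as the 2-3 fps delta you are trying to measure.

```python
# Hypothetical benchmark runs, in fps (made-up values for illustration only).
baseline_runs = [7.1, 6.4, 7.8]   # non-AA
mlaa_runs = [4.9, 5.6, 4.3]       # with MLAA enabled

def mean(xs):
    return sum(xs) / len(xs)

def spread(xs):
    # Crude noise estimate: best run minus worst run.
    return max(xs) - min(xs)

delta = mean(baseline_runs) - mean(mlaa_runs)
print(f"mean delta: {delta:.1f} fps")                                  # ~2.2 fps
print(f"run-to-run spread (non-AA): {spread(baseline_runs):.1f} fps")  # ~1.4 fps
print(f"run-to-run spread (MLAA):   {spread(mlaa_runs):.1f} fps")      # ~1.3 fps
# When the spread is of the same order as the delta, a single pair of runs
# can't reliably show whether enabling the feature changed anything.
```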



      • #23
        Nope, even when you are adding the new feature, you still hit it hard to see how much of an impact it makes at the resolutions people were actually running when the hardware was current, and those resolutions were 1600x1200 - 1920x1200 for people with LCDs and 1600x1200 - 2048x1536 for people that still had/have decent CRT monitors.

        Originally posted by smitty3268 View Post
        When non-AA is giving you 7 fps, anything within 2-3 fps is within the margin of error, so you can't even tell if anything is happening or not.
        It's called a regression, dumbass, as has been pointed out when you look at the same tests done with previous iterations of the R300 driver.

        If you have an issue with the test, take it up with Larabel; he's posting crap articles for ad hits again. Install Adblock and call him out for not putting a little more effort into figuring out why the regressions were so bad, instead of running an automated test and tossing the results on the site without looking at them and asking himself "WTF happened here?".
        Last edited by Kivada; 17 January 2013, 09:55 AM.



        • #24
          I'm not going to argue whether they were using such high resolutions or not--I know they were, or were close, with the right hardware. I will however ask, were they using AA at such high resolutions? Generally speaking, weren't most displays around the 19-22 inch range? Even at 26 inches, I would think the resolution at 2048x1536 would be high enough where AA would be unnecessary in games.



          • #25
            Originally posted by Kivada View Post
            Nope, even when you are adding the new feature, you still hit it hard to see how much of an impact it makes at the resolutions people were actually running when the hardware was current, and those resolutions were 1600x1200 - 1920x1200 for people with LCDs and 1600x1200 - 2048x1536 for people that still had/have decent CRT monitors.
            No, you don't. That's stupid. It's like the flip side of people arguing that getting 2500fps in glxgears is way better than getting 2200fps. It just doesn't matter at those speeds, and any bottleneck you are hitting most likely won't ever matter to anyone running at a real speed.

            It's called a regression, dumbass
            Classy. Yes, I know: the regression is what makes his tests worthless. That still means they are worthless, though; the reason doesn't matter.

            If you have an issue with the test, take it up with Larabel; he's posting crap articles for ad hits again. Install Adblock and call him out for not putting a little more effort into figuring out why the regressions were so bad, instead of running an automated test and tossing the results on the site without looking at them and asking himself "WTF happened here?".
            That's exactly what I did here.
            Last edited by smitty3268; 17 January 2013, 02:57 PM.



            • #26
              Originally posted by Nobu View Post
              I'm not going to argue whether they were using such high resolutions or not--I know they were, or were close, with the right hardware. I will however ask, were they using AA at such high resolutions? Generally speaking, weren't most displays around the 19-22 inch range? Even at 26 inches, I would think the resolution at 2048x1536 would be high enough where AA would be unnecessary in games.
              Yeah, they used AA at 2048x1536; check the link I posted to a review of the X1950XTX running Quake 4, as it's the fastest card covered by the R300 series driver and Quake 4 is Linux native.

              Even at 2048x1536 @ 20" you still need AA, since it's only around 123 DPI, which isn't high enough to make visible pixelation go away, though due to the nature of CRT screens you did get something like 2xAA at all times.

              On an LCD you have to play the game at the screen's native resolution, or at exactly half of the screen's native resolution, or else the scaling will make everything look like crap.

              AA will eventually become irrelevant once the damn display companies start making 300+ DPI screens, so something like 3840x2160 @ 15", since at that point the physical pixels are too small to see from a normal viewing distance and thus you won't have jagged objects. This shouldn't be the issue it is, since there are phones now that are 1920x1080 @ 5", which is 440 DPI. But even that is nothing compared to the state-of-the-art digital viewfinder: the Silicon Micro Display ST1080, 1920x1080 @ 0.74", 2976.9 DPI, and no, that's not a typo.
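              For reference, those pixel-density figures follow from the usual diagonal-pixels-over-diagonal-inches arithmetic; here is a quick sketch (not from the thread, and the exact CRT value depends on which diagonal you assume):

```python
# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def dpi(width_px, height_px, diagonal_inches):
    return hypot(width_px, height_px) / diagonal_inches

print(round(dpi(2048, 1536, 20)))    # ~128 for a nominal 20 inch diagonal
print(round(dpi(1920, 1080, 5)))     # ~441, the ~440 DPI 5" phone figure
print(round(dpi(1920, 1080, 0.74)))  # ~2977, the ST1080 microdisplay
```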



              • #27
                Originally posted by Kivada View Post
                Yeah, they used AA at 2048x1536; check the link I posted to a review of the X1950XTX running Quake 4, as it's the fastest card covered by the R300 series driver and Quake 4 is Linux native.

                Even at 2048x1536 @ 20" you still need AA, since it's only around 123 DPI, which isn't high enough to make visible pixelation go away, though due to the nature of CRT screens you did get something like 2xAA at all times.

                On an LCD you have to play the game at the screen's native resolution, or at exactly half of the screen's native resolution, or else the scaling will make everything look like crap.

                AA will eventually become irrelevant once the damn display companies start making 300+ DPI screens, so something like 3840x2160 @ 15", since at that point the physical pixels are too small to see from a normal viewing distance and thus you won't have jagged objects. This shouldn't be the issue it is, since there are phones now that are 1920x1080 @ 5", which is 440 DPI. But even that is nothing compared to the state-of-the-art digital viewfinder: the Silicon Micro Display ST1080, 1920x1080 @ 0.74", 2976.9 DPI, and no, that's not a typo.
                A full HD screen that is 3/4 of an inch??? I mean really, what kind of application would such a screen be useful for?



                • #28
                  Originally posted by duby229 View Post
                  A full HD screen that is 3/4 of an inch??? I mean really, what kind of application would such a screen be useful for?
                  Let me look that up for you... http://www.siliconmicrodisplay.com/st1080.html Currently they are selling it as a wearable display / augmented reality HUD.

                  But I can see this tech as a good viewfinder in high-end cameras, as they are now starting to move to digital viewfinders in DSLRs.



                  • #29
                    I have an integrated Radeon X300 in a notebook; I couldn't even play Quake 3 at 1024x768 with it, so I am not going to be bold enough to test it with MSAA.

                    Even under Windows it was a struggle to get it to work with OpenGL, as I remember, but I also remember that I played Doom 3 and Quake 4 on it with playable framerates; I am not so sure about the quality.



                    • #30
                      Originally posted by Kivada View Post
                      In the case of this old Radeon hardware we KNOW what the hardware is capable of, so what we're seeing with these tests is what the Gallium3D drivers are capable of making the card do, since we are still far off from what the hardware is known to be capable of.
                      What we're seeing is an unexplained regression. I think the kernel commit "drm/radeon: do not move bo to different placement at each cs" causes it, which will be reverted in the mainline soon.

                      Anyway, I just wanna say that the r300g driver is complete. It implements all hardware optimizations we know of. Trust me, there is nothing else to do in that driver to make it better. In my opinion, the only thing which continues to negatively affect performance and is fixable is the GPU memory management in the kernel.
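                      Not from the thread, but if you want to check whether a local kernel tree contains the commit named above, or a revert of it, a sketch like the following works; it assumes you have a git checkout of the kernel (the path below is hypothetical).

```python
# Search a local kernel git clone for the suspected commit and for a revert of it.
import subprocess

KERNEL_TREE = "/path/to/linux"  # hypothetical location of your kernel checkout
SUBJECT = "drm/radeon: do not move bo to different placement at each cs"

def grep_log(pattern):
    # 'git log --grep' matches commit messages; --oneline keeps the output short.
    result = subprocess.run(
        ["git", "-C", KERNEL_TREE, "log", "--oneline", "--grep=" + pattern],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip().splitlines()

print("commit(s):", grep_log(SUBJECT))
print("revert(s):", grep_log('Revert "' + SUBJECT + '"'))
```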

