Radeon R300g Morphological Anti-Aliasing Performance


  • smitty3268
    replied
    Originally posted by Kivada View Post
    You always max out everything you can when benchmarking as you are again maxing out the card and driver's abilities.
    Wrong, wrong, wrong. You do that if you're trying to compare architectures, or drivers, etc., and want to max out to the limits of what can be done.

    You don't do that if you are testing out enabling a feature and the framerates before you start aren't even playable. At that point you are just running into limitations that no one in real life will ever hit. Or will they? I have no idea, because these tests can't tell me whether I'll run into the same problem when I'm running games at a more reasonable framerate or not.

    When non-AA is giving you 7 fps, anything within 2-3 fps of that is inside the margin of error, so you can't even tell whether anything is happening.
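    The 7 fps point can be sketched numerically. A minimal check, assuming run-to-run noise of roughly 30% of the baseline frame rate (that figure is an assumption for illustration, not from the thread):

```python
def significant_change(fps_off, fps_on, noise_fraction=0.3):
    """Return True if the AA-on result falls outside the assumed
    run-to-run noise band around the AA-off baseline."""
    band = fps_off * noise_fraction       # e.g. 7 fps * 0.3 = 2.1 fps
    return abs(fps_off - fps_on) > band

# A 2 fps drop at a 7 fps baseline is inside the noise band:
print(significant_change(7.0, 5.0))   # False
# A 4 fps drop at the same baseline would register:
print(significant_change(7.0, 3.0))   # True
```

    Under this assumption the reported 2-3 fps spread at a ~7 fps baseline is indistinguishable from measurement noise, which is the commenter's point.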

  • Kivada
    replied
    Originally posted by Sidicas View Post
    That's not true at all. When you're talking 1080p + MSAA on an old card, the only thing you're testing is memory bandwidth. The CPU, GPU and GPU drivers are sitting there practically idle, waiting for data to flow between the GPU and GPU memory.
    In order for anti-aliasing to work, geometry gets scaled up to double its resolution in GPU memory. An old card running at 1080p with anti-aliasing is absurd. It was never done then, and no gamer is going to do it now on an old card, because the framerates produced are too low to be playable, and that has nothing to do with the GPU chip or the driver.
    What are you on about? Then why were these cards tested with games like Quake 4 on ultra with 4xAA and 16xAF @ 2048x1536, and still managed to get playable framerates? http://www.firingsquad.com/hardware/...0xtx/page7.asp You always max out everything you can when benchmarking as you are again maxing out the card and driver's abilities. In the case of this old Radeon hardware we KNOW what the hardware is capable of, so what we're seeing with these tests is what the Gallium3D drivers are capable of making the card do, since we are still far off from what the hardware is known to be capable of.

    The vast majority will run the game at their LCD's native resolution with all the settings maxed that they can get away with, before looking at what they can do with AA settings. Your goal with AA is to reduce jagged edges; if you are lowering the resolution and using lower detail levels on the textures, then even with AA turned up it's going to look like crap.

    Performance tuning, though, would be playing with the settings that usually have the biggest performance impact, like soft shadows, dynamic lighting and AA, since AF is almost free on GPUs from the last 5-6 years.

  • Jarrod558
    replied
    He, he, it is time for Michael to do some bisecting, not benchmarking.
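    The bisect suggestion can be automated with `git bisect run`, which needs a test script that exits 0 for a good commit and non-zero for a bad one. A hypothetical sketch: the 30 fps threshold and the `fps:` output format are assumptions for illustration, not anything the Mesa tree or Michael's benchmarks actually emit:

```python
FPS_THRESHOLD = 30.0  # assumed cut-off; below this a commit counts as "bad"

def parse_fps(output: str) -> float:
    """Pull the last 'fps: <number>' line out of benchmark output
    (this output format is invented for the example)."""
    fps = 0.0
    for line in output.splitlines():
        if line.lower().startswith("fps:"):
            fps = float(line.split(":", 1)[1])
    return fps

def bisect_exit_code(output: str) -> int:
    """0 = good commit, 1 = bad commit, per the git bisect run convention."""
    return 0 if parse_fps(output) >= FPS_THRESHOLD else 1
```

    A wrapper script would rebuild the driver, run the benchmark, feed its output to `bisect_exit_code`, and exit with the result; `git bisect start <bad> <good>` followed by `git bisect run ./check.py` then walks the commit range automatically.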

  • HokTar
    replied
    Originally posted by bridgman View Post
    In fairness, we're not aware of anything wrong with the tests. The results just seem odd and we don't know why.

    Agree on the Marek statue. Question is where it should be built so that everyone could get to it. Maybe we could put a little picture of Marek in the lower right corner of the screen when running open drivers, just like the <testing> icon fglrx displays on untested hardware.
    Especially if you're running r300g! You should get a splash screen at bootup saying "All hail to Marek!" when r300g is loaded.
    I would visit a statue, too.

  • duby229
    replied
    Oh shit!! Now we even have Bridgman bowing to the mighty Marek!

    Seriously, you guys, both at AMD and the independent developers, have been doing a fantastic job. I have a 6850 that I could hardly be happier with. Keep up the good work.

    Power management and video decode are really all that's left. Well, at least for my needs anyway.

  • bridgman
    replied
    In fairness, we're not aware of anything wrong with the tests. The results just seem odd and we don't know why.

    Agree on the Marek statue. Question is where it should be built so that everyone could get to it. Maybe we could put a little picture of Marek in the lower right corner of the screen when running open drivers, just like the <testing> icon fglrx displays on untested hardware.

  • crazycheese
    replied
    My suggestion is that the developers issue a public recommendation/profile for Michael on how to test properly, and that Michael reads his own forums.
    Result: everyone happy.

    P.S.
    AMD should build a statue for Marek so everyone (incl. AMD) can ritually sacrifice him some "beers".

  • bridgman
    replied
    The issue is that (a) the same card produced much higher frame rates with the open source driver 18 months ago -- high enough that MSAA would probably have been useful even at 1920x1080 and (b) users are reporting higher frame rates even today which suggests it's not as simple as an across-the-board regression.

    Could be that the older frame rates were wrong, or the new ones, or there's a partial regression, or some setting/configuration-dependent thing.

  • Sidicas
    replied
    Originally posted by Kivada View Post
    Nah, when benching a GPU you want to run the games at as high a resolution and graphical settings as the game offers, so you can beat on the GPU and drivers as hard as possible.
    That's not true at all. When you're talking 1080p + MSAA on an old card, the only thing you're testing is memory bandwidth. The CPU, GPU and GPU drivers are sitting there practically idle, waiting for data to flow between the GPU and GPU memory.
    In order for anti-aliasing to work, geometry gets scaled up to double its resolution in GPU memory. An old card running at 1080p with anti-aliasing is absurd. It was never done then, and no gamer is going to do it now on an old card, because the framerates produced are too low to be playable, and that has nothing to do with the GPU chip or the driver.

    IMO, the benchmarks don't seem very useful at all. He should be comparing low resolution + AA vs. high resolution without AA. If high resolution looks better and is a lot faster, then you can conclude there is still some optimization work to be done in the AA. AA has always been used by gamers as a way to get around needing a higher resolution display.

    The only thing the benchmark shows, in my opinion, is that old cards were badly memory-bandwidth bottlenecked, which gamers already know. When those cards were out, nobody had 1080p screens, and if they did, they didn't need any AA.
    Last edited by Sidicas; 01-16-2013, 06:51 AM.

  • Kivada
    replied
    Originally posted by tomato View Post
    Michael did run the tests with a FullHD screen (1920x1080), maybe that's it?
    Nah, when benching a GPU you want to run the games at as high a resolution and graphical settings as the game offers, so you can beat on the GPU and drivers as hard as possible.

    If this were a CPU test you'd do the opposite: run it at minimum graphical settings at 640x480 or 800x600, or whatever bare minimum the game will allow, to see how fast the CPU can run the game engine without the GPU being the bottleneck.

    The GPU test is closer to how most people actually play their games, though, i.e. at as high detail settings as possible while staying above the 30 FPS threshold below which the game gets choppy and laggy.
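    The two methodologies above boil down to a simple bottleneck heuristic: if the frame rate barely moves when resolution rises, the CPU/engine is the limit; if it falls with resolution, the GPU is. A sketch of that reasoning (the 10% tolerance is an assumed cut-off, not a standard figure):

```python
def likely_bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
    """Compare the same scene at a low and a high resolution.
    If the high-res frame rate stays within the tolerance of the
    low-res one, resolution (and hence the GPU) isn't the limit."""
    if fps_high_res >= fps_low_res * (1.0 - tolerance):
        return "cpu-bound"
    return "gpu-bound"

# Same settings, 640x480 vs 1920x1080:
print(likely_bottleneck(120.0, 118.0))  # cpu-bound
print(likely_bottleneck(120.0, 45.0))   # gpu-bound
```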
