ATI Radeon HD 4850 512MB

  • #31
    Michael, would you mind posting some comparative Windows results on the same setup?



    • #32
      Originally posted by mtippett
      • What is the optimal framerate for game play (Maximum - 100+, or 50-100)?
      Anyone who says they need 100+ is lying out their arse, seeing as LCDs refresh at 60Hz, so anything above that is pointless (not to mention the human eye can't detect changes that quickly) - toy sketch at the end of this post. As for the optimal framerate, it depends heavily on the game: some need 40+, others you can get by on the low 20s (games with a lot of motion blur can generally get away with lower frame rates).

      • How important is Image Quality (AA/AF)? If the framerate is acceptable, is it the way to go?
      I prefer to have at least 2xAA/4xAF, but dropping it so that a playable FPS can be reached with higher settings elsewhere is fine.

      • How important is resolution? Is bigger really better?
      Higher is better! I currently run at 1920x1200, although if I had the money I'd grab a 30" LCD (or two)...
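
      Since the 60Hz point is really just arithmetic, here's a toy sketch of it (Python, made-up numbers, nothing measured):

      [code]
      # Toy model: a 60 Hz LCD scans out one frame every 1/60 s, so frames
      # rendered beyond that rate never reach the screen (with vsync off
      # they just tear into each other).
      REFRESH_HZ = 60.0

      def displayed_fps(render_fps):
          # The display can show at most one new frame per refresh.
          return min(render_fps, REFRESH_HZ)

      for fps in (30, 60, 100, 150):
          shown = displayed_fps(fps)
          print("rendered %3d fps -> displayed %2d fps (%3d frames/s wasted)"
                % (fps, shown, fps - shown))
      [/code]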



      • #33
        Originally posted by Aradreth
        Anyone who says they need 100+ is lying out their arse, seeing as LCDs refresh at 60Hz, so anything above that is pointless (not to mention the human eye can't detect changes that quickly). As for the optimal framerate, it depends heavily on the game: some need 40+, others you can get by on the low 20s (games with a lot of motion blur can generally get away with lower frame rates).
        60fps vs. 100fps does "feel" different. Of course, where this is most noticeable (and where it matters) is in FPS games. Try playing something like Nexuiz with a 60fps cap and then at 100+ fps and tell me you don't notice a difference.



        • #34
          Originally posted by Melcar
          60fps vs. 100fps does "feel" different. Of course, where this is most noticeable (and where it matters) is in FPS games. Try playing something like Nexuiz with a 60fps cap and then at 100+ fps and tell me you don't notice a difference.
          I would agree to some extent that this could be true. Not because you can detect the difference at the top end, but because when the rate dips and changes you don't notice the stalling (since the dips stay beyond what the monitor or your eye can detect).



          Shows the variance. (That is also why I said 50-100 FPS in my original post, since all games show some variance.)
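
          To make the variance point concrete, here is a small sketch (hypothetical frame times, not numbers from any of these runs):

          [code]
          # Sketch: average FPS hides variance; the dips are what you feel.
          # Hypothetical per-frame render times in milliseconds.
          frame_times_ms = [10, 11, 10, 35, 10, 12, 40, 10, 11, 10]

          avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
          min_fps = 1000.0 / max(frame_times_ms)  # the worst single frame

          print("average:     %5.1f fps" % avg_fps)  # ~63 fps, looks fine
          print("worst frame: %5.1f fps" % min_fps)  # 25 fps - the stall you notice
          [/code]

          That is the sense in which a 100+ FPS average helps: the dips tend to stay above what the display can show.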

          Regards,

          Matthew



          • #35
            Originally posted by Michael
            New results coming out next week show a much greater performance delta (with the 4850/4870 pulling ahead of the 9800GTX) at various AA / AF levels.


            These results so far are disappointing for the Linux crowd - I know I'm disappointed at the moment. When the driver catches up on performance I'm surely going to get my hands on a 4870 not too long after =]

            I wonder if I'll be able to sell my 3650 for $50 too; it has 512MB, which seems like a good selling point after watching the 128MB guy:

            [embedded YouTube video]



            • #36
              I don't think this was a driver issue -- the cards were just too fast for the tests, so they ended up driver- and CPU-limited. Like running with a slow CPU, but... um... faster.

              Once the GPUs are given more work to do (cranking up AA, AF, etc.) you should start to see the hardware make more of a difference. Having a Linux port of Crysis would be nice too.
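
              In code terms, that's roughly this toy bottleneck model (made-up numbers, obviously not how the driver actually schedules work):

              [code]
              # Toy model: each frame costs a roughly fixed CPU/driver time plus
              # GPU time that scales with the workload (resolution, AA, AF...).
              # Whichever side is slower sets the frame rate.
              def fps(cpu_ms, gpu_ms):
                  return 1000.0 / max(cpu_ms, gpu_ms)

              CPU_MS = 5.0  # hypothetical per-frame game logic + driver work
              for name, gpu_ms in [("light test", 2.0), ("heavy test, AA/AF up", 12.0)]:
                  card = fps(CPU_MS, gpu_ms)       # some card
                  twice = fps(CPU_MS, gpu_ms / 2)  # a card twice as fast
                  print("%s: %3.0f fps vs %3.0f fps" % (name, card, twice))
              # light test:           200 fps vs 200 fps (CPU-limited, no difference)
              # heavy test, AA/AF up:  83 fps vs 167 fps (the hardware finally shows)
              [/code]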
              Last edited by bridgman; 28 June 2008, 03:23 PM.



              • #37
                bridgman, do you have an idea when the Wine bugs will be resolved? I'm still waiting for that to say bye-bye to NVIDIA.

                Also, how can the result be CPU-dependent in tests where the NVIDIA cards get higher FPS?

                (Although I do agree we'll have to find native Linux games that stress the cards more; testing 100+ FPS games isn't telling us much, and it's under heavy GPU loads that the card will show its value compared to the previous generation.)



                • #38
                  Most Linux games are rather CPU-heavy anyway. There's also the fact that they usually don't use or take advantage of the more advanced features current hardware is capable of.



                  • #39
                    re: Wine, dunno, but we'll talk to the Wine folks and see what the issues are. Sometimes the issue is actual bugs, sometimes it's just a case of "they coded around the other guys' bugs" and not wanting to have to support multiple code paths. Unfortunately these days multiple code paths are a fact of life -- NVIDIA, Intel and ATI/AMD all have somewhat different approaches to GPU design. I imagine we have some bugs or missing functionality too, but that should be largely gone now. We'll have to see.

                    re: different FPS in a CPU-limited case, that's just a question of how much work each driver has to do for a specific workload, which in turn is driven by different hardware decisions, mostly related to "how you make the GPU do stuff".

                    Back when CPUs were slower and you could get CPU-limited at 40 FPS, your "infinitely fast hardware" driver speed made a noticeable difference in overall performance, and every HW vendor put a lot of effort into this area (both HW and SW), but once the CPU-limited frame rate moved way above the display refresh frequencies it stopped being an issue.

                    Since this part of the driver code is largely independent of how much work the app is pushing through the chip, it tends to be an even smaller part of the overall performance equation on the "minimum FPS" frames, where the app is sticking extra work into the GPU.

                    So.. bottom line is that this is an "interesting" measurement, and potentially more relevant than, say, whether Ruby is hotter than Dawn, but as long as the CPU-limited frame rate is well above the display refresh rate it doesn't really mean much to the user experience. If you look at benchmark numbers across the industry you will see more and more cases of performance being CPU-limited at low resolutions and IQ levels, but since the test is typically running at 150 FPS or more it doesn't matter in the end.
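
                    Quick illustrative arithmetic for that minimum-FPS point (numbers pulled out of the air):

                    [code]
                    # The per-frame driver/submission cost is roughly fixed, so on the
                    # heavy "minimum FPS" frames it's a smaller slice of the frame time.
                    DRIVER_MS = 3.0  # hypothetical per-frame driver overhead

                    for label, gpu_ms in [("easy frame", 4.0), ("min-FPS frame", 30.0)]:
                        total = DRIVER_MS + gpu_ms
                        share = 100.0 * DRIVER_MS / total
                        print("%-13s: %4.1f ms/frame, driver is %2.0f%% of it"
                              % (label, total, share))
                    # easy frame   :  7.0 ms/frame, driver is 43% of it
                    # min-FPS frame: 33.0 ms/frame, driver is  9% of it
                    [/code]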
                    Last edited by bridgman; 28 June 2008, 03:58 PM.



                    • #40
                      bridgman: yes, as silly as it may sound, Wine is going to be a strong differentiating factor here, especially since even low-end cards can run native games without breaking a sweat, while recent games under Wine are far more stressful for the graphics card.

                      That, and hardware-accelerated encoding - x264 making use of the Radeon HD would score a big win, especially since CUDA is getting lots of publicity (so sad to see almost no reports on Cinema 2.0, as silly as the name may be - NVIDIA is still eons ahead in how it handles the press and the buzz).
