ATI Radeon HD 4850 512MB


  • #31
    Michael, would you mind posting some comparative Windows results on the same setup?



    • #32
      Originally posted by mtippett View Post
      • What is the optimal framerate for game play (maximum, 100+, or 50-100)?
      Anyone who says they need 100+ is lying out their arse, seeing as LCDs refresh at 60 Hz, so anything above that is pointless (not to mention the human eye can't detect changes that quickly). As for the optimal framerate, it tends to depend heavily on the game: some need 40+, with others you can get by in the low 20s (games with a lot of motion blur can generally get away with lower frame rates).

      • How important is Image Quality (AA/AF)? If the framerate is acceptable, is it the way to go?
      I prefer to have at least 2x/4x (AA/AF), but dropping it so that a playable FPS can be achieved with otherwise higher settings is fine.
      • How important is resolution? Is bigger really better?
      Higher is better! I currently use 1920x1200, although if I had the money I'd grab a 30" LCD (or two)...



      • #33
        Originally posted by Aradreth View Post
        Anyone who says they need 100+ is lying out their arse, seeing as LCDs refresh at 60 Hz, so anything above that is pointless (not to mention the human eye can't detect changes that quickly). As for the optimal framerate, it tends to depend heavily on the game: some need 40+, with others you can get by in the low 20s (games with a lot of motion blur can generally get away with lower frame rates).
        60fps vs. 100fps does "feel" different. Of course, where this is most noticeable (and where it matters) is in FPS games. Try playing something like Nexuiz with a 60fps cap and then with +100fps and tell me you don't notice a difference.



        • #34
          Originally posted by Melcar View Post
          60fps vs. 100fps does "feel" different. Of course, where this is most noticeable (and where it matters) is in FPS games. Try playing something like Nexuiz with a 60fps cap and then with +100fps and tell me you don't notice a difference.
          I would agree to some extent that this could be true. Not because you can detect the difference itself, but because with that much headroom, when the rate dips and changes you don't notice the stalling (since it stays beyond what the monitor or your eye can detect).

          http://www.anandtech.com/linux/showdoc.aspx?i=2229&p=6

          It shows the variance. (That is also why I said 50-100 FPS in my original post: all games show some variance.)
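
          Purely as an illustration (made-up frame times in milliseconds, not numbers from that article), the same sort of average can hide dips like this:

          ```python
          # Made-up per-frame times in milliseconds; two heavy frames in the mix.
          frame_times_ms = [10, 11, 10, 12, 25, 10, 11, 30, 10, 11]

          avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
          worst_fps = 1000 / max(frame_times_ms)

          print(f"average:     {avg_fps:.0f} FPS")    # ~71 FPS overall
          print(f"worst frame: {worst_fps:.0f} FPS")  # ~33 FPS during the dip
          ```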

          Regards,

          Matthew



          • #35
            Originally posted by Michael View Post
            New results coming out next week show a much greater performance delta (with the 4850/4870 pulling ahead of the 9800GTX) at various AA/AF levels.


            These results so far are disappointing for the Linux crowd. I know I am disappointed at this point. When the driver catches up on performance I'm surely going to get my hands on a 4870 not too long after =]

            I wonder if I'll be able to sell my 3650 for $50 too; it has 512MB, which seems like a good selling point after watching the 128MB guy:

            http://www.youtube.com/watch?v=SxXmPFxf2Jg



            • #36
              I don't think this was a driver issue -- the cards were just too fast for the tests, so they ended up driver and CPU limited. Like running with a slow CPU, but... um... faster.

              Once the GPUs are given more work to do (cranking up AA, AF, etc.) you should start to see the hardware make more of a difference. Having a Linux port of Crysis would be nice too.
              Last edited by bridgman; 06-28-2008, 03:23 PM.



              • #37
                bridgman, do you have an idea when the Wine bugs will be resolved? I'm still waiting for that so I can say bye-bye to NVIDIA.

                Also, how can the result be CPU dependent in tests where the NVIDIA cards get higher FPS?

                (Although I do agree we'll have to find native Linux games that stress the cards more; testing 100+ FPS games isn't telling us much, and it's under heavy GPU loads that the card will show its value compared to the previous generation.)



                • #38
                  Most Linux games are rather CPU-heavy anyway. You also have the fact that they usually don't use or take advantage of the more advanced features current hardware is capable of.



                  • #39
                    re: Wine, dunno, but we'll talk to the Wine folks and see what the issues are. Sometimes the issue is actual bugs, sometimes it's just a case of "they coded around the other guy's bugs" and not wanting to have to support multiple code paths. Unfortunately, these days multiple code paths are a fact of life -- NVIDIA, Intel and ATI/AMD all have somewhat different approaches to GPU design. I imagine we have some bugs or missing functionality too, but that should be largely gone now. We'll have to see.

                    re: different FPS in a CPU limited case, that's just a question of how much work each driver has to do for a specific workload, which in turn is driven by different hardware decisions, mostly related to "how you make the GPU do stuff".

                    Back when CPUs were slower and you could get CPU limited at 40 FPS, your "infinitely fast hardware" driver speed made a noticeable difference in overall performance, and every HW vendor put a lot of effort into this area (both HW and SW). But once the CPU-limited frame rate moved way above the display refresh frequencies, it stopped being an issue.

                    Since this is the part of the driver code which is largely independent of how much work the app is pushing through the chip, it tends to be an even smaller part of the overall performance equation on the "minimum FPS" frames, where the app is sticking extra work into the GPU.

                    So... the bottom line is that this is an "interesting" measurement, and potentially more relevant than, say, whether Ruby is hotter than Dawn, but as long as the CPU-limited frame rate is well above the display refresh rate it doesn't really mean much to the user experience. If you look at benchmark numbers across the industry you will see more and more cases of performance being CPU limited at low resolutions and IQ levels, but since the test is typically running at 150 FPS or more it doesn't matter in the end.
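
                    Very roughly (a toy model with made-up numbers, not anything from our profiling): the frame time ends up being whichever of the CPU side (game plus driver) or the GPU side takes longer, and the display only ever shows up to its refresh rate anyway:

                    ```python
                    # Toy model: each frame costs some CPU time (game + driver) and some GPU
                    # time; the slower side sets the frame rate, the screen caps what you see.
                    def frame_rate(cpu_ms, gpu_ms, refresh_hz=60):
                        raw_fps = 1000.0 / max(cpu_ms, gpu_ms)    # bottleneck side wins
                        return raw_fps, min(raw_fps, refresh_hz)  # raw vs. what the display shows

                    # Light workload: GPU is idle half the time, CPU/driver overhead is the limit.
                    print(frame_rate(cpu_ms=6.0, gpu_ms=3.0))   # (~167, 60) -- capped at 60 on screen
                    # Crank up AA/AF/resolution: GPU time dominates, the cards start to separate.
                    print(frame_rate(cpu_ms=6.0, gpu_ms=20.0))  # (50.0, 50.0)
                    ```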
                    Last edited by bridgman; 06-28-2008, 03:58 PM.



                    • #40
                      bridgman: yes, as silly as it may sound, Wine is going to be a strong differentiating factor here, especially since even low-end cards can run native games without breaking a sweat, while recent games under Wine are more stressful for the graphics card.

                      That, and hardware-accelerated encoding: x264 making use of the Radeon HD would score a big win, especially since CUDA is getting lots of publicity (so sad to see almost no reports on Cinema 2.0, as silly as the name may be; NVIDIA is still eons ahead in how it handles the press and the buzz).



                      • #41
                        Originally posted by miles View Post
                        Also, how can the result be CPU dependent in tests where the NVIDIA cards get higher FPS?
                        There are two parts to a CPU-limited game:
                        1. The game engine itself
                        2. The SW portion of the driver

                        Ignoring the game engine itself, it basically means that our driver is "heavier" than NVIDIA's when the GPU is not the limiting factor. This is where a lot of driver optimization effort goes. Remember that we are partway through our journey, and that the 9800GTX is beaten by the 3870 on ET:QW under Linux.

                        In general, if you look at comparative results you will see one of two things occur:
                        1. A drop-off as the HW class scales
                        2. A drop-off as the resolution or IQ settings are increased

                        If you don't see that, then it is generally a sign that the HW is not the limiting factor. In the case of most of the tests in the 4850 article, both NV and AMD suffer from this.
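
                        As a rough rule-of-thumb sketch of that (hypothetical FPS numbers and an arbitrary 15% cutoff, not figures from the article):

                        ```python
                        # If pushing the resolution/IQ way up barely moves the frame rate, the GPU
                        # isn't what's limiting -- the CPU/driver side is. Hypothetical numbers.
                        def gpu_limited(fps_low_settings, fps_high_settings, threshold=0.15):
                            drop = (fps_low_settings - fps_high_settings) / fps_low_settings
                            return drop > threshold  # a real drop-off means the GPU is the limit

                        print(gpu_limited(155.0, 150.0))  # False: basically flat -> CPU/driver limited
                        print(gpu_limited(155.0, 90.0))   # True: scales with settings -> GPU limited
                        ```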

                        (Although I do agree we'll have to find native Linux games that stress the cards more; testing 100+ FPS games isn't telling us much, and it's under heavy GPU loads that the card will show its value compared to the previous generation.)
                        There are some engines that stress the HW quite strongly; hopefully we will see some of those (like Lightsmark and Unigine) appear in the not-too-distant future. Having Linux-native engines lowers the bar considerably for making games available under Linux.

                        Hopefully UT3 will eventually come and it will allow the cards and drivers to be separated at default settings. We'll wait until next week to see Michael's review with AA+AF pushed way up.

                        http://www.phoronix.com/forums/showthread.php?t=11091



                        • #42
                          I bet the card would run just as fast on x8 PCIe 2.0. I'd like to see the PCIe lanes cabled off the mobo so you could customize how many lanes you use, how many slots, what sort of slots, and the spacing of the slots, giving you as much room as you like; maybe a double-case design with expansion slots front and back and a pegboard so you could move the slots around.

                          Right now, you get what you get: a fixed number and type of slots, hardwired PCIe lanes, and no room for the cards you do have slots for.

                          alexforcefive,

                          I bet the price of women's shoes, or maybe a nice purse and some frilly underwear, compares better. It's just not fair... or is it? God save the Queen? Good God, y'all!!!

                          Live responsibly and please don't drink and breathe.



                          • #43
                            Originally posted by Aradreth View Post
                            Anyone who says they need 100+ is lying out their arse, seeing as LCDs refresh at 60 Hz, so anything above that is pointless (not to mention the human eye can't detect changes that quickly).
                            Actually, LCDs do not refresh at any hertz. Pixels are on or off. Refresh rate is a totally useless comparison in the digital realm.



                            • #44
                              Originally posted by deanjo View Post
                              Actually, LCDs do not refresh at any hertz. Pixels are on or off. Refresh rate is a totally useless comparison in the digital realm.
                              Yes and no.

                              The concept is more or less the same, since the pixel state change and the phosphor illumination will only happen at a rate tied to your refresh rate. The refresh rate is the maximum rate at which the scanline will have new information presented. The phosphor decay period and the minimum LCD transistor switch rate are also relatively important, but these days they can more or less be assumed to be the same as the refresh rate.

                              I still contend that an average rate above the refresh rate is what you want, since it gives enough headroom to ensure that even in graphically hard parts of the game the rate doesn't fall below the refresh rate.

                              I understand the concept of the eye not being able to distinguish rates faster than 25-30 Hz, but the eye does detect artefacts at rates up to around 70 Hz. This is why most people notice flicker on CRTs at 60 Hz. It is the eye detecting the gaps left by the phosphor decay; if the user looks closely at the screen they will still see a continual, unbroken image, but they will still feel the difference.

                              On an LCD this particular issue becomes mostly irrelevant, since the pixels are switched on and off discretely, rather than the electron beam charging phosphor with the associated decay. I believe this is really what you are talking about.

                              This is also the reason that most LCD TVs are now refreshing at 120 Hz: the image feels "sharper" and "more responsive". The eye can't discern the steps, but the secondary visual effects are still there.
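
                              Just for the arithmetic (nothing more than the reciprocal of the refresh rate), the interval each refresh has to fill shrinks like this:

                              ```python
                              # Time between refreshes at a few common rates.
                              for hz in (60, 70, 120):
                                  print(f"{hz:3d} Hz -> {1000.0 / hz:.1f} ms per refresh")
                              # 60 Hz -> 16.7 ms, 70 Hz -> 14.3 ms, 120 Hz -> 8.3 ms
                              ```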

                              Regards,

                              Matthew



                              • #45
                                Originally posted by mtippett View Post
                                I understand the concept of the eye not being able to distinguish rates faster than 25-30 Hz, but the eye does detect artefacts at rates up to around 70 Hz. This is why most people notice flicker on CRTs at 60 Hz. It is the eye detecting the gaps left by the phosphor decay; if the user looks closely at the screen they will still see a continual, unbroken image, but they will still feel the difference.

                                The primary reason people can detect flicker at 60 Hz is the occasional synchronization of artificial light sources with the refresh rate of the monitor (in countries with a 50 Hz power grid this effect happens at that refresh rate). It is the same effect as why, when filming a CRT screen with a camera, you will see scan lines on the screen. This effect does not happen on LCDs, as the pixel state (as you have mentioned) changes discretely.

