Open-Source ATI R600/700 Mesa 3D Performance


  • #46
I think that there are two things that it comes down to:

    1) A game's framerate is VARIABLE. What you need is a WORST CASE framerate that is at LEAST some minimum.

2) (more importantly) the point of sampling. If your game is running at 500 fps minimum, then it can sample frames at very precise timing, resulting in fluid output. I.e., it obviously isn't going to display 500 fps, since that is beyond the capabilities of your display device, so it is going to pick SOME frames, spaced equally in time, and display those.

If your game is running at 30 fps and your screen at 60 fps, then you could get, worst case, THREE identical frames back-to-back followed by the next frame being displayed only once (when it should be shown twice). I.e., things are variable: you expect two identical frames, followed by two identical frames, followed by two identical frames, etc. If the game is a little late preparing the third frame, then it shows THREE identical frames, then the FOURTH frame. It is a little difficult to explain without a picture... but this can lead to choppy-looking output.
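To make that concrete, here is a toy Python simulation (invented numbers, no relation to any real driver) of which rendered frame each 60 Hz refresh ends up showing when one nominally 30 fps frame finishes about 10 ms late:

```python
# Toy simulation: which rendered frame does each 60 Hz refresh show when a
# nominally 30 fps game finishes its fourth frame ~10 ms late? (invented numbers)
REFRESH_HZ = 60.0
refresh_times = [i / REFRESH_HZ for i in range(12)]            # vsync instants

# Completion times of rendered frames: every 1/30 s, except frame 3 is late
# (0.110 s instead of the nominal 0.100 s).
frame_done = [0.000, 0.033, 0.066, 0.110, 0.133, 0.166]

def frame_shown_at(t, done_times):
    """Index of the newest frame that was finished at or before time t."""
    shown = 0
    for i, done in enumerate(done_times):
        if done <= t:
            shown = i
    return shown

print([frame_shown_at(t, frame_done) for t in refresh_times])
# -> [0, 0, 1, 1, 2, 2, 2, 3, 4, 4, 5, 5]
#    frame 2 is shown three times, frame 3 only once: visible judder
```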

Next, also consider that just because your game is running at "30 fps", it really doesn't mean that it is *actually* running at 30 fps *at this instant*. To calculate the framerate, it counts how many frames HAVE BEEN generated over some amount of time (which is obviously greater than zero) and divides by that time. So you're looking at the AVERAGE framerate within a particular window of time. If the window happens to be 10 seconds, then it *could* have rendered 500 fps for a few seconds, followed by 0 fps for a few seconds, and it averaged out to 30 fps. This is an extreme example, of course, but it illustrates one of the issues with framerate, since this example will obviously be really ugly to watch.
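And to make the averaging point concrete, a toy calculation (all numbers invented) of how a windowed average can look fine while the worst second is a slideshow:

```python
# Toy calculation (all numbers invented): frames rendered in each second of a
# 10-second benchmark window.
frames_per_second = [120, 120, 120, 100, 30, 5, 1, 1, 2, 1]

window = len(frames_per_second)
average_fps = sum(frames_per_second) / window
worst_second = min(frames_per_second)

print(f"average over {window} s: {average_fps:.0f} fps")   # 50 fps -- sounds fine
print(f"worst second: {worst_second} fps")                  # 1 fps -- a slideshow
```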

    In contrast, for filming MOVIES, you basically have an INFINITE framerate (i.e. real world fluidity) that you sample from at a fixed rate. There is no question about when the next frame will happen to finish being rendered -- it is rendered and fully up to date at whatever moment you decide to sample it. And in computer generated videos, you can render at any rate you like, as long as the motion between frames makes sense for the time between them. In other words, there is no constraint that rendering has to be done in real time as it is in a game.



    • #47
      Originally posted by perpetualrabbit View Post
      As a non-gamer, I wonder why the framerate in games is so important.
Some games clip the displayed framerate to a "sane" value, and/or sync the frames to the monitor vblank.

Other games do not use any framerate limit, or deactivate the limit when benchmarking your system. In that case the framerate is limited only by your video driver and hardware. X11 might also limit the framerate, which is why glxgears is not considered a valid benchmarking tool. So a higher fps indicates better video drivers, provided you are not limited by your windowing system.
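For what it's worth, such a framerate cap is conceptually just sleeping off whatever is left of each frame's time budget. A minimal Python sketch (render_one_frame is a hypothetical callback, not any particular engine's API, and real limiters wait on vblank rather than sleeping):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS            # seconds allowed per frame

def run_capped(render_one_frame, num_frames=300):
    """Toy frame limiter: render, then sleep away whatever is left of the budget."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_one_frame()                 # hypothetical game/render callback
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

# A "renderer" that takes 2 ms per frame would run at ~500 fps uncapped,
# but with the cap it settles near 60 fps.
run_capped(lambda: time.sleep(0.002), num_frames=60)
```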

The human eye perceives pictures as fluid motion at more than about 16 to 24 frames per second. So half the refresh rate of a TFT screen (usually 60Hz), i.e. 30Hz, should be more than enough. Why is it not always done that way? Why not use the CPU and GPU cycles to improve picture quality instead of producing more frames per second?
Your eye does more than receive 24 frames per second: it "integrates" the light intensity over each frame. A video camera usually does the same, but if you render 24 sharp snapshots per second, fast motion will not appear fluid. That's why 60fps may appear more fluid even though 24 frames should be enough. You could call this an aliasing effect in time.

You can counter this effect either by simply displaying more frames and letting the eye integrate them again, or by compositing multiple frames into a single motion-blurred frame. The first approach does not require any special features of your graphics hardware or additional work in the rendering code, only raw GPU power.
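The second approach (compositing several sub-frames into one motion-blurred frame) is essentially a box-filtered average in time. A minimal NumPy sketch, where render_at(t) is a hypothetical callback returning the scene as an image at time t:

```python
import numpy as np

def motion_blurred_frame(render_at, frame_start, frame_duration, subframes=4):
    """Box-filter 'shutter': average sub-frames rendered inside one display interval.
    render_at(t) is assumed to return an HxWx3 float image of the scene at time t."""
    times = frame_start + frame_duration * (np.arange(subframes) + 0.5) / subframes
    samples = [render_at(t) for t in times]    # render a few snapshots...
    return np.mean(samples, axis=0)            # ...and blend them into one frame

# Toy usage: a bright pixel moving 120 px/s leaves a short smear instead of one dot.
def render_at(t):
    img = np.zeros((8, 32, 3))
    img[4, int(t * 120) % 32] = 1.0
    return img

blurred = motion_blurred_frame(render_at, frame_start=0.0, frame_duration=1 / 30)
print(np.nonzero(blurred[4, :, 0])[0])         # several lit columns, not just one
```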

Also, is there a way to fix the framerate in games to a certain amount (say 30Hz) and do benchmarking based on (lower is better) CPU and GPU load instead?
Why? Measuring the fps is a simple and reliable benchmarking approach. You just have to remember that your application should be GPU-limited. So if glxgears gives you 6000 fps, which is probably X11-limited, use a more demanding application that runs at 20 fps and use that number as the benchmark.



      • #48
        Originally posted by perpetualrabbit View Post
        A question to Michael or whoever else cares to provide insight:

As a non-gamer, I wonder why the framerate in games is so important. The human eye perceives pictures as fluid motion at more than about 16 to 24 frames per second. So half the refresh rate of a TFT screen (usually 60Hz), i.e. 30Hz, should be more than enough. Or at most 60Hz when doing 3D, so 30 Hz for each eye.
FYI: everything below 100Hz is perceivable by the human brain, which is why 100Hz CRT screens appear flicker-free and fluorescent (TL) tubes (50Hz each) are usually placed in pairs so you don't see the flicker.

And then there is the subconscious, which is not exactly unimportant when competing against other humans online :P And as has been said before: while the conscious part of your brain only gets to process about 30 images per second, each of those 30 images is effectively an average of all the extra frames shown within that interval.

Also, is there a way to fix the framerate in games to a certain amount (say 30Hz) and do benchmarking based on (lower is better) CPU and GPU load instead?
It is called frame limiting (often done with vertical sync -> 60fps) and it is possible. But yeah, that would be very interesting indeed. I don't know if that's so easily tested, BTW...



        • #49
          Originally posted by perpetualrabbit View Post
          A question to Michael or whoever else cares to provide insight:

As a non-gamer, I wonder why the framerate in games is so important. The human eye perceives pictures as fluid motion at more than about 16 to 24 frames per second.
          Sorry, that's bollocks. The human eye can detect flicker in frequencies >100Hz (look at your CRT with the corner of your eye, if you still have one). Pilots have been able to identify planes by seeing a picture for 1/220th of a second.

The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into a perception of fluid motion (when in reality it is anything but fluid). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1-1).

          Edit: http://www.100fps.com/how_many_frame...humans_see.htm



          • #50
            Originally posted by droidhacker View Post
I think that there are two things that it comes down to:

            1) A game's framerate is VARIABLE. What you need is a WORST CASE framerate that is at LEAST some minimum.

2) (more importantly) the point of sampling. If your game is running at 500 fps minimum, then it can sample frames at very precise timing, resulting in fluid output. I.e., it obviously isn't going to display 500 fps, since that is beyond the capabilities of your display device, so it is going to pick SOME frames, spaced equally in time, and display those.

If your game is running at 30 fps and your screen at 60 fps, then you could get, worst case, THREE identical frames back-to-back followed by the next frame being displayed only once (when it should be shown twice). I.e., things are variable: you expect two identical frames, followed by two identical frames, followed by two identical frames, etc. If the game is a little late preparing the third frame, then it shows THREE identical frames, then the FOURTH frame. It is a little difficult to explain without a picture... but this can lead to choppy-looking output.

Next, also consider that just because your game is running at "30 fps", it really doesn't mean that it is *actually* running at 30 fps *at this instant*. To calculate the framerate, it counts how many frames HAVE BEEN generated over some amount of time (which is obviously greater than zero) and divides by that time. So you're looking at the AVERAGE framerate within a particular window of time. If the window happens to be 10 seconds, then it *could* have rendered 500 fps for a few seconds, followed by 0 fps for a few seconds, and it averaged out to 30 fps. This is an extreme example, of course, but it illustrates one of the issues with framerate, since this example will obviously be really ugly to watch.

            In contrast, for filming MOVIES, you basically have an INFINITE framerate (i.e. real world fluidity) that you sample from at a fixed rate. There is no question about when the next frame will happen to finish being rendered -- it is rendered and fully up to date at whatever moment you decide to sample it. And in computer generated videos, you can render at any rate you like, as long as the motion between frames makes sense for the time between them. In other words, there is no constraint that rendering has to be done in real time as it is in a game.
@1 Why pick "some" frame at random in a 1/30th-second interval rather than calculate the exact picture in the middle of the interval? I don't see why calculating 500 pictures and then picking one somewhere in the middle of the stack of 500 would be better.

@2 OK, so you pick equally spaced pictures out of a large number. That still seems a waste, as you could also use the spare cycles to calculate a picture in the middle of the 1/30th-second interval and apply a good blurring algorithm to it, using the previous, current and next picture, or, if you calculate 60 frames per second, some other intelligent interpolation (blurring) algorithm. I know that Pixar does something like this in their computer-animated movies.
Anyway, blurring or interpolation would fall into the category of "using the spare cycles to improve picture quality". Still, you would only display 30 frames per second, which seems to be enough even for Avatar (although that is 30 frames per eye).



            • #51
              Originally posted by BlackStar View Post
              Sorry, that's bollocks. The human eye can detect flicker in frequencies >100Hz (look at your CRT with the corner of your eye, if you still have one). Pilots have been able to identify planes by seeing a picture for 1/220th of a second.

The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into a perception of fluid motion (when in reality it is anything but fluid). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1-1).

              Edit: http://www.100fps.com/how_many_frame...humans_see.htm
              You beat me there, I was about to post the same link. HOWEVER! I'm a resourceful man:

Awesome link #1. A bit more detailed than the one at 100fps.com.

Awesome link #2. A YouTube video showing the difference between motion blur and regular rendering in GTA IV. Everybody should understand this one :P



              • #52
                Originally posted by colo View Post
Am I the only one who thinks the graphing of the results is absolutely abhorrent? To get any valuable information out of the mess that is, for example, the GL and XV video performance graphs, they should have been at least two times their size. It'd also be nice to have access to the numeric data in tabular form as an option. More often than not, I'd like to compare numbers instead of trying hard to figure out which shade of $colour represents which card…
Fully agree!
                Those graphs are pretty hard to read and understand.



                • #53
                  Originally posted by BlueKoala View Post
                  Think the 2600 was running hot maybe?
                  That wouldn't explain the increased CPU usage with that GPU. If it was running at a lower clock speed, it would have lower GPU performance, but the CPU usage should be roughly the same.



                  • #54
                    Originally posted by BlackStar View Post
                    Sorry, that's bollocks. The human eye can detect flicker in frequencies >100Hz (look at your CRT with the corner of your eye, if you still have one). Pilots have been able to identify planes by seeing a picture for 1/220th of a second.

The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into a perception of fluid motion (when in reality it is anything but fluid). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1-1).

                    Edit: http://www.100fps.com/how_many_frame...humans_see.htm
                    So calculate 30 blurred images per second with a good blurring algorithm. This might involve actually calculating 60 images and then doing some kind of weighted interpolation between the current, the previous two and next two "still frames", producing a current "blurred frame".
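A rough sketch of what such a weighted interpolation could look like (weights and helper names are invented, assuming frames are rendered at 60 fps and output at 30 fps):

```python
import numpy as np

# Invented weights for blending the two previous, current, and two next frames.
WEIGHTS = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # sums to 1

def blurred_output(rendered_frames):
    """rendered_frames: list of HxWx3 float arrays rendered at 60 fps.
    Returns every second frame blended with its temporal neighbours (30 fps out)."""
    out = []
    for i in range(2, len(rendered_frames) - 2, 2):       # step 2: 60 in -> 30 out
        window = np.stack(rendered_frames[i - 2:i + 3])   # 5 consecutive frames
        out.append(np.tensordot(WEIGHTS, window, axes=1)) # weighted average
    return out

frames = [np.full((4, 4, 3), float(i)) for i in range(12)]  # dummy 60 fps frames
print(len(blurred_output(frames)))                          # -> 4 blended frames
```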

I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ fluid motion from 16 frames per second onwards. I can only perceive flicker in a CRT when looking at it sideways, and only if the refresh rate is under 70 Hz. The pilot seeing a picture for 1/220th of a second actually sees blackness (or whiteness), then an aeroplane, then blackness (or whiteness) again. Integration over time yields a dim or vague picture of a plane. I bet your pilot cannot do it if you show him something other than an empty picture before and after the brief flash of the plane, like embedding one frame of an aeroplane in a movie. Of course, a flash of a fighter plane in the clouds is also a case of almost featureless images with a brief flash of something else. It is really not that special that pilots can recognize the type of plane from only a brief flash, since they are trained to recognize planes.

Yes, of course fluidity has to do with blurring, so why is synthetic blurring not used in games, thereby fixing the framerate?
Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
                    So I say more frames is not better; more frames is waste.



                    • #55
                      PTS runs every test three times, doesn't it? What are the 3 values for the 2600pro?
I'd like to see this information in the graphs, too. Not sure how to do it in a good-looking way, though.



                      • #56
                        Originally posted by perpetualrabbit View Post
I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ fluid motion from 16 frames per second onwards.
                        This is not true. You can perceive it under certain conditions, and this is due to the motion blur inherent in the camera filming process, but even then, the "smoothness" will depend on the size and speed of the motion.

                        Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.

Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
                        More frames per second ensure that you have faster response times in most games I'm aware of. Perhaps this could be avoided through more clever programming, I don't know, but it remains a fact.

                        And in most Quake-based games, physics are FPS-related. Quake3 players all cap their FPS to 125 exactly.

                        Pro players actually use CRTs with a 120Hz refresh rate.

                        So I say more frames is not better; more frames is waste.
                        Not all frames are equally difficult to render. Rendering as many as you can is a much more robust approach than trying to guess how much you should blur this or that, when there's no way to predict how long it will take to render the next frame.

The difference between 30 fps and 120 fps becomes obvious when your fps dips during very complex scenes.



                        • #57
I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ fluid motion from 16 frames per second onwards.
Movies are actually displayed at 72fps to combat flicker (each frame is displayed three times). 3D movies (with glasses) run at 144fps.

Yes, of course fluidity has to do with blurring, so why is synthetic blurring not used in games, thereby fixing the framerate?
Because motion blur (or temporal antialiasing) was impossible to implement until fairly recently. Crysis was the first game I've seen with believable motion blur. GTA4 is also cool, but earlier attempts, like GTA3, were nausea-inducing (ugly!)

                          Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful.
That's not correct, as any FPS gamer will attest. At 30fps you run at 33.3ms per frame, which leaves you 16.6ms behind 60fps players (that's exactly one 60fps frame behind). This can and does affect the performance of good players.

Finally, the current trend in monitors is towards 120Hz rather than 60Hz. That's very good, as it allows the display of 3D stereo images at 60fps per eye - the minimum comfortable rate for active 3D glasses.

                          Edit:
                          Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.
                          *Great* example. I was thinking about LOTR while writing my first reply.



                          • #58
                            FWIW, my mom can't tell the difference between 10 fps and 20 fps. But she's not a pro-gamer.



                            • #59
                              Show her a Disney movie (24 fps) and then show her Voltron (12 fps) and ask her if she can tell the difference.



                              • #60
                                Originally posted by perpetualrabbit View Post
                                So calculate 30 blurred images per second with a good blurring algorithm. This might involve actually calculating 60 images and then doing some kind of weighted interpolation between the current, the previous two and next two "still frames", producing a current "blurred frame".
Yeah, great, add some client-side millisecond lag to a multiplayer game. <_<' Oh hey... my character responds half a second too late to my keyboard and mouse... awesome, just missed a headshot and got killed

I can only perceive flicker in a CRT when looking at it sideways, and only if the refresh rate is under 70 Hz.
                                Even 75Hz is killing my brain.

The pilot seeing a picture for 1/220th of a second actually sees blackness (or whiteness), then an aeroplane, then blackness (or whiteness) again.
You don't have the slightest understanding of the algorithms/methods used by brains to perceive images, movement and position...

It is really not that special that pilots can recognize the type of plane from only a brief flash, since they are trained to recognize planes.
                                Fiiiiiiiiiiighterpiloooooooootsssss >.<

Yes, of course fluidity has to do with blurring, so why is synthetic blurring not used in games, thereby fixing the framerate?
                                Because it's not blur but smudge and doesn't do anything but fsck up the visual information.

                                Or is it actually used?
                                Yes, in console games with suckage framerate, like GTA IV on the PS3...

Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
                                So I say more frames is not better; more frames is waste.
Okay... Multiplayer shooters:

You are playing against a friend. One shot kills. You and your friend run into each other in a hallway, from opposite ends.

You have latency from traffic back and forth with the server you are playing on.
You have latency from one game loop (until your CPU has processed the data and the GPU has written to the framebuffer).
You have latency from the x-Hz clock that drives scanout from the framebuffer to the screen.
You have latency from the x-Hz clock at which the screen refreshes its input.

                                Then you and your friend both hit the mouse button upon seeing each other...

                                First come, first served.

                                Still think the amount of extra frames doesn't matter? Still think that latency from vsync is a good idea?
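To put rough numbers on that chain (all figures invented for illustration), summing the contributions at two different frame/refresh rates:

```python
# Toy input-to-photon latency budget (all figures invented, roughly worst-case).
def total_latency_ms(fps, refresh_hz, network_ms=30.0, panel_ms=5.0):
    game_loop_ms = 1000.0 / fps          # one full simulate + render iteration
    scanout_ms   = 1000.0 / refresh_hz   # waiting for the next refresh to show it
    return network_ms + game_loop_ms + scanout_ms + panel_ms

print(total_latency_ms(fps=30,  refresh_hz=60))    # ~85 ms
print(total_latency_ms(fps=125, refresh_hz=120))   # ~51 ms
```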

                                PS: sorry for the rant

