Open-Source ATI R600/700 Mesa 3D Performance

  • #51
    Originally posted by BlackStar View Post
    Sorry, that's bollocks. The human eye can detect flicker at frequencies >100Hz (look at your CRT out of the corner of your eye, if you still have one). Pilots have been able to identify planes from a picture shown for just 1/220th of a second.

    The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into a fluid perception of motion (when it is anything but fluid in reality). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1:1).

    Edit: http://www.100fps.com/how_many_frame...humans_see.htm
    You beat me to it; I was about to post the same link. HOWEVER! I'm a resourceful man:

    Awesome link #1. A bit more detailed than the one at 100fps.com.

    Awesome link #2. YouTube video showing the difference between motion blur and regular rendering in GTA IV. Everybody should understand this one :P



    • #52
      Originally posted by colo View Post
      Am I the only one who thinks the graphing of the results is absolutely abhorrent? To get any valuable information out of the mess that is, for example, the GL and XV video performance graphs, they should have been at least twice their size. It'd also be nice to have access to the numeric data in tabular form as an option. More often than not, I'd like to compare numbers instead of trying hard to figure out which shade of $colour represents which card.
      I completely agree!
      Those graphs are pretty hard to read and understand.



      • #53
        Originally posted by BlueKoala View Post
        Think the 2600 was running hot maybe?
        That wouldn't explain the increased CPU usage with that GPU. If it were running at a lower clock speed, it would have lower GPU performance, but the CPU usage should stay roughly the same.



        • #54
          Originally posted by BlackStar View Post
          Sorry, that's bollocks. The human eye can detect flicker at frequencies >100Hz (look at your CRT out of the corner of your eye, if you still have one). Pilots have been able to identify planes from a picture shown for just 1/220th of a second.

          The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into a fluid perception of motion (when it is anything but fluid in reality). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1:1).

          Edit: http://www.100fps.com/how_many_frame...humans_see.htm
          So calculate 30 blurred images per second with a good blurring algorithm. This might involve actually calculating 60 images and then doing some kind of weighted interpolation between the current, the previous two and the next two "still frames", producing a current "blurred frame".
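
          In rough Python/NumPy form, that could look something like the sketch below (purely illustrative: the five weights and the blurred_frame helper are arbitrary choices of mine, not a tuned filter):

          Code:
          # Blend five consecutive rendered stills into one "blurred frame".
          # Frames are assumed to be NumPy arrays of identical shape (H x W x 3, floats in 0..1);
          # the symmetric weights are an arbitrary example, not a tuned filter kernel.
          import numpy as np

          WEIGHTS = (0.1, 0.2, 0.4, 0.2, 0.1)  # previous two, current, next two

          def blurred_frame(frames):
              """frames: the five surrounding stills, oldest first; returns their weighted blend."""
              assert len(frames) == len(WEIGHTS)
              acc = np.zeros_like(frames[0], dtype=np.float64)
              for weight, frame in zip(WEIGHTS, frames):
                  acc += weight * frame
              return acc

          Advancing the five-frame window by two stills per output turns 60 rendered images per second into 30 blended frames, as described above.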

          I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards. I can only perceive flicker in a CRT, looking sideways at it, if the refresh is under 70 Hz. The pilot seeing a picture for 1/220th of a second actually sees blackness (or whiteness), then an aeroplane, then blackness (or whiteness) again. Integration over time yields a dim or vague picture of a plane. I bet your pilot could not do it if you showed him something other than an empty picture before and after the brief flash of the plane, like embedding one frame of an aeroplane in a movie. Of course, a flash of a fighter plane in the clouds is also a case of almost featureless images with a brief flash of something else. It is really not that special that pilots can recognize the type of plane from only a brief flash, since they are trained to recognize planes.

          Yes, of course fluidity has to do with blurring, so why isn't synthetic blurring used in games, thereby fixing the framerate?
          Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful. That is, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
          So I say more frames is not better; more frames is waste.



          • #55
            PTS runs every test three times, doesn't it? What are the 3 values for the 2600pro?
            I'd like to see this information in the graphs, too. Not sure how to do it in a good-looking way, though.



            • #56
              Originally posted by perpetualrabbit View Post
              I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards.
              This is not true. You can perceive it under certain conditions, and this is due to the motion blur inherent in the camera filming process, but even then, the "smoothness" will depend on the size and speed of the motion.

              Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.

              Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful. That is, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
              More frames per second ensure faster response times in most games I'm aware of. Perhaps this could be avoided through more clever programming, I don't know, but it remains a fact.

              And in most Quake-based games, physics are tied to the framerate. Quake3 players all cap their FPS at exactly 125.

              Pro players actually use CRTs with a 120Hz refresh rate.

              So I say more frames is not better; more frames is waste.
              Not all frames are equally difficult to render. Rendering as many as you can is a much more robust approach than trying to guess how much you should blur this or that, when there's no way to predict how long it will take to render the next frame.

              The difference between 30 fps and 120 fps becomes obvious when your framerate dips during very complex scenes.
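
              To put rough numbers on that last point (purely illustrative; the 3x factor is just an arbitrary stand-in for a suddenly complex scene):

              Code:
              # How much a frame that takes 3x the average render time hurts, per baseline fps.
              for base_fps in (120, 60, 30):
                  avg_ms = 1000.0 / base_fps
                  heavy_ms = 3 * avg_ms
                  print(f"{base_fps:3d} fps baseline: {avg_ms:5.1f} ms/frame on average, "
                        f"a 3x-heavy frame takes {heavy_ms:5.1f} ms")

              At a 120fps baseline the heavy frame still fits in 25 ms, barely worse than a normal 30fps frame; at a 30fps baseline the same relative spike is a 100 ms hitch.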



              • #57
                I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards.
                Movies are actually projected at 72Hz to combat flicker (each 24fps frame is flashed three times). 3D movies (with glasses) run at 144Hz.

                Yes, of course fluidity has to do with blurring, so why isn't synthetic blurring used in games, thereby fixing the framerate?
                Because motion blur (or temporal antialiasing) was impossible to implement until pretty recently. Crysis was the first game I've seen with believable motion blur. GTA4 is also cool but earlier attempts, like GTA3, were nausea-inducing (ugly!)

                Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful.
                That's not correct, as any FPS gamer will attest. At 30fps you run at 33.3ms per frame, which leaves you 16.6ms behind 60fps players (that's exactly one 60fps frame behind). This can and does affect the performance of good players.
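
                Spelling out that arithmetic (nothing here beyond the numbers in the paragraph above):

                Code:
                # Frame period at a given framerate, and the gap between 30fps and 60fps players.
                def frame_ms(fps):
                    return 1000.0 / fps

                print(frame_ms(30))                 # 33.3 ms per frame
                print(frame_ms(60))                 # 16.7 ms per frame
                print(frame_ms(30) - frame_ms(60))  # 16.7 ms, i.e. one full 60fps frame behind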

                Finally, the current trend in monitors is towards 120Hz rather than 60Hz. That's very good, as it allows the display of stereo 3D images at 60fps per eye - the minimum comfortable rate for active 3D glasses.

                Edit:
                Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.
                *Great* example. I was thinking about LOTR while writing my first reply.



                • #58
                  FWIW, my mom can't tell the difference between 10 fps and 20 fps. But she's not a pro-gamer.



                  • #59
                    Show her a Disney movie (24 fps) and then show her Voltron (12 fps) and ask her if she can tell the difference.



                    • #60
                      Originally posted by perpetualrabbit View Post
                      So calculate 30 blurred images per second with a good blurring algorithm. This might involve actually calculating 60 images and then doing some kind of weighted interpolation between the current, the previous two and the next two "still frames", producing a current "blurred frame".
                      Yeah great, add some client-side millisecond lag to a multiplayer game. <_<' Oh hey... my character responds half a second too late to my keyboard and mouse... awesome, just missed a headshot and got killed.

                      I can only perceive flicker in a CRT, looking sideways at it, if the refresh is under 70 Hz.
                      Even 75Hz is killing my brain.

                      The pilot seeing a picture for 1/220th of a second actually sees blackness (or whiteness), then an aeroplane, then blackness (or whiteness) again.
                      You don't have the slightest understanding of the algorithms/methods used by the brain to perceive images, movement and position...

                      It is really not that special that pilots can recognize the type of plane from only a brief flash, since they are trained to recognize planes.
                      Fiiiiiiiiiiighterpiloooooooootsssss >.<

                      Yes, of course fluidity has to do with blurring, so why isn't synthetic blurring used in games, thereby fixing the framerate?
                      Because it's not blur but smudging, and it doesn't do anything but fsck up the visual information.

                      Or is it actually used?
                      Yes, in console games with sucky framerates, like GTA IV on the PS3...

                      Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful. That is, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
                      So I say more frames is not better; more frames is waste.
                      Okay... Multiplayer shooters:

                      You are playing against a friend. One shot kills. You and your friend run into each other in a hallway, coming from opposite ends.

                      You have latency from the traffic back and forth with the server you are playing on.
                      You have latency from one game loop (until your CPU has processed the data and the GPU has written to the framebuffer).
                      You have latency from the x Hz clock that drives the framebuffer out to the screen.
                      You have latency from the x Hz clock at which the screen refreshes its input.

                      Then you and your friend both hit the mouse button upon seeing each other...

                      First come, first served.

                      Still think those extra frames don't matter? Still think the latency from vsync is a good idea?
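
                      For what it's worth, here is a rough sketch of how that chain adds up (every millisecond value is a made-up placeholder; the only point is that the framerate-dependent terms shrink as fps rises while the network term does not):

                      Code:
                      # Sum the latency chain described above. All numbers are made-up placeholders.
                      def click_to_photon_ms(fps, display_hz=60, network_ms=30.0):
                          game_loop = 1000.0 / fps          # one game loop / render interval
                          scanout   = 1000.0 / display_hz   # framebuffer-to-screen transfer interval
                          panel     = 1000.0 / display_hz   # the screen refreshing its input
                          return network_ms + game_loop + scanout + panel

                      for fps in (30, 60, 125):
                          print(f"{fps:3d} fps: ~{click_to_photon_ms(fps):.1f} ms from click to photon")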

                      PS: sorry for the rant

