Open-Source ATI R600/700 Mesa 3D Performance


  • pingufunkybeat
    replied
    Show her a Disney movie (24 fps) and then show her Voltron (12 fps) and ask her if she can tell the difference.



  • that guy
    replied
    FWIW, my mom can't tell the difference between 10 fps and 20 fps. But she's not a pro-gamer.



  • BlackStar
    replied
    Originally posted by perpetualrabbit View Post
    I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards.
    Movies are actually displayed at 72fps to combat flicker (each frame is displayed 3 times). 3d movies (with glasses) run at 144fps.

    Originally posted by perpetualrabbit View Post
    Yes, of course fluidity has to do with blurring, so why isn't synthetic blurring used in games, thereby fixing the framerate?
    Because motion blur (or temporal antialiasing) was impossible to implement until pretty recently. Crysis was the first game I've seen with believable motion blur. GTA4 is also cool, but earlier attempts, like GTA3, were nausea-inducing (ugly!).

    Originally posted by perpetualrabbit View Post
    Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful.
    That's not correct, as any FPS gamer will attest. At 30fps you run at 33.3ms per frame, which leaves you 16.6ms behind 60fps players (that's exactly one 60fps frame behind). This can and does affect the performance of good players (there's a quick arithmetic sketch at the end of this comment).

    Finally, the current trend in monitors is towards 120Hz rather than 60Hz. That's very good, as it allows the display of 3d stereo images at 60fps per eye - the minimum comfortable rate for active 3d glasses.

    Edit:
    Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.
    *Great* example. I was thinking about LOTR while writing my first reply.
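
    A quick back-of-the-envelope sketch of the frame-time arithmetic above (nothing here is measured; the fps values are simply the ones mentioned in the post):

    ```python
    # Frame-time budgets for the framerates discussed above (illustrative only).
    def frame_time_ms(fps):
        """Time budget per frame, in milliseconds."""
        return 1000.0 / fps

    for fps in (30, 60, 120):
        print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

    # A 30 fps player is one full 60 fps frame (~16.7 ms) behind a 60 fps player.
    print(f"gap: {frame_time_ms(30) - frame_time_ms(60):.1f} ms")
    ```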



  • pingufunkybeat
    replied
    Originally posted by perpetualrabbit View Post
    I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards.
    This is not true. You can perceive it under certain conditions, and this is due to the motion blur inherent in the camera filming process, but even then, the "smoothness" will depend on the size and speed of the motion.

    Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.

    Originally posted by perpetualrabbit View Post
    Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful. Provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
    More frames per second means faster response times in most games I'm aware of. Perhaps this could be avoided through more clever programming, I don't know, but it remains a fact.

    And in most Quake-based games, physics are FPS-related. Quake3 players all cap their FPS to 125 exactly. (A toy sketch of why a frame-coupled physics step depends on the framerate is at the end of this comment.)

    Pro players actually use CRTs with a 120Hz refresh rate.

    Originally posted by perpetualrabbit View Post
    So I say more frames is not better; more frames is waste.
    Not all frames are equally difficult to render. Rendering as many as you can is a much more robust approach than trying to guess how much you should blur this or that, when there's no way to predict how long it will take to render the next frame.

    The difference between 30 fps and 120 fps becomes obvious when your fps dips during very complex scenes.
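
    Since framerate-dependent physics came up here, a toy sketch (hypothetical constants, not actual Quake code) of why a physics step that uses the frame time as its timestep gives slightly different results at different framerates:

    ```python
    # Illustrative only: an explicit-Euler jump whose timestep is the frame
    # time. The peak height drifts with the framerate, which is the kind of
    # effect that makes players cap their fps at a specific value.
    GRAVITY = 800.0      # made-up units/s^2
    JUMP_SPEED = 270.0   # made-up units/s

    def max_jump_height(fps):
        dt = 1.0 / fps
        height, velocity, peak = 0.0, JUMP_SPEED, 0.0
        while True:
            height += velocity * dt
            velocity -= GRAVITY * dt
            peak = max(peak, height)
            if height < 0:       # back on the ground
                return peak

    for fps in (60, 125, 333):
        print(f"{fps:>3} fps -> peak height {max_jump_height(fps):.2f}")
    ```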



  • bugmenot
    replied
    PTS runs every test three times, doesn't it? What are the 3 values for the 2600pro?
    I'd like to see this information in the graphs, too. Not sure how to do it in a good-looking way, though.
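
    One way this could look (purely a sketch with invented numbers; this is not how the Phoronix Test Suite actually draws its graphs) is to plot the mean per card and overlay the three individual run values:

    ```python
    # Hypothetical data: three runs per card, values invented for illustration.
    import matplotlib.pyplot as plt

    runs = {
        "HD 2600 Pro": [24.1, 25.3, 23.8],
        "HD 3650":     [31.0, 30.6, 31.4],
        "HD 3870":     [44.2, 43.9, 44.7],
    }

    labels = list(runs)
    means = [sum(v) / len(v) for v in runs.values()]
    positions = list(range(len(labels)))

    fig, ax = plt.subplots(figsize=(6, 3))
    ax.bar(positions, means, color="lightsteelblue", label="mean of 3 runs")
    for i, values in enumerate(runs.values()):
        ax.scatter([i] * len(values), values, color="black", s=15, zorder=3)
    ax.set_xticks(positions)
    ax.set_xticklabels(labels)
    ax.set_ylabel("FPS")
    ax.legend()
    fig.tight_layout()
    plt.show()
    ```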



  • perpetualrabbit
    replied
    Originally posted by BlackStar View Post
    Sorry, that's bollocks. The human eye can detect flicker at frequencies >100Hz (look at your CRT out of the corner of your eye, if you still have one). Pilots have been able to identify planes by seeing a picture for 1/220th of a second.

    The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into perceiving fluid motion (when it is anything but fluid in reality). Remove the motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1:1).

    Edit: http://www.100fps.com/how_many_frame...humans_see.htm
    So calculate 30 blurred images per second with a good blurring algorithm. This might involve actually calculating 60 images and then doing some kind of weighted interpolation between the current frame, the previous two and the next two "still frames", producing a current "blurred frame" (there's a small sketch of this idea at the end of this comment).

    I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards. I can only perceive flicker on a CRT, looking at it sideways, if the refresh rate is under 70 Hz. The pilot seeing a picture for 1/220th of a second actually sees blackness (or whiteness), then an aeroplane, then blackness (or whiteness) again. Integration over time yields a dim or vague picture of a plane. I bet your pilot cannot do it if you show him something other than an empty picture before and after the brief flash of the plane, like embedding one frame of an aeroplane in a movie. Of course, a flash of a fighter plane in the clouds is also a case of almost featureless images with a brief flash of something else. It is really not that special that pilots can recognize the type of plane from only a brief flash, since they are trained to recognize planes.

    Yes, of course fluidity has to do with blurring, so why isn't synthetic blurring used in games, thereby fixing the framerate?
    Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful. Provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
    So I say more frames is not better; more frames is waste.
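
    A minimal sketch of the weighted-interpolation idea from this post, under heavy simplifying assumptions (frames are plain arrays, and the 5-tap weights are made up rather than taken from any real renderer):

    ```python
    # Render more frames than you display, then average each frame with its
    # neighbours using a small weighting kernel before showing every second one.
    import numpy as np

    def blur_frames(frames, weights=(0.1, 0.2, 0.4, 0.2, 0.1)):
        """Weighted average of each frame with two neighbours on either side."""
        half = len(weights) // 2
        blurred = []
        for i in range(half, len(frames) - half):
            window = frames[i - half:i + half + 1]
            blurred.append(sum(w * f for w, f in zip(weights, window)))
        return blurred

    # Fake 64x64 greyscale "rendered" frames, 60 of them for one second.
    rendered = [np.random.rand(64, 64) for _ in range(60)]
    blurred = blur_frames(rendered)
    displayed = blurred[::2]   # keep every second blurred frame -> ~30 fps output
    print(len(rendered), "rendered,", len(displayed), "displayed")
    ```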



  • agd5f
    replied
    Originally posted by BlueKoala View Post
    Think the 2600 was running hot maybe?
    That wouldn't explain the increased CPU usage with that GPU. If it was running at a lower clock speed, it would have lower GPU performance, but the CPU usage should be roughly the same.



  • blackshard
    replied
    Originally posted by colo View Post
    Am I the only one who thinks the graphing of the results is absolutely abhorrent? To get any valuable information out of a mess like, for example, the GL and XV video performance graphs, they would need to be at least twice their size. It'd also be nice to have access to the numeric data in tabular form as an option. More often than not, I'd like to compare numbers instead of trying hard to figure out which shade of $colour represents which card.
    Perfectly agree!
    Those graphs are pretty hard to read and understand.



  • yotambien
    replied
    Originally posted by BlackStar View Post
    Sorry, that's bollocks. The human eye can detect flicker at frequencies >100Hz (look at your CRT out of the corner of your eye, if you still have one). Pilots have been able to identify planes by seeing a picture for 1/220th of a second.

    The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into perceiving fluid motion (when it is anything but fluid in reality). Remove the motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1:1).

    Edit: http://www.100fps.com/how_many_frame...humans_see.htm
    You beat me there; I was about to post the same link. HOWEVER! I'm a resourceful man:

    Awesome link #1. A bit more detailed than the one at 100fps.com.

    Awesome link #2. YouTube video showing the difference between motion blur and regular rendering in GTA IV. Everybody should understand this one :P



  • perpetualrabbit
    replied
    Originally posted by droidhacker View Post
    I think that there are two things that it comes down to;

    1) A game's framerate is VARIABLE. What you need is a WORST CASE framerate that is at LEAST some minimum.

    2) (more importantly) point of sampling. If your game is running at 500 fps minimum, then it can sample frames at very precise times, resulting in fluid motion. I.e., it obviously isn't going to display 500 fps, since that is beyond the capabilities of your display device, so it is going to pick SOME frames, spaced equally in time, and display those.

    If your game is running at 30 fps and your screen at 60 fps, then you could get, worst case, THREE identical frames back-to-back, followed by the next frame being displayed only once (when it should be twice). I.e., because things are variable, you expect it to display two identical frames, followed by two identical frames, followed by two identical frames, etc. If it is a little bit late preparing the third frame, then it shows THREE identical frames, then the FOURTH frame. It is a little difficult to explain without a picture... but this can lead to choppy-looking output.

    Next, also consider that just because your game is running at "30 fps", it really doesn't mean that it is *actually* running at 30 fps *at this instant*. What it does to calculate the framerate is count how many frames HAVE BEEN generated over some amount of time (which is obviously greater than zero) and divide by that time. So you're looking at the AVERAGE framerate within a particular window of time. If the window happens to be 10 seconds, then it *could* have rendered 500 fps for a few seconds, followed by 0 fps for a few seconds, and it averaged out to 30 fps. This is an extreme example, of course, but it illustrates one of the issues with framerate, since this example will obviously be really ugly to watch.

    In contrast, for filming MOVIES, you basically have an INFINITE framerate (i.e. real world fluidity) that you sample from at a fixed rate. There is no question about when the next frame will happen to finish being rendered -- it is rendered and fully up to date at whatever moment you decide to sample it. And in computer generated videos, you can render at any rate you like, as long as the motion between frames makes sense for the time between them. In other words, there is no constraint that rendering has to be done in real time as it is in a game.
    @1 Why pick "some" frame at random in a 1/30th-second interval rather than calculate the exact picture in the middle of the interval? I don't see why calculating 500 pictures and then picking the best one somewhere in the middle of the stack of 500 would be better.

    @2 Ok, so you pick equally spaced pictures out of a large number. That still seems a waste, as you could also use the spare cycles to calculate a picture in the middle of the 1/30th-second interval and apply a good blurring algorithm to it, using the previous, current and next picture, or, if you calculate 60 frames per second, some other intelligent interpolation (blurring) algorithm. I know that Pixar does something like this in their computer-animated movies.
    Anyway, blurring or interpolation would fall in the category of "using the spare cycles to improve picture quality". Still, you would only display 30 frames per second, which seems to be enough even for Avatar (although that is 30 frames per eye).
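
    To make the "average framerate" point from the quoted post concrete, a toy calculation with invented numbers (300 frames rendered in the first 0.6 s of a 10 s window, then nothing):

    ```python
    # Toy numbers only: the same 10 s window reports "30 fps" on average even
    # though all the rendering happened in a short burst at ~500 fps.
    frames_rendered = 300
    busy_seconds = 0.6
    window_seconds = 10.0

    instantaneous_fps = frames_rendered / busy_seconds   # ~500 fps while busy
    average_fps = frames_rendered / window_seconds       # reported average: 30 fps

    print(f"while rendering: {instantaneous_fps:.0f} fps")
    print(f"{window_seconds:.0f} s average: {average_fps:.0f} fps")
    ```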

