Open-Source ATI R600/700 Mesa 3D Performance
-
Show her a Disney movie (24 fps) and then show her Voltron (12 fps) and ask her if she can tell the difference.
-
FWIW, my mom can't tell the difference between 10 fps and 20 fps. But she's not a pro-gamer.
-
I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards.
Yes, of course fluidity has to do with blurring, so why isn't synthetic motion blur used in games, thereby fixing the framerate problem?
Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful.
Finally, the current trend in monitors is towards 120Hz rather than 60Hz. That's very good, as it allows the display of stereo 3D images at 60fps per eye - the minimum comfortable rate for active 3D glasses.
Edit: Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. The mountains jump about 2 metres at a time, making your head hurt.
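To put a rough number on that judder (the figures here are only illustrative, not measured from the film): if a 24 fps pan sweeps past distant scenery at about 50 metres of landscape per second, each frame displaces everything by roughly 50 / 24 ≈ 2 metres, with no motion blur in between to smooth the jump - which is exactly the effect described above.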
-
Originally posted by perpetualrabbit View Post
I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards.
Go watch "Lord of the Rings" in a cinema, sit in the first row, and then concentrate during any panning landscape scene. Mountains jump around 2 metres at a time, making your head hurt.
Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
And in most Quake-based games, physics are FPS-related. Quake3 players all cap their FPS to 125 exactly.
Pro players actually use CRTs with a 120Hz refresh rate.
So I say more frames is not better; more frames is waste.
The difference between 30 fps and 120 fps becomes obvious when your fps dips during very complex scenes.
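For anyone wondering why 125 exactly: Quake-style engines round the frame time to whole milliseconds before integrating player movement, so the effective physics timestep (and with it things like jump height) shifts with the fps cap. Below is a rough, self-contained C sketch of how a millisecond-rounded timestep makes simple Euler physics fps-dependent; it is not actual id Tech 3 code, and the gravity and jump-velocity numbers are made up purely for illustration.

#include <stdio.h>

/* Illustrative only: frame time rounded to whole milliseconds, as in
   integer frame timers, so the physics timestep depends on the fps cap. */
static double jump_apex(int fps)
{
    int msec = 1000 / fps;          /* integer truncation of the frame time */
    double dt = msec / 1000.0;      /* effective physics timestep in seconds */
    double v = 270.0;               /* hypothetical upward jump velocity (units/s) */
    double g = 800.0;               /* hypothetical gravity (units/s^2) */
    double z = 0.0, apex = 0.0;

    while (v > 0.0) {               /* simple Euler integration up to the apex */
        z += v * dt;
        v -= g * dt;
        if (z > apex)
            apex = z;
    }
    return apex;
}

int main(void)
{
    int caps[] = { 60, 76, 125, 333 };
    for (int i = 0; i < 4; i++)
        printf("cap %3d fps -> timestep %d ms -> apex %.1f units\n",
               caps[i], 1000 / caps[i], jump_apex(caps[i]));
    return 0;
}

Compiled with any C compiler, it prints a slightly different apex for each cap - the kind of fps-dependent behaviour that makes competitive players pin their framerate to one exact value.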
-
PTS runs every test three times, doesn't it? What are the 3 values for the 2600pro?
I'd like to see this information in the graphs, too. Not sure how to present it in a good-looking way, though.
-
Originally posted by BlackStar View Post
Sorry, that's bollocks. The human eye can detect flicker at frequencies >100Hz (look at your CRT out of the corner of your eye, if you still have one). Pilots have been able to identify planes from a picture shown for 1/220th of a second.
The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into perceiving fluid motion (when it is anything but fluid in reality). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1:1).
Edit: http://www.100fps.com/how_many_frame...humans_see.htm
I know that the human visual system is not that simple, but I said that in movies, most people _perceive_ motion as fluid from 16 frames per second onwards. I can only perceive flicker in a CRT when looking at it sideways if the refresh rate is under 70 Hz. The pilot seeing a picture for 1/220th of a second actually sees blackness (or whiteness), then an aeroplane, then blackness (or whiteness) again. Integration over time yields a dim or vague picture of a plane. I bet your pilot could not do it if you showed him something other than an empty picture before and after the brief flash of the plane, for example by embedding one frame of an aeroplane in a movie. Of course, a flash of a fighter plane in the clouds is also a case of almost featureless images with a brief flash of something else. It is really not that special that pilots can recognize the type of plane from only a brief flash, since they are trained to be able to recognize planes.
Yes, of course fluidity has to do with blurring, so why isn't synthetic motion blur used in games, thereby fixing the framerate problem?
Or is it actually used? Anyway, I still maintain that >30 frames per second in games (or GUIs) is not useful, provided the picture quality is good enough, and a blurring algorithm seems to be needed for that.
So I say more frames is not better; more frames is waste.
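As an aside on the "use the spare cycles for blur instead of more displayed frames" idea: the minimal C sketch below shows the principle of accumulation-style motion blur, i.e. rendering several sub-frames inside one 1/30 s display interval and averaging them into the frame that is actually shown. It is a toy illustration (the tiny "framebuffer" and the moving one-pixel "object" are made up); real engines do the same thing per pixel on the GPU, typically with velocity buffers or by blending recent frames.

#include <stdio.h>

#define WIDTH      4            /* tiny illustrative "framebuffer" */
#define HEIGHT     1
#define SUBFRAMES  4            /* sub-frames rendered inside one 1/30 s interval */

/* Pretend-render one sub-frame: a bright one-pixel "object"
   moving one pixel to the right per sub-frame. */
static void render_subframe(float frame[HEIGHT][WIDTH], int step)
{
    for (int x = 0; x < WIDTH; x++)
        frame[0][x] = (x == step) ? 1.0f : 0.0f;
}

int main(void)
{
    float accum[HEIGHT][WIDTH] = { { 0 } };
    float sub[HEIGHT][WIDTH];

    /* Average the sub-frames: this is the synthetic motion blur. */
    for (int s = 0; s < SUBFRAMES; s++) {
        render_subframe(sub, s);
        for (int x = 0; x < WIDTH; x++)
            accum[0][x] += sub[0][x] / SUBFRAMES;
    }

    printf("displayed 1/30 s frame: ");
    for (int x = 0; x < WIDTH; x++)
        printf("%.2f ", accum[0][x]);  /* 0.25 in every pixel the object crossed */
    printf("\n");
    return 0;
}

The displayed result is a smear along the object's path rather than one hard-edged position, which is what film gets for free from its exposure time.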
-
Originally posted by colo View Post
Am I the only one who thinks the graphing of the results is absolutely abhorrent? To get any valuable information out of the mess that is, for example, the GL and XV video performance graphs, they would need to be at least twice their current size. It'd also be nice to have access to the numeric data in tabular form as an option. More often than not, I'd like to compare numbers instead of trying hard to figure out which shade of $colour represents which card.
Those graphs are pretty hard to read and understand.
-
Originally posted by BlackStar View Post
Sorry, that's bollocks. The human eye can detect flicker at frequencies >100Hz (look at your CRT out of the corner of your eye, if you still have one). Pilots have been able to identify planes from a picture shown for 1/220th of a second.
The only reason 24Hz is acceptable in film is that the filming process introduces motion blur into the picture, which fools the eye into perceiving fluid motion (when it is anything but fluid in reality). Remove motion blur, as on a PC monitor, and you'll be able to tell the difference between 60, 85 and 120fps even under best-case conditions (stable frame intervals that match the screen refresh rate 1:1).
Edit: http://www.100fps.com/how_many_frame...humans_see.htm
Awesome link #1. A bit more detailed than that at 100fps.com.
Awesome link #2. A YouTube video showing the difference between motion blur and regular rendering in GTA IV. Everybody should understand this one :P
-
Originally posted by droidhacker View Post
I think that there are two things that it comes down to:
1) A game's framerate is VARIABLE. What you need is a WORST CASE framerate that is at LEAST some minimum.
2) (more importantly) point of sampling. If your game is running at 500 fps minimum, then it can sample frames at very precise times, resulting in fluid motion. I.e., it obviously isn't going to display 500 fps, since that is beyond the capabilities of your display device, so it is going to pick SOME frames, spaced equally in time, and display those.
If your game is running at 30 fps and your screen at 60 fps, then you could get, worst case, THREE identical frames back-to-back followed by the next frame being displayed only once (when it should be twice). I.e., things are variable: you expect two identical frames, followed by two identical frames, followed by two identical frames, etc. If it is a little bit late preparing the third frame, then it gives THREE identical frames, then the FOURTH frame. It is a little difficult to explain without a picture... but this can lead to choppy-looking output.
Next, also consider that just because your game is running at "30 fps", it really doesn't mean that it is *actually* running at 30 fps *at this instant*. What it does to calculate the framerate is count how many frames HAVE BEEN generated over some amount of time (which is obviously greater than zero) and divide by that time. So you're looking at the AVERAGE framerate within a particular window of time. If the window happens to be 10 seconds, then it *could* have rendered 500 fps for a few seconds, followed by 0 fps for a few seconds, and it averaged out to 30 fps. This is an extreme example, of course, but it illustrates one of the issues with framerate, since this example will obviously be really ugly to watch.
In contrast, for filming MOVIES, you basically have an INFINITE framerate (i.e. real world fluidity) that you sample from at a fixed rate. There is no question about when the next frame will happen to finish being rendered -- it is rendered and fully up to date at whatever moment you decide to sample it. And in computer generated videos, you can render at any rate you like, as long as the motion between frames makes sense for the time between them. In other words, there is no constraint that rendering has to be done in real time as it is in a game.
@2 Ok, so you pick equally spaced pictures out of a large number. That still seems a waste, as you could also use the spare cycles to calculate a picture in the middle of the 1/30th second interval and apply a good blurring algorithm to it, using the previous, current and next picture, or maybe, if you calculate 60 frames per second, some other intelligent interpolation (blurring) algorithm. I know that Pixar does something like this in their computer-animated movies.
Anyway, blurring or interpolation would fall in the category of "using the spare cycles to improve picture quality". Still, you would only display 30 frames per second, which seems to be enough even for Avatar (although that is 30 frames per eye).
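droidhacker's worst case (three identical refreshes followed by a frame that only gets one) is easy to reproduce on paper. The following sketch is my own illustration, not code from any real compositor or driver: it steps through a 60 Hz display, always showing the newest completed frame, first with evenly spaced ~33 ms frames and then with a single frame that finishes a few milliseconds late.

#include <stdio.h>

/* Count how many 60 Hz refreshes each game frame stays on screen, given the
   time (in ms) at which each frame finished rendering. At every refresh the
   display simply shows the newest frame that is already complete. */
static void show_pacing(const char *label, const double *done, int frames)
{
    const double refresh = 1000.0 / 60.0;     /* ~16.7 ms per refresh */
    int counts[8] = { 0 };
    int current = -1;

    for (int r = 0; r < 2 * frames; r++) {    /* enough refreshes for 30 fps content */
        double now = r * refresh;
        while (current + 1 < frames && done[current + 1] <= now)
            current++;                        /* newest completed frame wins */
        if (current >= 0)
            counts[current]++;
    }

    printf("%-16s", label);
    for (int f = 0; f < frames; f++)
        printf(" f%d=%d", f, counts[f]);
    printf("  (refreshes on screen)\n");
}

int main(void)
{
    /* 30 fps, evenly paced: every frame ends up on screen for 2 refreshes. */
    double even[8] = { 0, 33, 66, 99, 132, 165, 198, 231 };
    /* Same average rate, but frame 2 arrives ~6 ms late: frame 1 is shown for
       3 refreshes and frame 2 for only 1 -- the visible hiccup. */
    double late[8] = { 0, 33, 72, 99, 132, 165, 198, 231 };

    show_pacing("evenly paced:", even, 8);
    show_pacing("one late frame:", late, 8);
    return 0;
}

The evenly paced run shows every frame for exactly two refreshes; with one late frame, one frame stays up for three refreshes and the next for only one, which is the visible stutter even though the average framerate is still 30 fps.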