Also: note that there could be anomalies. I would be happy to play a game that sticks around 50fps consistently and hits a couple of frames at 3fps at some point. But who would endure a game you can play at 50fps with a stdev of 35? That game would consistently sit in the vicinity of 50-35=15 FPS.
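A quick sketch of that distinction, with made-up numbers: two traces with roughly the same average, where a couple of brief dips barely move the stdev but constant swinging gives exactly the stdev-of-35 case above.

```python
import statistics

# Hypothetical traces for illustration only.
steady = [50.0] * 98 + [3.0, 3.0]   # solid 50 fps with two brief 3 fps dips
jittery = [15.0, 85.0] * 50         # constantly swinging between 15 and 85 fps

for name, trace in (("steady", steady), ("jittery", jittery)):
    print(name,
          "mean:", round(statistics.mean(trace), 1),
          "stdev:", round(statistics.pstdev(trace), 1))
```

The steady trace averages about 49 fps with a stdev under 7; the jittery one averages 50 fps with a stdev of exactly 35, even though it never drops anywhere near 3 fps.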
Michael, I think this can be worked around by dumping a constant FPS graph!
Originally Posted by mendieta
This could work for every test that supports Min/Max.
Then, one could definitely see the anomalies.
Also, another approach to improving the current method could be to provide extra info in the form of [minfps/percentage_of_time]...[average_fps]...[maxfps/percentage_of_time].
So the graph would read as that single line, which would give a much more in-depth view of the framerate distribution without bulky FPS graphs; although an FPS graph would also be an awesome thing.
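As a rough sketch of that one-line summary (the function name, the `band` parameter, and the input data are my own assumptions, not anything PTS does today):

```python
def fps_summary(samples, band=1.0):
    """Summarise FPS samples as [min/pct]...[avg]...[max/pct],
    where pct is the share of samples within `band` fps of the
    min or max respectively."""
    lo, hi = min(samples), max(samples)
    avg = sum(samples) / len(samples)
    near_lo = sum(1 for s in samples if s <= lo + band) / len(samples)
    near_hi = sum(1 for s in samples if s >= hi - band) / len(samples)
    return f"[{lo:.0f}/{near_lo:.0%}]...[{avg:.0f}]...[{hi:.0f}/{near_hi:.0%}]"

# A mostly-50fps run with two 3fps frames:
print(fps_summary([50] * 98 + [3, 3]))  # → [3/2%]...[49]...[50/98%]
```

At a glance that tells you the 3fps minimum only happened 2% of the time, which is exactly the anomaly-vs-consistency distinction a bare min/avg/max hides.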
Most of the tests don't support constant FPS dumping by the upstream game/demo, though PTS already supports showing per-frame FPS, e.g. the demo in vdrift-FPS-monitor.
Originally Posted by brosis
Yeah, I was also thinking about a box-and-whisker plot, but with the line being the average, the box being the average's deviation between test runs, and the whiskers being the FPS range or the standard deviation.
Originally Posted by _ONH_
Using the current little white overlay that shows the deviation to show the min/max instead would be a smaller change from the current graphs. Then you'd just have to figure out a new way of showing the deviation, perhaps by adding dotted lines. Or you could extend the current bar graph with dotted lines to show the min/max, I guess.
Obviously, some kind of line graph showing min/avg/max over time conveys the most information. Line graphs are also a little harder to just glance at and require more thought to read, especially if you plot more than a couple of options at once, as you sometimes do for Phoronix. The box-and-whisker type is in between: most people are less familiar with it, so it takes longer to read, but it can at least show the same number of options that your current bar graphs do.
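For what it's worth, the five numbers a box-and-whisker plot draws (min, lower quartile, median, upper quartile, max; the variant suggested above swaps the median for the average) can be computed directly from the samples. A minimal stdlib-only sketch, function name my own:

```python
import statistics

def five_number(samples):
    """Return (min, Q1, median, Q3, max) for a box-and-whisker plot."""
    q1, med, q3 = statistics.quantiles(samples, n=4)
    return min(samples), q1, med, q3, max(samples)

# The 15/85 fps swinging trace from earlier in the thread:
print(five_number([15.0, 85.0] * 50))  # → (15.0, 15.0, 50.0, 85.0, 85.0)
```

Here the box spanning the full 15-85 range makes the instability obvious in a way a single 50fps bar never could.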
The question is: why don't you do what the Windows game reviewers have been doing for a few years now in their completely closed-source world? Nobody cares what the max FPS of any game is; frame latency, average FPS*, and minimum FPS* are what actually matter.
Originally Posted by Michael
* You have to round out the very high and low anomalies as well; sometimes the game's loading registers FPS far beyond the capability of the GPU for a second or two, and sometimes it spits out 0 when the game is running at well over 60 FPS.
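One common way to round out those anomalies is to sort the samples and drop a small fraction from each tail before computing min/avg/max. A sketch, assuming raw per-frame FPS samples (function name and trim fraction are my own choices):

```python
def trimmed_min_avg_max(samples, trim=0.01):
    """min/avg/max after dropping the top and bottom `trim`
    fraction of samples (loading spikes, spurious zeros)."""
    s = sorted(samples)
    cut = int(len(s) * trim)            # e.g. 1% from each end
    if cut:
        s = s[cut:len(s) - cut]
    return s[0], sum(s) / len(s), s[-1]

# A run at a steady 60 fps with one loading spike and one spurious zero:
data = [0.0] + [60.0] * 198 + [900.0]
print(trimmed_min_avg_max(data))  # → (60.0, 60.0, 60.0)
```

Without the trim, the same run would report a minimum of 0 and a maximum of 900, both of which are measurement artifacts rather than anything the player saw.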