That'll teach me to read the article rather than scanning the results.
Originally Posted by Michael
Michael, excuse me please if it was already answered, but why did you use tests with different settings?
What I mean is: virtually none of the results intersect with the earlier 25-way open-source comparison, even though both were run on near-identical hardware.
The closest test you ran was Xonotic, but the open-source drivers were tested with "high" as the topmost setting, whereas the closed-source tests start from "ultra".
The only thing I noticed is that the open-source driver has a much more concentrated fps distribution, whereas Catalyst jumps everywhere between 40 and 500 fps, ending up with a below-middle average (for example 140 fps). You can see this by comparing the Xonotic "high" vs "ultra" sections of the articles.
I feel that things like the _minimum frame rate_, i.e. frame-rate-over-time graph analysis, are also important. Average fps doesn't show things like microstutter or other problems encountered with GPUs.
For instance, on the PC I get a solid 60 fps in RAGE, but the game stutters like a bitch because it caches from the hard drive, even in the 64-bit version of the game (and I have 8GB of memory), yet people with an SSD don't have this problem. I remember the Fallout games having a similar issue with anti-aliasing enabled as well. I therefore think it's important to include a frame-rate graph showing the minimum, maximum, and median frame rates.
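To illustrate the point about averages hiding stutter: here's a minimal Python sketch using only the stdlib, with made-up frame times (mostly smooth 16.7 ms frames plus a few 100 ms spikes, standing in for the disk-cache hitches described above). The numbers are purely illustrative, not from any real benchmark.

```python
import statistics

# Hypothetical frame times in milliseconds: 57 smooth ~60fps frames
# plus 3 stutter spikes of 100 ms each (e.g. disk cache misses).
frame_times = [16.7] * 57 + [100.0] * 3

# Average fps over the whole run: total frames / total seconds.
avg_fps = 1000 * len(frame_times) / sum(frame_times)

# Median fps: converts the middle frame time, ignoring the spikes.
median_fps = 1000 / statistics.median(frame_times)

# Minimum fps: the worst single frame -- this is what you feel.
min_fps = 1000 / max(frame_times)

print(f"average fps: {avg_fps:.1f}")  # still looks respectable
print(f"median fps:  {median_fps:.1f}")
print(f"min fps:     {min_fps:.1f}")  # reveals the stutter
```

The average and median both look close to 60 fps territory, but the minimum drops to 10 fps -- exactly the kind of problem an average-only chart hides.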
box plots ftw!!!
Originally Posted by b15hop
Someone needs to write the PTS code, hint hint nudge nudge: http://phoronix.com/forums/showthrea...in-all-GL-apps
Otherwise the only frame times available are from engines that report them, i.e. OA/Quake derivatives.
It should be easy for anyone to do, considering PTS already supports generating line graphs from frame times (for engines that expose them), etc.
Originally Posted by curaga
In terms of box plots, PTS also supports generating them. IIRC, the reason I don't use them by default in the UI is that I couldn't make them look really nice / graphically pleasing.
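For what it's worth, the five numbers a box plot summarizes (min, Q1, median, Q3, max) are cheap to compute even without a plotting library. A sketch with Python's stdlib, using invented fps samples (one tight distribution vs one that jumps around, echoing the open-source vs Catalyst observation above -- the data is made up for illustration):

```python
import statistics

def five_number_summary(samples):
    """Return (min, Q1, median, Q3, max) -- the values a box plot draws."""
    q1, median, q3 = statistics.quantiles(samples, n=4)
    return min(samples), q1, median, q3, max(samples)

# Invented fps samples: a concentrated distribution vs a jumpy one.
tight = [58, 59, 60, 60, 61, 61, 62, 62, 63, 64]
jumpy = [40, 45, 50, 55, 60, 65, 150, 200, 300, 500]

print(five_number_summary(tight))
print(five_number_summary(jumpy))
```

The interquartile range (Q3 - Q1) is the height of the box: tiny for the concentrated run, huge for the jumpy one, which is exactly what makes box plots useful for spotting inconsistent frame rates that a single average number hides.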