Windows 8 vs. Linux Graphics, Source Benchmarks Coming
It's generally a lot more effective when people provide test requests / feedback / suggestions before a given article is written, rather than afterwards, when nothing can change until the next time such tests are done...
Something for the Haswell graphics test: a couple of people pointed this out in the comments last week. In the frame-latency results, people care more about the maximum than the average, because the maximum is what causes jitter:
To Michael: I think you're missing the purpose of latency benchmarking. It's not only about getting an average reading; what matters most is how many frame spikes occur, how frequent they are, and how severe, because that's the major cause of "perceived lagging" (beyond networking). A slightly lower frame average with much more stable frame latency is usually preferable to an inflated FPS average with lots of spikes (like we see in this review; Windows has it worse).
This is also why I prefer to play LoL on Linux even though the average FPS on Windows is way higher... I have frame latency problems even with VSync on and an apparently rock-solid 60 FPS.
In the OpenArena latency graph, Linux actually fared much better than Windows, but that wasn't really obvious. It might be better to report a simple metric that condenses the data to a single value representing "how many, how frequent and the severity of the frame spikes". I'm not sure what is best; what are other sites doing? The simplest way might be to report the variance of the per-frame latency (0 is better)? Or sum the latencies of all frames with a weight that penalises higher values, e.g. "sum(latency**2)". I'm sure others have some ideas.
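A minimal sketch of both condensed metrics suggested above, on hypothetical frame-time data (the data and function name are illustrative, not from any actual benchmark run):

```python
import statistics

def spike_metrics(frame_times_ms):
    """Condense per-frame latencies (ms) into spike-sensitive numbers."""
    variance = statistics.pvariance(frame_times_ms)  # 0 means perfectly steady pacing
    penalty = sum(t ** 2 for t in frame_times_ms)    # sum(latency**2): spikes dominate
    worst = max(frame_times_ms)
    return variance, penalty, worst

# Two runs with the same 20 ms average (~50 fps) but very different stability:
steady = [20.0] * 10
spiky = [15.0] * 9 + [65.0]  # one big spike, same mean
print(spike_metrics(steady))
print(spike_metrics(spiky))
```

Both runs report identical average FPS, but the variance and squared-latency penalty immediately expose the spiky run, which is the whole point of the metric.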
+1, it would be better if the benchmarks were done on Xubuntu with compositing disabled.
To say that a benchmark is "better" you have to first answer the question: "What is the point of the benchmark?" If the point is to measure the performance of Ubuntu+XFCE, then using Xubuntu is a good idea. Perhaps there are more interesting questions that could be asked.
It's completely possible to measure this for all OpenGL apps, just like the id engines do for their own frames. I've been sitting on an 84-line library that does just that; maybe I should publish it on GitHub or something. Linux-only, though.
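The core idea of per-frame latency capture can be sketched generically: time everything between successive frame boundaries (in a real GL hook this would be the buffer-swap call; here a plain callable stands in for one frame of work, and all names are hypothetical):

```python
import time

def record_frame_times(render_frame, n_frames):
    """Time each call to render_frame (a stand-in for one frame of work,
    e.g. everything up to and including the buffer swap), in milliseconds."""
    times = []
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        times.append((time.perf_counter() - start) * 1000.0)
    return times

# Hypothetical workload standing in for an actual rendered frame:
latencies = record_frame_times(lambda: time.sleep(0.002), 5)
print(latencies)
```

The actual library mentioned above would presumably hook the swap call from outside the application; this sketch only illustrates the measurement itself.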
The other thing: please also do the other common representation of the latency graphs, a sorted graph with marker lines at the 50%, 90%, and 95% thresholds.
Chrisb, the other sites do that sorted graph. It lets you easily see that half of the frames finished in under X ms, and likewise for the 90% and 95% marks. It's a better way than a single max number.
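Those marker lines are just nearest-rank percentiles over the sorted frame times. A small sketch on made-up data (function name and numbers are illustrative):

```python
import math

def percentile_thresholds(frame_times_ms, marks=(0.50, 0.90, 0.95)):
    """Nearest-rank percentiles over sorted frame latencies: 'X% of frames
    finished in at most this many ms', i.e. the marker lines on a sorted graph."""
    ordered = sorted(frame_times_ms)
    n = len(ordered)
    return {int(p * 100): ordered[max(0, math.ceil(p * n) - 1)] for p in marks}

# Mostly 60 fps frames with a handful of spikes:
times = [16.7] * 89 + [33.3] * 9 + [66.7] * 2
print(percentile_thresholds(times))  # {50: 16.7, 90: 33.3, 95: 33.3}
```

A run with a high average FPS but bad spikes shows up immediately in the 95% value, which is exactly what a single average hides.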