We can of course discuss the settings and try to achieve the lowest jitter.
First of all, build a kernel with "low-latency desktop" preemption enabled. You can edit Kconfig.hz and change 100 to 90 (a simple search/replace).
You can add idle=poll and tick_skew=1 to the boot options, and compile a slimmed-down local kernel by doing "make localmodconfig" from a full distro kernel. (Also remember to turn off high_res_timers if enabled, because then the effective tick rate will be 4016 Hz, or at least high.)
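The kernel steps above could be sketched roughly like this. The sed line is my crude rendering of the suggested search/replace, and paths assume you are in a kernel source tree; the config symbols are the standard menuconfig names, not something given in this thread:

```shell
# Sketch only -- run from the top of a kernel source tree.

# Lower the timer frequency choice from 100 Hz to 90 Hz (the suggested
# search/replace in Kconfig.hz; review the diff before building):
sed -i 's/\b100\b/90/g' kernel/Kconfig.hz

# Start from the running distro kernel's config, keeping only the
# modules currently loaded on this machine:
make localmodconfig

# Then in "make menuconfig", select
#   Preemption Model -> Preemptible Kernel (Low-Latency Desktop)
# and disable CONFIG_HIGH_RES_TIMERS, before "make" as usual.

# Boot options (append to the kernel command line in your bootloader):
#   idle=poll tick_skew=1
```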
CFS granularity should be set to 1158500 ns. (The current default is a bit high, and may give jitter.)
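Assuming "CFS granularity" here means the scheduler's minimum granularity knob (my assumption; the post does not name the exact tunable), it can be set at runtime on kernels of this era like so:

```shell
# Assumption: "CFS granularity" = kernel.sched_min_granularity_ns.
# Check the current value, then set it to 1158500 ns (requires root):
cat /proc/sys/kernel/sched_min_granularity_ns
sysctl -w kernel.sched_min_granularity_ns=1158500

# To make it persistent across reboots, add to /etc/sysctl.conf:
#   kernel.sched_min_granularity_ns = 1158500
```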
Renice X to -20, to prevent X from being a bottleneck.
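A minimal sketch of that step, assuming the X server process is named Xorg on your system (it may be X or something else depending on the distro):

```shell
# Give the X server the highest scheduling priority (nice -20).
# Requires root; "Xorg" is an assumption about the process name.
renice -n -20 -p "$(pidof Xorg)"
```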
Remove 60hz limit in doom 3, to avoid timing jitter. http://paradoxuncreated.com/tmp/AutoExec.cfg
Some NVIDIA-specific stuff on my blog as well: http://paradoxuncreated.com/Blog/wordpress/?p=2268
A lot of people think SLAB has lower jitter than SLUB as well, and some tuning can be done with regard to which kernel options execute the least code and cause the least background activity.
Rather than listen to someone moan and refuse to face reality, I'd much rather see people understand this and contribute to an improvement. And I very much appreciate the initiative of the thread. However, to really see Doom 3 jitter, it must be run at normal speed. So just play through a level and record those values instead; that would be more informative.
Peace Be With You.
Last edited by Paradox Uncreated; 11-06-2012 at 10:01 AM.
Do you have an explanation for why a lower Hz setting improves jitter? I thought the general wisdom was that a higher setting would improve timing precision.
Originally Posted by Paradox Uncreated
Fewer interrupts triggered = fewer interruptions = less jitter. Depending on how things are done, if fewer interrupts mean larger data bursts, there may be a sweet spot. I have found 90 Hz to give the least jitter, to within 5 Hz accuracy.
Well, maybe you should reboot your system, try 2 or 3 passes of the demo, and compare. Be prepared for your thesis to be wrong.
It would be interesting to test rational multiples (e.g. 2, 3, 5, 3/2, 4/3) of the monitor refresh rate for Hz (e.g. 60, 120, 180, 300, 90, 80).
Originally Posted by Paradox Uncreated
Also, if Phoronix Test Suite adds frame time graphs, it would be helpful to show the X axis as time for demos that run at a constant speed (i.e. not like Quake 3 timedemos). That way it's easier to tell if a long frame always happens in the same place in a demo.
Reboot? LOL. My "thesis"? Both the use of that word, and ignoring the fact that a clear difference can be observed simply by doing what I said above.
Originally Posted by Kano
Like Guano could refute anything of a thesis. LOL
Guano continues his completely mindless behaviour. How much of his nonsense must be refuted before he realizes he is wrong? Of course, it could be that he is doing this because he thinks he is "cool".
I run Doom 3 completely without jitter. When running the demo with playdemo, it jitters in similar places, and the framerate is reduced.
A warning to all is the only thing Guano is.
Now, instead of arguing the obvious with Guano, I am going to reply to people who actually want to discuss the problem, provide fixes, inspiration for solutions, etc.
Leave the cottaging guano-lover to himself, where he can run his benchmarks with suboptimal settings and get the performance of the ultimate low end.
Of course, it could be possible to do a vsynced HZ, cleverly arranging things so that on each vsync the buffer is delivered and the next frame calculated. One for the kernel engineers. (Who have time.)
Please keep it on-topic. The subject is about the TechReport article and how this could be applied to Linux systems testing.
What would then be measured on the y-axis?
Originally Posted by unix_epoch
I believe that Doom 3 timedemos are frame-for-frame identical across machines, because delays always happen in the same frame. Moreover, timedemos always have the same frame length.