There's a distinct reason for the 60 fps cap: most humans can only discern about a sixth of that rate. The UI "atom" has typically been 100 msec. Anything larger feels laggy; anything smaller feels snappy, but until recently the event loops on many machines couldn't deliver better than 100 msec resolution anyway. The same goes for framerate: you can't discern anything past 60 fps except as flicker, unless you've trained yourself, and even then the threshold is something like 80 fps.
Originally Posted by xav1r
With a CRT I could see flickering at 75 Hz; it disappeared at 85 Hz. But since TFTs are now common, 60 Hz is enough. When you want completely smooth vsync, you should only consider refresh rates that are integer multiples of the content framerate. That's what is problematic with 24/25 Hz material, for example, when shown at 60 Hz. That's why, for syncing PAL movies, 100 Hz was my favorite choice; to sync both 24 and 30 fps perfectly you would even need 120 Hz. That's for movies; back to games: when you update, say, every 2nd refresh at 30 fps, it looks smoother than updating every 1.x refreshes. Without vsync you get tearing lines due to differing content, and even with vsync some frames update after a 2-refresh delay and others after 1. With mostly static content that won't be visible, but on fast turns you see it, which is why a real gamer needs a framerate well over 60 fps in any case.
Last edited by Kano; 05-18-2009 at 04:36 PM.
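Kano's integer-divisor point can be checked with a little arithmetic. Here is a minimal sketch (the helper name `refreshesPerFrame` is made up for this example, not from any engine): with vsync, source frame i is on screen from t = i/fps until (i+1)/fps, so counting how many display refreshes fall in that window exposes the judder pattern.

```cpp
#include <vector>

// With vsync, source frame i is current from t = i/fps until (i+1)/fps.
// Count how many refreshes of an hz-Hz display land inside that window.
// Hypothetical helper, for illustration only.
std::vector<int> refreshesPerFrame(int fps, int hz, int frames) {
    std::vector<int> counts;
    for (int i = 0; i < frames; ++i) {
        int start = (i * hz) / fps;        // floor(i * hz / fps)
        int end   = ((i + 1) * hz) / fps;  // floor((i+1) * hz / fps)
        counts.push_back(end - start);
    }
    return counts;
}
```

24 fps on a 60 Hz display alternates between 2 and 3 refreshes per frame (the classic pulldown judder), while 25 fps on 100 Hz holds every frame for exactly 4 refreshes, which is why 100 Hz worked so well for PAL material.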
Originally Posted by Kano
The average LCD happens to have a 5ms latency on the monitor.
The good stuff has a 2ms latency.
5ms is serviceable for most games with a refresh rate of ~20Hz.
2ms is good for most gaming with a refresh rate of ~60Hz.
Kind of puts this whole discussion in perspective, doesn't it?
Unless you've got sub-millisecond refresh (only some fairly high-end LCDs and plasma displays currently do this, to the best of my knowledge...), throwing more FPS at the problem won't make the game play ANY better. If you're on a CRT, uncapping the framerate will help some, but only until you hit the monitor's refresh rate, and past that you're not getting anything better out of it either.
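For scale, the latency numbers quoted above can be set against the refresh period itself. A minimal sketch, with the simple additive model and the function names being assumptions of mine rather than any display spec:

```cpp
#include <cmath>

// One refresh cycle in milliseconds: the time budget a frame has anyway.
double framePeriodMs(double hz) { return 1000.0 / hz; }

// Crude end-to-end figure: panel response plus up to one refresh of
// waiting for the next scanout (a simplifying assumption, not a spec).
double worstCaseDelayMs(double panelMs, double hz) {
    return panelMs + framePeriodMs(hz);
}
```

At 60 Hz the refresh period alone is about 16.7 ms, so swapping a 5 ms panel for a 2 ms one moves the rough total only from ~21.7 ms to ~18.7 ms; once you've hit the monitor's refresh rate, extra FPS can't buy any of that time back.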
Maybe you misunderstood me. You can use 30 or 60 fps to make a turn smooth, but when you try 40 or 50 fps it will jump. 20 fps is nothing you would ever want to see, as the eye is faster than that. That the monitor has to be fast enough is clear.
Last edited by Svartalf; 05-18-2009 at 06:01 PM.
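The 30-versus-40 fps point is the same divisor arithmetic seen from the display side. A small sketch (helper names are mine): with vsync, frame i timestamped at i/fps seconds can first appear on refresh tick ceil(i*hz/fps), and the gaps between successive ticks show whether the pacing is even.

```cpp
#include <vector>

// First refresh tick (on an hz-Hz display, with vsync) at which source
// frame i, timestamped i/fps seconds, can be shown. Hypothetical helper.
int displayTick(int i, int fps, int hz) {
    return (i * hz + fps - 1) / fps;  // ceil(i * hz / fps) in integers
}

// Gaps between consecutive displayed frames, measured in refreshes.
std::vector<int> displayGaps(int fps, int hz, int frames) {
    std::vector<int> gaps;
    for (int i = 1; i <= frames; ++i)
        gaps.push_back(displayTick(i, fps, hz) - displayTick(i - 1, fps, hz));
    return gaps;
}
```

30 fps on 60 Hz gives a steady 2-2-2 cadence, while 40 fps gives 2-1-2-1: some frames stay on screen twice as long as others, which is exactly the jump visible during fast turns.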
60 fps is perfectly fine, and best on a 60 or 120 Hz display. A more flexible engine can help when the monitor has a different refresh rate, but I think 30 or 60 fps caps exist mainly for multiplayer matches: with older engines, players on faster systems had an advantage because they saw the enemy at the correct position sooner. That's why hardcore gamers only want high fps instead of high resolution and effects; you have to be trained, of course, to use the advantage. An untrained player could use the fastest system on earth and would still fail.
I meant, in fact, more like driving a C backend that calls into C++ classes, or whatever. Calling from static functions into objects is easier than trying to wrap objects into static calls. But I agree with you that templates are overdone, and OO often is too. One should also pay attention to virtual calls; one of the tricky bits in all my designs is to call virtual only when needed. Nothing beats carefully aligned structs accessed sequentially, though.
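The "C backend driving C++ classes" pattern can be sketched like this. The `Mixer` class and the `mixer_*` function names are invented for illustration; the point is just that opaque pointers cross the C boundary and the static C functions forward into the object.

```cpp
// Hypothetical C++ class the C backend drives. Virtual is used only
// where polymorphism is actually wanted, per the post's advice.
class Mixer {
public:
    virtual ~Mixer() = default;
    virtual int process(int sample) { return sample * gain_; }
private:
    int gain_ = 2;
};

// C-callable facade: static functions calling into the object is far
// simpler than trying to squeeze the whole object model into C.
extern "C" {
    void* mixer_create()                { return new Mixer(); }
    int   mixer_process(void* m, int s) { return static_cast<Mixer*>(m)->process(s); }
    void  mixer_destroy(void* m)        { delete static_cast<Mixer*>(m); }
}
```

The C side only ever sees `void*` handles and plain functions, so the C++ internals (virtual dispatch, templates, whatever) stay hidden behind a stable, unmangled ABI.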
I would stake the claim that a high-resolution mouse and training help more than higher framerates.
OK, I've been away from my own thread for quite a while and it's kind of off topic now... so here is my question again: what features will be available in id Tech 4 when it's open sourced?