I can easily detect the difference between 60 and 85 frames per second, and that's on a normal 24" Acer @ 60 Hz! Above 80 fps it becomes impossible for me to sense. However, once some smoke or other irregularity enters the game, the 200 fps setup backs off a lot more than the 80 fps one. For example, Urban Terror will slow down A LOT, from 80 fps down to 20, on an HD 4770 with the open-source driver when bullets are fired, people get fragged, more than 5 players are on screen, smoke is thrown, or the map is big. This significantly reduces your chances to show any skill. My currently leased GTX 260 SP216 draws 125 fps (the game's limit) regardless of the amount of detail, or 60 when forced to idle clocks. But between 200 fps and 300 fps, the difference is just wasted energy; the card should actually downclock and maintain ~100 fps (120 with vsync).
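The "downclock and maintain ~100 fps" idea above is basically a frame limiter: instead of rendering 200-300 uncapped frames and burning power, the loop sleeps off the leftover frame budget. A minimal sketch (the `target_fps` value and the no-op "render" step are illustrative assumptions, not any game's actual code):

```python
import time

def run_frame_loop(target_fps=100, frames=50):
    """Cap a render loop at target_fps by sleeping away the unused
    part of each frame's time budget, rather than spinning flat out."""
    frame_budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        # ... render the frame here (no-op in this sketch) ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
    total = time.perf_counter() - start
    return frames / total  # effective fps, should sit near the cap

print(run_frame_loop())
```

With vsync the swap itself blocks until the display refresh, which gives the same capping effect (120 fps on a 120 Hz panel) without an explicit sleep.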
This guy has severe problems with licensing. He advocates Linux and tons of free software, but his own creation is one of the worst proprietary programs I have ever seen (phoning home and tracking included). The demo reaches 20 fps (30 fps without compositing) in KWin on my card (GTX 260 SP216, 1.7 GB), which earns his software another medal for worst optimization in history. He should really talk to the FSF; maybe they could find him a much better development model, both open-sourcing the code and getting his bugs fixed. Right now the engine looks like free pizza in a proprietary iron box.
Originally posted by Geri