A question to Michael or whoever else cares to provide insight:
As a non-gamer, I wonder why the framerate in games is so important. The human eye perceives a sequence of images as fluid motion at anything above roughly 16 to 24 frames per second. So half the refresh rate of a typical TFT screen (usually 60 Hz), i.e. 30 Hz, should be more than enough. Or at most 60 Hz when doing stereoscopic 3D, i.e. 30 Hz per eye.
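To put my assumption into concrete numbers, here is the rough frame-time arithmetic I have in mind (just my own back-of-the-envelope sketch, not taken from any spec):

```c
/* Back-of-the-envelope frame budgets for the rates mentioned above. */
#include <stdio.h>

int main(void)
{
    const double rates_hz[] = { 16.0, 24.0, 30.0, 60.0 };
    const int n = sizeof rates_hz / sizeof rates_hz[0];

    for (int i = 0; i < n; ++i) {
        /* Time budget per frame in milliseconds: 1000 ms / rate. */
        printf("%5.1f Hz -> %6.2f ms per frame\n",
               rates_hz[i], 1000.0 / rates_hz[i]);
    }
    /* A 60 Hz panel split between two eyes for stereoscopic 3D
     * leaves 30 Hz per eye, i.e. ~33 ms per eye per frame. */
    return 0;
}
```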
Why is it not always done that way?
Why not spend the CPU and GPU cycles on improving picture quality instead of pushing more frames per second?
Also, is there a way to fix the framerate in games at a certain value (say 30 Hz) and benchmark based on CPU and GPU load (lower is better) instead? Something along the lines of the sketch below.
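By "fixing the framerate" I mean roughly the following kind of frame limiter. This is only my own illustrative sketch (run_capped_loop, render_frame and the POSIX timing calls are my choices, not how any particular engine does it): render a frame, then sleep away whatever is left of the ~33 ms budget, so the interesting benchmark number becomes how much of that budget the CPU/GPU actually needed.

```c
/* Minimal 30 Hz frame limiter sketch (POSIX timing; hypothetical
 * render_frame() stands in for whatever the game actually draws). */
#define _POSIX_C_SOURCE 199309L
#include <time.h>

#define TARGET_FPS 30
#define FRAME_NS   (1000000000L / TARGET_FPS)   /* ~33.3 ms budget */

static long long elapsed_ns(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1000000000LL + (b.tv_nsec - a.tv_nsec);
}

void run_capped_loop(void (*render_frame)(void), int num_frames)
{
    struct timespec start, end;

    for (int i = 0; i < num_frames; ++i) {
        clock_gettime(CLOCK_MONOTONIC, &start);
        render_frame();                       /* the real per-frame work */
        clock_gettime(CLOCK_MONOTONIC, &end);

        long long spent = elapsed_ns(start, end);
        if (spent < FRAME_NS) {
            /* Idle for the rest of the 33 ms slot; the load during the
             * frame, not the framerate, becomes the benchmark metric. */
            struct timespec pause = { 0, (long)(FRAME_NS - spent) };
            nanosleep(&pause, NULL);
        }
    }
}
```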
I'm really puzzled by this...