Originally posted by kraftman
@minuseins - Whilst I'd be all too happy to game at 10ms latency across pretty much all my equipment, when faced with opponents who suggest the human eye can't see beyond 24fps, I think actually showing that the differences between 100 and 1000 fps are measurable would be beneficial. Today we generally sit with game engines that have 33-50ms tick rates and monitors with a delay of around the same, and getting anything faster is significantly harder than it was 10 years ago. The industry is going in the wrong direction because of the "good enough" myths, and as someone who'd like to see things get faster, I don't think there's any issue with setting the ideal targets fairly high, as long as it gets us moving in the right direction.
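To put rough numbers on that, here's a quick back-of-the-envelope sketch (the 33ms tick and 33ms display delay are the assumed figures from above, not measurements): a 100fps frame takes 10ms, a 1000fps frame takes 1ms, and both sit on top of the tick and display delays.

```python
# Back-of-the-envelope frame-time and latency arithmetic.
# All figures below are illustrative assumptions, not measurements.

def frame_time_ms(fps: float) -> float:
    """Time per rendered frame in milliseconds."""
    return 1000.0 / fps

for fps in (24, 60, 100, 1000):
    print(f"{fps:>4} fps -> {frame_time_ms(fps):6.2f} ms per frame")

# Rough motion-to-photon total with an assumed 33 ms server tick
# and an assumed 33 ms of display processing delay:
server_tick_ms = 33.0
display_ms = 33.0
for fps in (100, 1000):
    total = server_tick_ms + display_ms + frame_time_ms(fps)
    print(f"approx. total at {fps} fps: {total:.1f} ms")
```

Even with those generous assumptions the frame time is only one slice of the chain, which is exactly why the tick rate and display delay need to come down alongside it.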
With regards to latency, modern games generally use very little network traffic; however, you are right that there is quite a difference between 33ms tick rates and 1ms tick rates - you'd be talking about over 30 times as much traffic in a 32-man server (see the rough sketch below). I'm not sure off the top of my head whether that'd be completely unfeasible, but it would exclude many people from online gaming; as a competitive LAN player, though, the option would be nice. On top of that, input lag is additional to network latency (and server tick rates), since you're responding to the data you've been given. As you previously said, you're already behind, so limiting any further input lag is still beneficial and shouldn't be ignored. Alas, the competitive gamer doesn't get a whole lot of say in the matter, so at best I can cross my fingers and hope they don't artificially limit those OLEDs (if they ever release computer monitors) to stupidly low refresh rates (i.e. 60Hz, which freaking sucks).
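For the traffic point, here's a rough sketch of how per-client bandwidth scales with tick rate (the 40-byte update size and the everyone-sees-everyone broadcast model are made-up assumptions, just to show the scaling):

```python
# Rough per-client bandwidth sketch for a 32-player server.
# bytes_per_update and the broadcast model are assumptions for illustration.

def per_client_downstream_kbit(tick_hz: float, players: int,
                               bytes_per_update: int = 40) -> float:
    """Assume the server sends each client one update per other player per tick."""
    bytes_per_sec = tick_hz * (players - 1) * bytes_per_update
    return bytes_per_sec * 8 / 1000.0

for tick_hz in (30, 1000):  # ~33 ms tick vs 1 ms tick
    kbit = per_client_downstream_kbit(tick_hz, 32)
    print(f"{tick_hz:>4} Hz tick, 32 players: ~{kbit:,.0f} kbit/s per client")
```

Whatever the real payload size turns out to be, the ratio between the two tick rates stays around 33x, so the feasibility question is mostly about absolute payload sizes and upstream capacity, not the scaling itself.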