Originally posted by zboszor
Originally posted by skeevy420
If you go from 1 second to 1ms as an improvement is it 0.001% or 1000x better? (You can do it in 1000th of the time)
With units of time, a larger value means slower, so the scale is flipped compared to a throughput figure: 1 minute is faster than 60 minutes, and 1 microsecond is faster than 1,000 microseconds. The only thing that changes is where the unit boundary sits: 1 minute is 60 seconds, but 1,000 microseconds is 1ms.
In that sense, going from 150 to 50 is the same change as going from 50 to 150 on the inverted scale. What matters is the context (the sketch after this list works through the arithmetic):
- Latency reduced by ~66.7%, from 150ms to 50ms.
- Latency reduced to just 1/3rd (~33%) of the original, from 150ms to 50ms.
- Performance (work done per unit time) improved 3x (from ~6.7 to 20 FPS, if those are frame times).
- Performance improved by 200% (the baseline counts as 100%, so a 100% improvement is 2x and a 200% improvement is 3x).
- Latency is down to 50ms, a 300% improvement over the previous 150ms! (Technically an inaccurate percentage, but it reads better and is understood just the same as a 3x multiplier given the context.)
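For anyone who wants to sanity-check the numbers, here is a minimal Python sketch of the same arithmetic. The 150ms/50ms values are just the example from above; nothing else is assumed.

Code:
# Same 150ms -> 50ms example, expressed all four ways.
old_ms, new_ms = 150.0, 50.0

reduction_pct = (old_ms - new_ms) / old_ms * 100   # latency reduced by ~66.7%
remaining_pct = new_ms / old_ms * 100              # latency down to ~33.3% of the original
speedup = old_ms / new_ms                          # 3x faster (e.g. ~6.7 FPS -> 20 FPS)
improvement_pct = (speedup - 1) * 100              # +200%, since the baseline is 100%

print(f"reduced by {reduction_pct:.1f}%, down to {remaining_pct:.1f}%, "
      f"{speedup:.0f}x faster, +{improvement_pct:.0f}%")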