The proposals for reducing CPU usage later in the thread all amounted to delaying or coalescing input events in some form. It's an unavoidable tradeoff between latency and CPU usage: either plugging in a 1000 Hz mouse causes the input code path to run 8x as often as with a standard 125 Hz mouse, or it doesn't.
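A minimal sketch of what that coalescing looks like on the application side (hypothetical names, just to illustrate the idea): accumulate relative-motion deltas as events arrive, and consume them once per frame instead of once per hardware report.

```c
#include <stdbool.h>

/* Hypothetical coalescing layer: instead of reacting to every motion
 * event, sum the deltas and hand them to the consumer once per frame. */
struct motion_accum {
    int  dx, dy;   /* summed relative motion since the last flush */
    bool pending;  /* true if at least one event arrived          */
};

/* Called from the input path, potentially 1000 times per second. */
static void accum_motion(struct motion_accum *a, int dx, int dy)
{
    a->dx += dx;
    a->dy += dy;
    a->pending = true;
}

/* Called once per frame (e.g. 60-144 times per second). Returns true
 * if there was motion to consume since the previous flush. */
static bool flush_motion(struct motion_accum *a, int *dx, int *dy)
{
    if (!a->pending)
        return false;
    *dx = a->dx;
    *dy = a->dy;
    a->dx = a->dy = 0;
    a->pending = false;
    return true;
}
```

The tradeoff shows up immediately: the consumer now sees at most one update per frame, so the worst-case added latency is one frame interval, which is exactly the delay the coalescing was supposed to be trading CPU time against.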
And most of the cost is often on the application side. Firefox, which is especially bad here, will burn half a CPU core on my machine if you wave a 1000 Hz mouse over it. You can and should argue that Firefox is poorly programmed in that respect, but every application would have to have its input stack fixed independently. And even if you could get the overhead down to the lower bound (which I think is 6 context switches per event, maybe 4 with io_uring), CPU wakeups have an irreducible energy cost: every transition charges up gate capacitances with energy you can't get back.
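One application-side mitigation (not what Firefox does, just an illustration of cutting per-event syscall overhead) is to drain everything the kernel has queued in a single read. A read() on a Linux evdev device returns as many complete input_event structs as fit in the buffer, so a burst of reports costs one wakeup instead of one per event:

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    /* Example path; pick the right eventN for your mouse, and note
     * that reading it usually requires root or the input group. */
    int fd = open("/dev/input/event0", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct input_event ev[64];
    for (;;) {
        /* One read() drains up to 64 queued events: one syscall and
         * one wakeup for a whole burst, rather than one per event. */
        ssize_t n = read(fd, ev, sizeof ev);
        if (n < (ssize_t)sizeof ev[0])
            break;
        for (size_t i = 0; i < (size_t)n / sizeof ev[0]; i++) {
            if (ev[i].type == EV_REL)
                printf("rel axis %u: %d\n",
                       (unsigned)ev[i].code, ev[i].value);
        }
    }
    close(fd);
    return 0;
}
```

Batching like this only helps when events actually queue up between wakeups, though; it does nothing about the irreducible per-wakeup cost mentioned above.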
Most good 1000 Hz mice have a programmable button that can be assigned to switch the sampling rate, or even a physical switch. I found that 500 Hz was good enough for me most of the time: 1 ms average input lag and position jitter instead of 0.5 ms, at half the CPU cost.
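The arithmetic behind those numbers: at a polling rate f, an event waits on average half the polling interval before the next report, so the average added latency is 1/(2f). A quick check:

```c
#include <stdio.h>

int main(void)
{
    /* Average added latency at polling rate f is half the interval:
     * 1/(2f). At 500 Hz that's 1 ms; at 1000 Hz it's 0.5 ms. */
    const double rates_hz[] = { 125.0, 500.0, 1000.0 };
    for (int i = 0; i < 3; i++) {
        double f = rates_hz[i];
        printf("%6.0f Hz: interval %.1f ms, avg added latency %.2f ms\n",
               f, 1000.0 / f, 1000.0 / (2.0 * f));
    }
    return 0;
}
```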