Frame latency analysis on Doom 3


  • #31
    Originally posted by Paradox Uncreated View Post
    Of course it could be possible to do vsynced Hz, cleverly arranging things so that on each vsync a buffer is delivered and the next frame is calculated. One for the kernel engineers. (Who have time.)
    That assumes the CPU/GPU can keep up, which in modern games they typically can't. Doom 3 IS a decade old, after all; there's really no reason frame latency should be much above 10ms or so... [would be interesting to run a Windows comparison...]



    Frame latency is a better benchmarking tool than FPS, because FPS averages out the slow time periods, and minimum FPS can catch outliers while hiding the latency problem. "Microstutter" on multi-GPU configs, for instance, is QUITE noticeable, even as FPS reaches into the hundreds.
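
    A tiny illustration of that point (the frame times below are invented): a single long frame barely dents the FPS average, but is obvious in the per-frame latency numbers.

      # One 110 ms hitch hidden among 100 otherwise-perfect frames (made-up data).
      frame_times_ms = [10.0] * 99 + [110.0]

      avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
      slow = [t for t in frame_times_ms if t > 16.7]  # frames missing a 60 Hz vsync

      print(f"average FPS: {avg_fps:.1f}")        # ~90.9 -- looks perfectly smooth
      print(f"frames over 16.7 ms: {len(slow)}")  # 1 -- the visible stutter
      print(f"worst frame: {max(frame_times_ms):.1f} ms")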

    • #32
      Originally posted by gamerk2 View Post
      That assumes the CPU/GPU can keep up, which in modern games they typically can't. Doom 3 IS a decade old, after all; there's really no reason frame latency should be much above 10ms or so... [would be interesting to run a Windows comparison...]



      Frame latency is a better benchmarking tool than FPS, because FPS averages out the slow time periods, and minimum FPS can catch outliers while hiding the latency problem. "Microstutter" on multi-GPU configs, for instance, is QUITE noticeable, even as FPS reaches into the hundreds.
      G(l)aymer2k chimes in and shows a complete lack of understanding, and the incoherence of Guano. "That assumes..." He didn't understand jitter in the other thread either. It's ridiculous, it is a joke. Wherever these people work, avoid them like the plague.

      And then to go on to "frame latency" and "slow time periods". Never use these people as translators, to put it like that, because obviously it turns to shit in there.

      • #33
        Originally posted by Paradox Uncreated View Post
        G(l)aymer2k chimes in and shows a complete lack of understanding, and the incoherence of Guano. "That assumes..." He didn't understand jitter in the other thread either. It's ridiculous, it is a joke. Wherever these people work, avoid them like the plague.

        And then to go on to "frame latency" and "slow time periods". Never use these people as translators, to put it like that, because obviously it turns to shit in there.
        You continue to make the silly assumption that all forms of jitter are OS/kernel related. Games are more likely to suffer jitter due to H/W effects rather than S/W.

        • #34
          Originally posted by gamerk2 View Post
          You continue to make the silly assumption that all forms of jitter are OS/kernel related. Games are more likely to suffer jitter due to H/W effects rather than S/W.
          You are obviously nuts. I guess I am just going to have to get used to all the nutters on the internutz.

          • #35
            PS:

            I think I have sufficiently solved jitter now, though. Doom 3 jitter is even lower with renice (-20) + my listed tweaks. So I feel there is little to improve. It is gliding silky smooth now. No frame loss, and timing jitter is so low that I think it should be near impossible to see. So for my part, I don't need any numbers, and they would need to be more verbose than the option in Doom 3 anyway. But try it. You will see a big difference, and very enjoyable smooth frames. The trick with renice can also be used with a web browser, to have less jitter on YouTube videos, etc. There I also recommend Chromium, because it has the lowest jitter to begin with.
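
            For anyone who wants to script that renice step, a minimal sketch (assuming Linux and Python; the PID argument is hypothetical, and negative nice values need root):

              import os
              import sys

              # Equivalent of: sudo renice -20 -p <pid>
              # Negative nice values require root privileges on Linux.
              pid = int(sys.argv[1])  # PID of the game process
              os.setpriority(os.PRIO_PROCESS, pid, -20)
              print("new nice value:", os.getpriority(os.PRIO_PROCESS, pid))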

            Case solved!

            Peace Be With You.

            • #36
              Originally posted by thofke View Post
              What would then be measured in the y-range?
              The Y axis would still be frame duration. You can plot the same exact data by using the timestamp of each frame instead of the frame number as the X coordinate for each data point.

              I believe that Doom 3 timedemos are frame-for-frame identical across machines, because delays always happen in the same frame. Moreover, timedemos always have the same frame length.
              Timedemos wouldn't need any change, but there are graphs in the linked TechReport article that show vastly different frame counts. Looking carefully at page 2, you can see the same pattern of spikes in different places on all four Radeon GPUs:



              Using the timestamp instead of frame number for the X coordinate would make spikes caused by game content line up, while spikes caused by the process getting interrupted would not line up.
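
              A rough sketch of that plot (matplotlib, with invented frame times standing in for a real per-frame log):

                import matplotlib.pyplot as plt

                # Invented per-frame durations; a real run would log one value per frame.
                frame_ms = [16, 17, 16, 90, 16, 17, 16, 45, 16, 17]

                # Timestamp of frame N = total time spent on frames 0..N-1.
                stamps_ms = [sum(frame_ms[:i]) for i in range(len(frame_ms))]

                fig, (ax1, ax2) = plt.subplots(1, 2, sharey=True)
                ax1.plot(range(len(frame_ms)), frame_ms)
                ax1.set(xlabel="frame number", ylabel="frame duration (ms)")
                ax2.plot(stamps_ms, frame_ms)
                ax2.set(xlabel="timestamp (ms)")
                plt.show()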

              • #37
                Originally posted by Paradox Uncreated View Post
                You are obviously nuts. I guess I am just going to have to get used to all the nutters on the internutz.
                It's obvious that deficiencies in the OS scheduler can cause just as much jitter as a hiccup in the GPU, but otherwise what gamerk2 is saying makes sense to me.

                Can you define what you mean by jitter? I would define jitter as any time-varying variation between when a frame is expected to be displayed and when it is actually displayed (so, for example, a constant 33ms delay would not be jitter, but a delay that fluctuates between 0ms and 33ms would be jitter).
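
                In code form, that definition might look like this (display times invented):

                  # Jitter under this definition: the time-varying part of the error
                  # between expected and actual display times.
                  expected = [i * 16.0 for i in range(6)]   # ideal ~60 Hz display times (ms)
                  constant = [t + 33.0 for t in expected]   # constant 33 ms delay
                  jittery = [t + d for t, d in zip(expected, [0, 33, 5, 28, 1, 30])]

                  def jitter_range(expected, actual):
                      errors = [a - e for e, a in zip(expected, actual)]
                      return max(errors) - min(errors)      # spread of the delay over time

                  print(jitter_range(expected, constant))   # 0.0  -> delayed, but no jitter
                  print(jitter_range(expected, jittery))    # 33.0 -> fluctuating delay = jitter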

                • #38
                  Originally posted by Paradox Uncreated View Post
                  I think I have sufficiently solved jitter now, though. Doom 3 jitter is even lower with renice (-20) + my listed tweaks. So I feel there is little to improve. It is gliding silky smooth now. No frame loss, and timing jitter is so low that I think it should be near impossible to see. So for my part, I don't need any numbers, and they would need to be more verbose than the option in Doom 3 anyway. But try it. You will see a big difference, and very enjoyable smooth frames. The trick with renice can also be used with a web browser, to have less jitter on YouTube videos, etc. There I also recommend Chromium, because it has the lowest jitter to begin with.

                  Case solved!

                  Peace Be With You.
                  Good, now you can stop posting nonsense.

                  • #39
                    Originally posted by unix_epoch View Post
                    It's obvious that deficiencies in the OS scheduler can cause just as much jitter as a hiccup in the GPU, but otherwise what gamerk2 is saying makes sense to me.

                    Can you define what you mean by jitter? I would define jitter as any time-varying variation between when a frame is expected to be displayed and when it is actually displayed (so, for example, a constant 33ms delay would not be jitter, but a delay that fluctuates between 0ms and 33ms would be jitter).
                    That's more or less correct. Granted, a constant 33ms latency wouldn't exactly be smooth either (a frame would be created one cycle, repeated the next as the next frame isn't ready, then the next one displayed on the third cycle), but because the rate is constant we say there's no jitter; a latency problem remains, though.

                    Basically, for a GPU:

                    Latency: The time it takes to create a frame
                    Jitter: The measure of the latency difference between two frames

                    You can have very high latency with no jitter. You can also have a lot of jitter with very little latency (more noticeable on 120Hz native displays).
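
                    As a sketch (made-up numbers), with jitter taken as the frame-to-frame change in latency:

                      # Latency = time to create each frame; jitter = latency change
                      # between consecutive frames. Both series are invented.
                      laggy_ms = [33.0, 33.0, 33.0, 33.0]  # high latency, zero jitter
                      fast_ms = [4.0, 12.0, 3.0, 14.0]     # low latency, lots of jitter

                      def jitter(latencies):
                          return [abs(b - a) for a, b in zip(latencies, latencies[1:])]

                      print(jitter(laggy_ms))  # [0.0, 0.0, 0.0] -> smooth but laggy
                      print(jitter(fast_ms))   # [8.0, 9.0, 11.0] -> fast but stuttery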

                    And again, I stress that Doom 3 really shouldn't be showing any significant latency/jitter anyway, considering you could max the thing out with a now-aged 8800 GTX...
