Frame latency analysis on Doom 3

  • #11
    Well, think about a game scene: why would it even be possible to record jitter? A timedemo just plays the same content back at the maximum possible speed; when you disable vsync in the video driver (the NVIDIA 300+ drivers have vsync active by default), you get rates above 60 fps. But when the demo is played at normal speed he says it has jitter, which makes me laugh. Even if the movement was recorded at less than 60 Hz (something like that is the case in the demo), playback can only look a bit jumpy, never jittery, because the same content cannot run at less than 60 fps if it already reached that speed when recorded. He is just weird and sees jitter everywhere, where there is none. It is definitely NOT a question of whether a frame takes too long to render.

    • #12
      If you look at the graphs on page 1, you see that there are frames which take more than 100 ms to render. This is unwanted behaviour and corresponds to a lowly 10 FPS or less (!). All standard benchmark software masks this behaviour, because the frames-per-second metric averages those long frames away. I suppose we agree that this method adds a lot of useful information about how smoothly a game runs.

      What would you call this new type of analysis, then?
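
      For illustration, here is a minimal C++ sketch (with made-up frame times, not data from any real run) of how the average-fps metric hides a single 100 ms frame that a frame-time graph would show as a 10 fps spike:

      #include <algorithm>
      #include <cstdio>
      #include <vector>

      int main() {
          // 59 smooth frames at ~16.7 ms plus one 100 ms hiccup (made-up data).
          std::vector<double> frame_ms(59, 16.7);
          frame_ms.push_back(100.0);

          double total_ms = 0.0;
          for (double ms : frame_ms) total_ms += ms;

          const double avg_fps  = 1000.0 * frame_ms.size() / total_ms;
          const double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());

          // Prints roughly: average 55.3 fps, worst frame 100.0 ms (10.0 fps).
          std::printf("average: %.1f fps, worst frame: %.1f ms (%.1f fps)\n",
                      avg_fps, worst_ms, 1000.0 / worst_ms);
          return 0;
      }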

      • #13
        Basically, it is nothing "new"; lots of benchmark tools (on Windows) plot the fps on a time graph. Also, you should not compare your system against his, because the graphics card is definitely different. When the card is fast (and only limited by the driver), you have to take other factors like asset load time into account. Don't you remember his remark that load time could be tuned by a special kernel config? All of that is more or less complete bullshit. A second run usually does not show these peaks, because everything is pre-cached by then. It's also the same person who would buy a 2.x GHz quad core for dual-socket LGA 2011 instead of a 3.5 GHz quad with turbo for gaming, just to fix jitter; do you call that normal?

        • #14
          I do not see why this discussion should be about Paradox Uncreated. This thread is about measuring hiccups/judder/jitter (or whatever you want to call it) in Doom 3. My point is that this could be useful. I posted the comparison of my system with Paradox's to demonstrate the possibilities of this method, no more, no less.

          I would rather have a system which can always churn out at least 30 fps than a system which averages 60 fps but hiccups for 500 ms or more.

          • #15
            It depends on how you measure your 60 fps. If you measure with vsync enabled in the graphics driver, it is unlikely that you will notice many drops while you play the game (except when it loads new game data; that is faster from an SSD, by the way). If you get a 60 fps average while neither the graphics driver nor the game settings restrict the rendering, it is more likely that many sections run below 60 fps (that is the case with my GT 630 at 1920x1200). You can clearly see the switches in framerate, but usually you just need a faster graphics card to get rid of them. Doom 3 is from 2004, so any current CPU should be fast enough. Since the code is open source, you can limit the tic rate to 30 or 120 Hz or whatever you like in neo/framework/UsercmdGen.h, as sketched below.
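
            As a sketch, from memory of the Doom 3 GPL source (verify the exact names and values against your own checkout before editing), the relevant constants in neo/framework/UsercmdGen.h look something like this:

            // neo/framework/UsercmdGen.h (sketch from memory of the GPL
            // release; the exact names/values may differ in your checkout)
            const int USERCMD_HZ   = 60;                // change to 30, 120, ...
            const int USERCMD_MSEC = 1000 / USERCMD_HZ; // ms per user command tic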

            • #16
              Originally posted by Paradox Uncreated:
              First of all, a kernel with "low latency desktop" enabled. You can edit Kconfig.hz and change 100 to 90. (search/replace).
              Do you have an explanation for why a lower Hz setting improves jitter? I thought the general wisdom was that a higher setting would improve timing precision.

              • #17
                Well, maybe you should reboot your system, run two or three passes of the demo, and compare. Be prepared for your thesis to be wrong.

                • #18
                  Originally posted by Paradox Uncreated:
                  Fewer interrupts triggered = fewer interruptions = less jitter. Depending on how things are done, if fewer interrupts mean larger data bursts, there may be a sweet spot. I have found 90 Hz to give the least jitter, to within 5 Hz.
                  It would be interesting to test rational multiples (e.g. 2, 3, 5, 3/2, 4/3) of the monitor refresh rate for Hz (e.g. 60, 120, 180, 300, 90, 80).

                  Also, if Phoronix Test Suite adds frame time graphs, it would be helpful to show the X axis as time for demos that run at a constant speed (i.e. not like Quake 3 timedemos). That way it's easier to tell if a long frame always happens in the same place in a demo.
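
                  A minimal sketch of the bookkeeping this would take (a hypothetical helper, not an existing Phoronix Test Suite API): log (elapsed time, frame duration) pairs so the graph's X axis can be demo time instead of frame index:

                  #include <chrono>
                  #include <vector>

                  struct FrameSample {
                      double t_sec;     // demo time when the frame finished
                      double frame_ms;  // how long the frame took to render
                  };

                  // Hypothetical per-frame hook: append one sample per rendered
                  // frame; dump to CSV later and plot frame_ms against t_sec.
                  void record_frame(std::vector<FrameSample>& samples,
                                    std::chrono::steady_clock::time_point demo_start,
                                    std::chrono::steady_clock::time_point frame_begin,
                                    std::chrono::steady_clock::time_point frame_end) {
                      const std::chrono::duration<double> t = frame_end - demo_start;
                      const std::chrono::duration<double, std::milli> d = frame_end - frame_begin;
                      samples.push_back({t.count(), d.count()});
                  }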

                  • #19
                    Please keep it on-topic. The subject is the TechReport article and how its approach could be applied to testing on Linux systems.

                    • #20
                      Originally posted by unix_epoch:
                      Also, if Phoronix Test Suite adds frame time graphs, it would be helpful to show the X axis as time for demos that run at a constant speed (i.e. not like Quake 3 timedemos). That way it's easier to tell if a long frame always happens in the same place in a demo.
                      What would then be measured on the Y axis?

                      I believe that Doom 3 timedemos are frame-for-frame identical across machines, because the delays always happen in the same frame. Moreover, timedemos always have the same length in frames.
