Frame latency analysis on Doom 3

  • thofke
    started a topic Frame latency analysis on Doom 3


    As some of you are aware, The Tech Report has been using a new type of analysis that looks at individual frame latencies rather than just average FPS (see http://techreport.com/review/21516/i...e-benchmarking).

    To do this yourself, I have devised a methodology (which could be automated) using Doom 3.

    Step 1:
    Record frame timings (I have used the command from the OpenBenchmarking Doom3 Test Profile). The important bit is the com_speeds flag:

    Code:
    ./doom3 +exec doom3-pts.cfg +set sys_VideoRam 512MB +set r_mode -1 +timedemoquit demo1 +set com_speeds 1 > doom3.log
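    (Optional) Before parsing, you can check that the com_speeds output actually made it into the log; a minimal sanity check in R, assuming the doom3.log produced above:
    Code:
    # Count the per-frame timing lines emitted by com_speeds
    loglines <- readLines("doom3.log")
    sum(grepl("^frame:", loglines))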
    Step 2:
    Parse log file using this Python script:
    Code:
    import re
    
    # com_speeds prints five integers per frame line:
    # frame number, frame time in ms, game frame, frontend and backend times
    re1 = '.*?'     # Non-greedy match on filler
    re2 = '(\\d+)'  # Integer number
    r = re.compile(re1 + re2 + re1 + re2 + re1 + re2 + re1 + re2 + re1 + re2,
                   re.IGNORECASE | re.DOTALL)
    
    # Parse the log file
    frames = []
    with open('doom3.log') as f:
        for line in f:
            if line.startswith('frame:'):
                m = r.search(line)
                if m:
                    frames.append({'com_frameNumber': int(m.group(1)),
                                   'com_frameMsec':   int(m.group(2)),
                                   'time_gameFrame':  int(m.group(3)),
                                   'time_frontend':   int(m.group(4)),
                                   'time_backend':    int(m.group(5))})
    
    # Write frame number and frame time to a CSV file
    with open('data.csv', 'w') as f:
        f.write('Frame,FrameTime\n')
        for frame in frames:
            # Skip the first frame
            if frame['com_frameNumber'] == 0:
                continue
            f.write('%i,%i\n' % (frame['com_frameNumber'], frame['com_frameMsec']))
    Step 3:
    Use this R script on the resulting CSV file:
    Code:
    dat <- read.csv("data.csv",header=T,colClasses=c("integer","numeric"))
    summary(dat)
    
    # Progression of frame times
    plot(dat, type='n', xlab="Frame number", 
        ylab="Frame time in ms (lower is better)")
    lines(dat)
    
    # Average fps
    max(dat$Frame)*1000/sum(dat$FrameTime)
    
    # Percentiles
    # It's the point below which 99% of all frames have been rendered. 
    # We're simply excluding the last 1% of frames, many of them potential outliers, 
    # to get a sense of overall smoothness. 
    quantile(dat$FrameTime, .99) 
    
    # Frame latencies by percentile
    p <- seq(0, 1, length.out=1000)^(1/3)
    quan <- data.frame(q = quantile(dat$FrameTime, probs = p), prob = p)
    plot(quan$prob, quan$q, type='n', xaxt='n', xlim=c(0.5,1),
      xlab="Proportion of frames rendered", ylab="Frame time in ms (lower is better)")
    axis(1, at=seq(0,1,by=.05), labels=paste(100*seq(0,1,by=.05), "%") )
    lines(quan$prob, quan$q)
    
    # Time spent beyond 50 ms
    sum(subset(dat,FrameTime>50)$FrameTime)
    
    # Time spent beyond 16.7 ms
    sum(subset(dat,FrameTime>16.7)$FrameTime)
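    A note on the last two numbers: they add up the full duration of every frame slower than the threshold. If you would rather count only the time spent past the threshold (which is how I read the Tech Report metric), a small variant on the same dat data frame:
    Code:
    # Time spent beyond 50 ms, counting only the excess over the threshold
    sum(pmax(dat$FrameTime - 50, 0))
    
    # Same for the 16.7 ms (60 Hz) threshold
    sum(pmax(dat$FrameTime - 16.7, 0))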

    Example of resulting images (plots not reproduced here): the frame-time progression and the frame-latencies-by-percentile plots produced by the script above.

  • gamerk2
    replied
    Originally posted by unix_epoch View Post
    It's obvious that deficiencies in the OS scheduler can cause just as much jitter as a hiccup in the GPU, but otherwise what gamerk2 is saying makes sense to me.

    Can you define what you mean by jitter? I would define jitter as any time-varying variation between when a frame is expected to be displayed and when it is actually displayed (so, for example, a constant 33ms delay would not be jitter, but a delay that fluctuates between 0ms and 33ms would be jitter).
    That's more or less correct. Granted, a constant 33ms latency wouldn't exactly be smooth either (a frame would be created one cycle, repeated the next because the following frame isn't ready, then the next one displayed on the third cycle), but because the rate is constant we say there's no jitter; there is still a latency problem.

    Basically, for a GPU:

    Latency: The time it takes to create a frame
    Jitter: The measure of the latency difference between two frames

    You can have very high latency with no jitter. You can also have a lot of jitter with very little latency (more noticeable on native 120Hz displays).

    And again, I stress that Doom 3 really shouldn't be showing any significant latency/jitter anyway, considering you could max it out with a now-aged 8800 GTX...
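    To make the distinction concrete, here is a minimal R sketch (assuming the data.csv produced by the scripts in the first post) that treats jitter as the frame-to-frame change in latency:
    Code:
    dat <- read.csv("data.csv", header=TRUE)
    
    # Latency: how long each frame takes to render
    mean(dat$FrameTime)
    
    # Jitter: how much the latency changes from one frame to the next
    frame_jitter <- diff(dat$FrameTime)
    summary(abs(frame_jitter))
    plot(abs(frame_jitter), type='l', xlab="Frame number",
         ylab="Frame-to-frame change in ms (lower is better)")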



  • gens
    replied
    Originally posted by Paradox Uncreated View Post
    I think I have sufficiently solved jitter now though. Doom 3 jitter is even lower with renice (-20) + my listed tweaks. So I feel there is little to improve. It is gliding silky smooth now. No frame loss, and timing jitter is so low that I think it should be near impossible to see. So for my part, I don't need any numbers, and they would need to be more verbose than the option in Doom 3 anyway. But try it. You will see a big difference, and very enjoyable smooth frames. The trick with renice can also be used with a web browser, to have less jitter on YouTube videos, etc. There I also recommend Chromium, because it has the lowest jitter to begin with.

    Case solved!

    Peace Be With You.
    Good, now you can stop posting nonsense.



  • unix_epoch
    replied
    Originally posted by Paradox Uncreated View Post
    You are obviously nuts. I guess I am just going to have to get used to all the nutters on the internutz.
    It's obvious that deficiencies in the OS scheduler can cause just as much jitter as a hiccup in the GPU, but otherwise what gamerk2 is saying makes sense to me.

    Can you define what you mean by jitter? I would define jitter as any time-varying variation between when a frame is expected to be displayed and when it is actually displayed (so, for example, a constant 33ms delay would not be jitter, but a delay that fluctuates between 0ms and 33ms would be jitter).
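    A rough way to visualise that definition with the data.csv from the first post: the log only gives per-frame durations, so as an approximation one can compare the running sum of frame times (when each frame actually finished) against an ideal constant-rate schedule:
    Code:
    dat <- read.csv("data.csv", header=TRUE)
    
    # When each frame actually finished (approximating display time by render time)
    actual <- cumsum(dat$FrameTime)
    
    # When each frame would have finished at a perfectly constant frame rate
    expected <- seq_along(actual) * mean(dat$FrameTime)
    
    # Deviation between expected and actual timing
    plot(actual - expected, type='l', xlab="Frame number",
         ylab="Deviation from constant-rate schedule in ms")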



  • unix_epoch
    replied
    Originally posted by thofke View Post
    What would then be measured in the y-range?
    The Y axis would still be frame duration. You can plot the same exact data by using the timestamp of each frame instead of the frame number as the X coordinate for each data point.

    I believe that Doom 3 timedemos are frame-for-frame identical across machines, because delays always happen in the same frame. Moreover, timedemos always have the same frame length.
    Timedemos wouldn't need any change, but there are graphs in the linked Tech Report article that show vastly different frame counts. Looking carefully at page 2, you can see the same pattern of spikes in different places on all four Radeon GPUs:

    [Frame-time graphs from the Tech Report article, not reproduced here]

    Using the timestamp instead of frame number for the X coordinate would make spikes caused by game content line up, while spikes caused by the process getting interrupted would not line up.
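    A minimal R sketch of that idea on the data.csv from the first post: the log only records per-frame durations, so the running sum of frame times can stand in for the timestamp on the X axis:
    Code:
    dat <- read.csv("data.csv", header=TRUE)
    
    # Approximate each frame's timestamp by the cumulative rendering time so far
    dat$Time <- cumsum(dat$FrameTime)
    
    plot(dat$Time, dat$FrameTime, type='l',
         xlab="Time in ms", ylab="Frame time in ms (lower is better)")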



  • Paradox Ethereal
    replied
    PS:

    I think I have sufficiently solved jitter now though. Doom 3 jitter is even lower with renice (-20) + my listed tweaks. So I feel there is little to improve. It is gliding silky smooth now. No frame loss, and timing jitter is so low that I think it should be near impossible to see. So for my part, I don't need any numbers, and they would need to be more verbose than the option in Doom 3 anyway. But try it. You will see a big difference, and very enjoyable smooth frames. The trick with renice can also be used with a web browser, to have less jitter on YouTube videos, etc. There I also recommend Chromium, because it has the lowest jitter to begin with.

    Case solved!

    Peace Be With You.



  • Paradox Ethereal
    replied
    Originally posted by gamerk2 View Post
    You continue to make the silly assumption that all forms of jitter are OS/kernel related. Games are more likely to suffer jitter due to H/W effects, rather than S/W.
    You are obviously nuts. I guess I am just going to have to get used to all the nutters on the internutz.



  • gamerk2
    replied
    Originally posted by Paradox Uncreated View Post
    G(l)aymer2k chimes in and shows complete lack of understanding, and the incoherence of Guano. "That assumes.." He didn't understand jitter in the other thread either. It's ridiculous, it is a joke. Wherever these people work, avoid them like the plague.

    And then to go on to "frame latency" and "slow time periods". Never use these people as translators, to put it like that, because obviously it turns to shit in there.
    You continue to make the silly assumption that all forms of jitter are OS/kernel related. Games are more likely to suffer jitter due to H/W effects, rather than S/W.



  • Paradox Ethereal
    replied
    Originally posted by gamerk2 View Post
    That assumes the CPU/GPU can keep up, which in modern games they typically can't. Doom 3 IS a decade old after all; there's really no reason frame latency should be much above 10ms or so... [would be interesting to run a Windows comparison...]

    http://techreport.com/review/21516/i...e-benchmarking

    Frame latency is a better benchmarking tool than FPS, because FPS averages out the slow time periods, and minimum FPS can catch outliers while hiding the latency problem. "Microstutter" on multi-GPU configs, for instance, is QUITE noticeable, even as FPS reaches into the hundreds.
    G(l)aymer2k chimes in and shows complete lack of understanding, and the incoherence of Guano. "That assumes.." He didn't understand jitter in the other thread either. It's ridiculous, it is a joke. Wherever these people work, avoid them like the plague.

    And then to go on to "frame latency" and "slow time periods". Never use these people as translators, to put it like that, because obviously it turns to shit in there.



  • gamerk2
    replied
    Originally posted by Paradox Uncreated View Post
    Of course it could be possible to do vsynced Hz, cleverly arranging things so that on each vsync a buffer is delivered and the next frame calculated. One for the kernel engineers. (Who have time.)
    That assumes the CPU/GPU can keep up, which in modern games they typically can't. Doom 3 IS a decade old after all; there's really no reason frame latency should be much above 10ms or so... [would be interesting to run a Windows comparison...]

    http://techreport.com/review/21516/i...e-benchmarking

    Frame latency is a better benchmarking tool than FPS, because FPS averages out the slow time periods, and minimum FPS can catch outliers while hiding the latency problem. "Microstutter" on multi-GPU configs, for instance, is QUITE noticeable, even as FPS reaches into the hundreds.
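    A made-up illustration of the "averages out" point, using hypothetical frame times: a sequence alternating between 5 ms and 28 ms frames still averages to roughly 60 FPS, even though half the frames blow well past the 16.7 ms budget:
    Code:
    # Hypothetical frame times in ms, alternating fast/slow as in multi-GPU microstutter
    ft <- rep(c(5, 28), 50)
    
    # The FPS average looks fine...
    1000 / mean(ft)
    
    # ...but the 99th-percentile frame time and the share of slow frames do not
    quantile(ft, .99)
    sum(ft > 16.7) / length(ft)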

