
OpenGL Frame Latency / Jitter Testing On Linux


  • smitty3268
    replied
    Originally posted by tomato View Post
    Why? If it just calls gettimeofday() after every draw call and presents the difference from the last frame, then it is showing frame latency. Sure, double or triple buffering will smooth things out for the user, but if you have peaks at the level of 30ms, there's no way you won't notice it while gaming. So it does show how smooth the game is, and that's most important.
    Because there are various queues, buffers, and schedulers that can distort things.

    See http://www.anandtech.com/show/6857/a...-roadmap-fraps for a good overview of some of the issues.
    Edit - here's another: http://www.anandtech.com/show/6862/f...rking-part-1/2

    However, I will note that much of the issue has to do with numbering frames precisely and detecting tearing, which is tough to do in an overlay but easy if you modify the actual game code. It seems like that may be exactly what the Doom3 code does, so maybe that's not an issue after all; it would only become one if Michael tried to make this a generic feature of PTS that applied to other applications that didn't directly support it.
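    For illustration only, here is a rough sketch of what that kind of in-engine instrumentation could look like; the function name and log format are made up, and this is not the actual Doom3 code:

    /* Hypothetical in-engine instrumentation sketch (not the actual Doom3
     * code): number each frame and timestamp it right before the swap, so a
     * log entry can be matched unambiguously to a rendered frame. */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    void log_frame_before_swap(FILE *log)
    {
        static uint64_t frame_number;
        static uint64_t prev_us;
        struct timespec ts;
        uint64_t t;

        clock_gettime(CLOCK_MONOTONIC, &ts);
        t = (uint64_t)ts.tv_sec * 1000000u + (uint64_t)ts.tv_nsec / 1000u;

        if (frame_number > 0)
            fprintf(log, "frame %llu: %llu us since previous frame\n",
                    (unsigned long long)frame_number,
                    (unsigned long long)(t - prev_us));

        prev_us = t;
        frame_number++;
    }

    Because the engine assigns the frame number itself, each log entry maps to exactly one rendered frame, which is the part an external overlay has trouble getting right.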
    Last edited by smitty3268; 18 July 2013, 07:39 PM.



  • DDF420
    replied
    Originally posted by smitty3268 View Post
    On the windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVidia recently released to some review sites. This seems to be completely in software, though.

    Is this going to run into the same issues that windows games had? Is it the same less accurate tests that came out on windows a while back, that can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different than D3D in this regard?
    Although Windows-related, this video was interesting.



  • tomato
    replied
    Originally posted by smitty3268 View Post
    Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did, and just adds a callback event before the frame is displayed.

    As such, I think we need to take these tests with a grain of salt, although they are still very useful to see.
    Why? If it just calls gettimeofday() after every draw call and presents the difference from the last frame, then it is showing frame latency. Sure, double or triple buffering will smooth things out for the user, but if you have peaks at the level of 30ms, there's no way you won't notice it while gaming. So it does show how smooth the game is, and that's most important.
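    A minimal sketch of the measurement described above, assuming the timestamp is taken once per frame right after the buffer swap; draw_frame(), dpy and win are placeholders for the real engine's loop and window:

    #include <stdio.h>
    #include <sys/time.h>

    /* Report the wall-clock time since the previous call, i.e. the
     * frame-to-frame latency when called once per frame. */
    static double frame_latency_ms(void)
    {
        static struct timeval prev;
        struct timeval now;
        double ms = 0.0;

        gettimeofday(&now, NULL);
        if (prev.tv_sec || prev.tv_usec)
            ms = (now.tv_sec - prev.tv_sec) * 1000.0 +
                 (now.tv_usec - prev.tv_usec) / 1000.0;
        prev = now;
        return ms;
    }

    /* In the game loop, roughly:
     *     draw_frame();
     *     glXSwapBuffers(dpy, win);
     *     printf("%.2f ms\n", frame_latency_ms());
     */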



  • smitty3268
    replied
    Originally posted by Michael View Post
    Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did, and just adds a callback event before the frame is displayed.

    As such, I think we need to take these tests with a grain of salt, although they are still very useful to see.
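    On Linux, the rough equivalent of that kind of pre-present callback would be interposing glXSwapBuffers(), for example via LD_PRELOAD. A hypothetical sketch of the technique (only an illustration; not how PTS or the game actually collects its numbers):

    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdio.h>
    #include <sys/time.h>
    #include <GL/glx.h>

    /* Wrap glXSwapBuffers() and log the time between successive calls. */
    void glXSwapBuffers(Display *dpy, GLXDrawable drawable)
    {
        static void (*real_swap)(Display *, GLXDrawable);
        static struct timeval prev;
        struct timeval now;

        if (!real_swap)
            real_swap = (void (*)(Display *, GLXDrawable))
                        dlsym(RTLD_NEXT, "glXSwapBuffers");

        gettimeofday(&now, NULL);
        if (prev.tv_sec || prev.tv_usec)
            fprintf(stderr, "frame-to-frame: %.2f ms\n",
                    (now.tv_sec - prev.tv_sec) * 1000.0 +
                    (now.tv_usec - prev.tv_usec) / 1000.0);
        prev = now;

        real_swap(dpy, drawable);
    }

    Built as a shared library (gcc -shared -fPIC swaphook.c -o swaphook.so -ldl) and loaded with LD_PRELOAD, it would log frame-to-frame times for any GLX application. Note that it only measures the time between swap calls, which is exactly the number the driver's queues and buffering can distort.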



  • YoungManKlaus
    replied
    Awesome, finally a really good measure of how choppy the game feels. Really cool work there, Michael.



  • Michael
    replied
    Originally posted by smitty3268 View Post
    On the windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVidia recently released to some review sites. This seems to be completely in software, though.

    Is this going to run into the same issues that windows games had? Is it the same less accurate tests that came out on windows a while back, that can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different than D3D in this regard?
    It's what's reported by the engine. Some references:





  • smitty3268
    replied
    I have questions about how this works.

    On the windows side, we've been told it's impossible to capture this information reliably without a piece of dedicated hardware that NVidia recently released to some review sites. This seems to be completely in software, though.

    Is this going to run into the same issues that windows games had? Is it the same less accurate tests that came out on windows a while back, that can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different than D3D in this regard?



  • Veerappan
    replied
    Nice to have this. This, combined with capturing an APITrace, could really make performance tuning graphics drivers easier.



  • phoronix
    started a topic OpenGL Frame Latency / Jitter Testing On Linux


    Phoronix: OpenGL Frame Latency / Jitter Testing On Linux

    Beyond there finally being Team Fortress 2 benchmarks on Linux, at Phoronix there is now also support for OpenGL frame latency benchmarks! It's another much-sought-after feature and request for graphics hardware and driver testing...
