OpenGL Frame Latency / Jitter Testing On Linux

  • OpenGL Frame Latency / Jitter Testing On Linux

    Phoronix: OpenGL Frame Latency / Jitter Testing On Linux

    Beyond there finally being Team Fortress 2 benchmarks on Linux, Phoronix now also has support for OpenGL frame latency benchmarks! It's another much sought-after feature for graphics hardware and driver testing...


  • #2
    Nice to have. Combined with capturing an APITrace, this could really make performance tuning of graphics drivers easier.



    • #3
      I have questions about how this works.

      On the Windows side, we've been told it's impossible to capture this information reliably without the dedicated capture hardware that NVIDIA recently released to some review sites. This seems to be done completely in software, though.

      Is this going to run into the same issues the Windows tests had? Is it the same kind of less accurate test that came out on Windows a while back, which can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different from D3D in this regard?



      • #4
        Originally posted by smitty3268:
        On the Windows side, we've been told it's impossible to capture this information reliably without the dedicated capture hardware that NVIDIA recently released to some review sites. This seems to be done completely in software, though.

        Is this going to run into the same issues the Windows tests had? Is it the same kind of less accurate test that came out on Windows a while back, which can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different from D3D in this regard?
        It's what's reported by the engine. Some references:



        Michael Larabel
        https://www.michaellarabel.com/



        • #5
          Awesome, finally a really good measure of how choppy the game feels. Really cool work there, Michael.



          • #6
            Originally posted by Michael:
            It's what's reported by the engine. Some references:
            Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did and just adds a callback event before the frame is displayed.

            As such, I think we need to take these tests with a grain of salt, although they are still very useful to see.



            • #7
              Originally posted by smitty3268:
              Hmm, the documentation doesn't really seem to say anything about how it calculates that number. I guess I'll need to dive into the source code if I really want to know. Most likely it does the same thing the old Windows tests did and just adds a callback event before the frame is displayed.

              As such, I think we need to take these tests with a grain of salt, although they are still very useful to see.
              Why? If it just calls gettimeofday() after every draw call and presents the difference from the last frame, then it is showing frame latency. Sure, double or triple buffering will smooth things out for the user, but if you have peaks at the level of 30 ms, there's no way you won't notice it while gaming. So it does show how smooth the game is, and that's most important.
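
              To make that concrete, here's a minimal C sketch of the measurement being described: take a gettimeofday() timestamp after every buffer swap and report the delta from the previous frame. The GLX display/window and render_frame() are placeholder assumptions, not actual engine or PTS code.

              Code:
              /* Per-frame latency from gettimeofday() deltas (sketch only). */
              #include <stdio.h>
              #include <sys/time.h>
              #include <GL/glx.h>

              extern Display *dpy;            /* assumed to be set up elsewhere */
              extern Window win;
              extern void render_frame(void); /* placeholder for the game's draw calls */

              static double now_ms(void)
              {
                  struct timeval tv;
                  gettimeofday(&tv, NULL);
                  return tv.tv_sec * 1000.0 + tv.tv_usec / 1000.0;
              }

              void render_loop(void)
              {
                  double prev = now_ms();
                  for (;;) {
                      render_frame();
                      glXSwapBuffers(dpy, win); /* hand the frame off for display */
                      double t = now_ms();
                      /* This delta is the per-frame "latency" a graph would plot. */
                      printf("frame time: %.2f ms\n", t - prev);
                      prev = t;
                  }
              }

              Note the timestamp is taken after glXSwapBuffers() returns, so with double or triple buffering it measures when the frame was handed off, not when it actually reached the screen.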



              • #8
                Originally posted by smitty3268:
                On the Windows side, we've been told it's impossible to capture this information reliably without the dedicated capture hardware that NVIDIA recently released to some review sites. This seems to be done completely in software, though.

                Is this going to run into the same issues the Windows tests had? Is it the same kind of less accurate test that came out on Windows a while back, which can show some obvious problems but not the real details of what's going on? Or is OpenGL somehow different from D3D in this regard?
                Although it's Windows-related, this video was interesting.



                • #9
                  Originally posted by tomato:
                  Why? If it just calls gettimeofday() after every draw call and presents the difference from the last frame, then it is showing frame latency. Sure, double or triple buffering will smooth things out for the user, but if you have peaks at the level of 30 ms, there's no way you won't notice it while gaming. So it does show how smooth the game is, and that's most important.
                  Because there are various queues, buffers, and schedulers that can distort things.

                  See http://www.anandtech.com/show/6857/a...-roadmap-fraps for a good overview of some of the issues.
                  Edit: here's another: http://www.anandtech.com/show/6862/f...rking-part-1/2
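
                  To make the distinction concrete, here's a hedged sketch (not what the Doom 3 engine or PTS actually does) of using an OpenGL 3.2 fence to wait until the GPU has actually finished a frame's commands, since CPU-side timestamps only show when work was submitted to those queues. It assumes a GL 3.2+ context, a loader such as libepoxy, and the now_ms() gettimeofday() helper from the sketch above.

                  Code:
                  #include <epoxy/gl.h>

                  extern double now_ms(void); /* gettimeofday() helper from above */

                  double gpu_frame_done_ms(void)
                  {
                      /* Insert a fence after the frame's commands... */
                      GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
                      /* ...then flush and block (up to 100 ms) until the GPU retires them. */
                      glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                                       (GLuint64)100 * 1000 * 1000);
                      glDeleteSync(fence);
                      return now_ms();
                  }

                  Even this only tells you when rendering finished, not when the frame hit the display, which is part of why the dedicated capture hardware exists.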

                  However, I will note that much of the issue has to do with numbering frames precisely and detecting tearing, which is tough to do in an overlay but easy if you modify the actual game code. It seems like that may be exactly what the Doom 3 code does, so maybe that's not an issue after all; it would only become one if Michael tried to make this a generic feature of PTS that applied to other applications that didn't directly support it.
                  Last edited by smitty3268; 18 July 2013, 07:39 PM.



                  • #10
                    Originally posted by smitty3268:
                    However, I will note that much of the issue has to do with numbering frames precisely and detecting tearing, which is tough to do in an overlay but easy if you modify the actual game code. It seems like that may be exactly what the Doom 3 code does, so maybe that's not an issue after all; it would only become one if Michael tried to make this a generic feature of PTS that applied to other applications that didn't directly support it.
                    PTS only supports this if the frame data can be queried from the actual application/game directly. Only once that data is exposed to PTS is it then handled in a generic way.
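
                    As a purely hypothetical illustration of that generic handling (not PTS source code), once a test profile exposes a list of per-frame times, summarizing them is straightforward:

                    Code:
                    /* Hypothetical aggregation of exposed per-frame times (in ms). */
                    #include <math.h>
                    #include <stdio.h>

                    void summarize(const double *frame_ms, int n)
                    {
                        double sum = 0.0, worst = frame_ms[0];
                        for (int i = 0; i < n; i++) {
                            sum += frame_ms[i];
                            if (frame_ms[i] > worst)
                                worst = frame_ms[i];
                        }
                        double mean = sum / n;

                        double var = 0.0;
                        for (int i = 0; i < n; i++)
                            var += (frame_ms[i] - mean) * (frame_ms[i] - mean);

                        /* Standard deviation of frame times is one simple jitter metric. */
                        printf("mean %.2f ms, worst %.2f ms, stddev %.2f ms\n",
                               mean, worst, sqrt(var / n));
                    }
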
                    Michael Larabel
                    https://www.michaellarabel.com/

