Going Beyond Just Measuring Frame Rates


  • Going Beyond Just Measuring Frame Rates

    Phoronix: Going Beyond Just Measuring Frame Rates

    Yesterday marked the release of Phoronix Test Suite 2.2, the best version yet thanks to the addition of many new and useful features. While this release was gratifying, there are much greater plans for the Phoronix Test Suite going into the next decade. It has already been shared that Windows support is coming, but other major features are due as soon as Q1'2010.

    Up to this point, most of the tests and the design of pts-core (the Phoronix Test Suite engine) have focused on quantitative benchmarks, with many of the tests reporting a frame-rate, a time, or some other measurement. The Phoronix Test Suite now also supports producing abstract results, such as screenshots used for image quality comparisons, and can track the image quality of various test profiles (such as OpenGL games) across hardware configurations, drivers, and more. All of this builds on the existing Phoronix Test Suite framework and design philosophies: image quality comparisons can be carried out autonomously, many results can be compared side-by-side, the tests can be run remotely via Phoromatic, and abstract results can be shared with others via Phoronix Global. Now you can not only confirm that you are satisfied with the quantitative frame-rate of the hardware you have -- or are about to purchase -- but also examine its qualitative performance.
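
    The kind of screenshot comparison described above boils down to a pixel-level diff between captures of the same scene. The following is only a rough sketch of that general idea, not how pts-core itself implements it; it assumes the Pillow and numpy Python packages and two same-size PNG screenshots with made-up file names.

    Code:
    import numpy as np
    from PIL import Image, ImageChops

    # Hypothetical screenshots of the same scene from two driver/hardware
    # configurations; both must have the same resolution.
    a = Image.open("driver-a-screenshot.png").convert("RGB")
    b = Image.open("driver-b-screenshot.png").convert("RGB")

    diff = ImageChops.difference(a, b)            # per-pixel absolute difference
    arr = np.asarray(diff, dtype=np.float64)

    changed = np.count_nonzero(arr.max(axis=2))   # pixels that differ at all
    rms = np.sqrt((arr ** 2).mean())              # overall RMS error on a 0-255 scale

    print(f"{changed} of {a.width * a.height} pixels differ, RMS error {rms:.2f}")
    diff.save("difference.png")                   # visual diff for manual inspection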


  • #2
    pts tests latency?

    I'd like to see some audio/video latency benchmarks. I think this is where the most room for improvement exists on the Linux desktop.

    One way to do audio latency testing is with a cable hooked from the output to the input: play a sound and measure the delay until it comes back. Of course, the settings must be chosen carefully to prevent feedback.
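
    Something like the following could serve as a starting point. It is only a rough sketch, assuming the python-sounddevice and numpy packages plus a physical loopback cable; the sample rate and the impulse-based delay detection are arbitrary choices, not anything the Phoronix Test Suite does today.

    Code:
    import numpy as np
    import sounddevice as sd

    RATE = 48000
    # One second of silence with a single click (impulse) at the 0.5 s mark;
    # typical round-trip latencies are far below the remaining 0.5 s window.
    click_at = RATE // 2
    signal = np.zeros(RATE, dtype=np.float32)
    signal[click_at] = 1.0

    # Play the click and record the loopback input at the same time.
    recording = sd.playrec(signal, samplerate=RATE, channels=1)
    sd.wait()
    recording = recording.flatten()

    # The loudest recorded sample should be the click coming back in; its
    # offset from where the click was played approximates output + input latency.
    peak = int(np.argmax(np.abs(recording)))
    latency_ms = (peak - click_at) * 1000.0 / RATE
    print(f"round-trip audio latency: about {latency_ms:.1f} ms")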

    We can also test input lag. A while back I noticed significant input lag with fglrx, which was not a fault of the USB stack but of the video driver. The Nvidia blob never had this problem.

    One approach would be a device driver that mimics a keyboard and mouse, paired with a special OpenGL app -- say, concentric rings of varying color. The device driver would move the input by a set amount per second, the output could be sampled every frame, and the cursor position checked with some image processing. Is this possible?
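
    The injection half of that is doable today through the kernel's uinput interface. Below is a rough sketch assuming the python-evdev package and root access; the screen-sampling and image-processing half is only outlined in comments, since reading frames back portably is the hard part.

    Code:
    import time
    from evdev import UInput, ecodes as e

    # A virtual relative-motion device; advertising a button as well makes
    # the X server treat it like a normal mouse.
    caps = {e.EV_REL: [e.REL_X, e.REL_Y], e.EV_KEY: [e.BTN_LEFT]}
    ui = UInput(caps, name="virtual-test-mouse")
    time.sleep(1)  # give the input stack a moment to register the device

    stamps = []
    try:
        for _ in range(100):
            ui.write(e.EV_REL, e.REL_X, 5)   # move 5 units to the right
            ui.syn()                         # flush the event
            stamps.append(time.monotonic())  # when this move was injected
            time.sleep(0.01)                 # roughly 100 moves per second
    finally:
        ui.close()

    # To turn this into a lag number you would also capture the screen (or
    # read back the front buffer) every frame, locate the cursor or test
    # pattern with simple image processing, timestamp each frame, and compare
    # those timestamps against the injection times recorded above.
    print(f"injected {len(stamps)} synthetic mouse moves")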

    Also, interactivity tests similar to what Con Kolivas created.

    Good benchmarks will move the important technologies forward.

    • #3
      The article goes on about OpenGL games, which is great, but the other obvious use case here is codec performance -- e.g., encode a video using different options on different codecs, and track encode time, CPU usage, the resulting file size, and the pixel difference from the original. (A rough sketch of that is below.)

      It could be used for audio as well.
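
      As a starting point, something like this could drive a couple of encodes and collect the numbers. It is only a sketch: it assumes ffmpeg is installed, uses a made-up lossless source file name, and leans on ffmpeg's psnr filter for the pixel-difference figure.

      Code:
      import os
      import resource
      import subprocess
      import time

      SOURCE = "source-clip.y4m"  # hypothetical lossless source clip

      ENCODERS = {
          "x264-crf23": ["-c:v", "libx264", "-crf", "23"],
          "x264-crf30": ["-c:v", "libx264", "-crf", "30"],
      }

      for name, opts in ENCODERS.items():
          out = f"{name}.mkv"
          cpu_before = resource.getrusage(resource.RUSAGE_CHILDREN).ru_utime
          start = time.perf_counter()
          subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *opts, out],
                         check=True, capture_output=True)
          wall = time.perf_counter() - start
          cpu = resource.getrusage(resource.RUSAGE_CHILDREN).ru_utime - cpu_before
          size = os.path.getsize(out)
          print(f"{name}: {wall:.1f} s wall, {cpu:.1f} s CPU, {size} bytes")

          # Pixel difference from the original: ffmpeg's psnr filter prints
          # a summary line to stderr when comparing the encode to the source.
          result = subprocess.run(["ffmpeg", "-i", out, "-i", SOURCE,
                                   "-lavfi", "psnr", "-f", "null", "-"],
                                  capture_output=True, text=True)
          for line in result.stderr.splitlines():
              if "PSNR" in line:
                  print(f"  {line.strip()}")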

      • #4
        Originally posted by garytr24:
        I'd like to see some audio/video latency benchmarks. I think this is where the most room for improvement exists on the Linux desktop.
        I completely agree with garytr. We need to get some concrete, quantifiable input lag benchmarks. I've noticed that Quake Live is slightly less responsive on Linux vs. Windows, and it would be great if there were some real data showing that difference.
