NVIDIA VDPAU Benchmarks

Written by Michael Larabel in Display Drivers on 14 November 2008 at 08:59 PM EST. Page 2 of 2. 53 Comments.

Below is the CPU usage while playing back the H.264 video twice using the GL2 video output module. The dip near the middle marks the point where the first video ended and mplayer was relaunched with the same arguments; the CPU usage towards the end is from after the second video had ended.

Next is the CPU usage while using X-Video.

Finally, we have the CPU usage when using NVIDIA's brand new Video Decode and Presentation API for Unix (VDPAU).
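For those wanting to reproduce this comparison, the three output paths map to mplayer options roughly as follows. This is a sketch: the sample file name is a placeholder, and the exact arguments used for these runs were not published.

```shell
# Software decode with OpenGL presentation (GL2 video output module):
mplayer -vo gl2 sample-h264.mkv

# Software decode with X-Video presentation:
mplayer -vo xv sample-h264.mkv

# Hardware H.264 decode and presentation via VDPAU; requires an mplayer
# build with VDPAU support and a recent NVIDIA driver exposing VDPAU:
mplayer -vo vdpau -vc ffh264vdpau sample-h264.mkv
```

With the first two commands the H.264 bitstream is decoded entirely on the CPU and only the final frames are handed to the GPU, while the VDPAU invocation offloads the decode itself, which is what drives the large CPU usage gap shown in the graphs.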

As the results show, the CPU usage of the Intel Core 2 Duo E8400 running at 1.80GHz was dramatically lower when using VDPAU as opposed to GL2 or X-Video with the GeForce 9800GTX. It is clearly a night-and-day difference. Sadly, since NVIDIA doesn't support XvMC on GPUs newer than the GeForce 7 series, we weren't able to provide any XvMC vs. VDPAU comparison benchmarks with MPEG-2 decoding. We are continuing our exploration of VDPAU and will report back with any other findings. Once AMD officially introduces X-Video Bitstream Acceleration, it should make for an interesting comparison. You can share your thoughts on this set of video APIs in the Phoronix Forums.


