NVIDIA VDPAU Performance Metrics On Ubuntu 14.04 Linux
It's been a while since we last ran any video acceleration benchmarks at Phoronix, so this week we're running a fresh set of VDPAU (Video Decode and Presentation API for Unix) benchmarks with NVIDIA's official driver, plus results for AMD Radeon hardware using the Radeon VDPAU state tracker.
For this article a lower-end Intel CPU (Core i3 4130) was installed, and a range of NVIDIA GeForce graphics cards were tested from this system running Ubuntu 14.04 64-bit with the Linux 3.13 kernel and the NVIDIA 337.25 driver, which was the latest release at the time of testing. The graphics cards tested included:
- Gigabyte NVIDIA GeForce 9500 GT 1024MB (550/400MHz)
The video-cpu-usage test profile was used, which plays back Big Buck Bunny in H.264 at 1080p. During this 1080p video playback, VDPAU was utilized from MPlayer, aside from one control run where X-Video was used instead. NVIDIA's Video Decode and Presentation API for Unix allows much of the video decoding process to be offloaded to the GPU; it generally works quite well and is widely supported among open-source multimedia applications.
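For reference, the two playback configurations can be reproduced from the command line roughly as follows. This is a sketch, not the exact invocation used by the test profile, and the clip file name is a placeholder:

```shell
# VDPAU run: hardware H.264 decode and VDPAU presentation via MPlayer.
# (file name is hypothetical; any 1080p H.264 clip works)
mplayer -vo vdpau -vc ffh264vdpau big_buck_bunny_1080p.mp4

# Control run: X-Video output with CPU-side software decoding.
mplayer -vo xv big_buck_bunny_1080p.mp4
```

The `-vc ffh264vdpau` codec selection is what actually moves H.264 decoding onto the GPU; `-vo vdpau` alone only changes the presentation path.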
During the Big Buck Bunny video playback, the CPU usage was monitored by the test profile and we also monitored each graphics card's GPU temperature, GPU usage, and the overall AC system power draw (via a WattsUp power meter). The additional sensors can be polled automatically by the Phoronix Test Suite by setting the MONITOR=gpu.usage,gpu.temp,sys.power environment variable. This testing is quite straightforward and mainly intended as a reference for those considering an NVIDIA GPU for a Linux HTPC / multimedia PC, so let's get straight to the data.
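Putting the pieces together, a run with the extra sensor polling enabled would look something like this (the `benchmark` subcommand and the `MONITOR` variable are standard Phoronix Test Suite usage; results depend on your installed driver and hardware):

```shell
# Poll GPU usage, GPU temperature, and AC system power draw
# alongside the video-cpu-usage test profile's own CPU monitoring.
MONITOR=gpu.usage,gpu.temp,sys.power \
    phoronix-test-suite benchmark video-cpu-usage
```

The sensor readings are then graphed automatically alongside the test's CPU-usage results.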