It's been a while since we last ran any video acceleration benchmarks at Phoronix, so this week we're running a fresh set of VDPAU (Video Decode and Presentation API for Unix) benchmarks with NVIDIA's official driver, plus AMD Radeon hardware using the Radeon VDPAU state tracker support.
For this article a lower-end Intel CPU (Core i3 4130) was installed and a range of NVIDIA GeForce graphics cards was tested on this system running Ubuntu 14.04 64-bit with the Linux 3.13 kernel and the NVIDIA 337.25 driver, the latest release at the time of testing. The graphics cards tested were:
- Gigabyte NVIDIA GeForce 9500 GT 1024MB (550/400MHz)
- XFX NVIDIA GeForce 9600 GSO 512MB (500/900MHz)
- MSI NVIDIA GeForce 9800 GT 512MB (660/950MHz)
- XFX NVIDIA GeForce GT 220 1024MB (625/400MHz)
- ECS NVIDIA GeForce GT 240 512MB (550/1700MHz)
- Gigabyte NVIDIA GeForce GTX 460 768MB (675/1804MHz)
- eVGA NVIDIA GeForce GT 520 1024MB (810/500MHz)
- eVGA NVIDIA GeForce GTX 550 Ti 1024MB (951/2178MHz)
- Zotac NVIDIA GeForce GT 610 1024MB (810/533MHz)
- MSI NVIDIA GeForce GTX 650 1024MB (1084/2500MHz)
- NVIDIA GeForce GTX 680 2048MB (1006/3004MHz)
- eVGA NVIDIA GeForce GT 740 1024MB (1084/2500MHz)
- eVGA NVIDIA GeForce GTX 750 1024MB (1019/2505MHz)
- NVIDIA GeForce GTX 750 Ti 2048MB (1019/2700MHz)
- NVIDIA GeForce GTX 760 2048MB (980/3004MHz)
The video-cpu-usage test profile was used, which plays back Big Buck Bunny H.264 at 1080p. During this 1080p video playback, VDPAU was used via MPlayer, aside from one control run where X-Video was used instead. NVIDIA's Video Decode and Presentation API for Unix allows much of the video decoding process to be offloaded to the GPU; it generally works quite well and is widely supported among open-source multimedia applications.
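For reference, the MPlayer invocations for the VDPAU and X-Video runs look roughly like the following sketch; the Big Buck Bunny filename here is illustrative, as the test profile supplies its own copy of the video:

```shell
# VDPAU run: hardware H.264 decode (-vc ffh264vdpau) with VDPAU video output.
# The trailing comma lets MPlayer fall back to software decoders if needed.
mplayer -vo vdpau -vc ffh264vdpau, big_buck_bunny_1080p_h264.mov

# X-Video control run: software decode with the XVideo output driver.
mplayer -vo xv big_buck_bunny_1080p_h264.mov
```

The control run isolates how much CPU time the GPU offload actually saves, since X-Video still leaves all decoding on the CPU.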
During the Big Buck Bunny video playback, the CPU usage was monitored by the test profile and we also monitored each graphics card's GPU temperature, GPU usage, and the overall AC system power draw (via a WattsUp power meter). The additional sensors can be polled automatically by the Phoronix Test Suite by setting the MONITOR=gpu.usage,gpu.temp,sys.power environment variable. This testing is quite straightforward and mainly intended for reference purposes for those thinking about an NVIDIA GPU for a Linux HTPC / multimedia PC, so let's get straight to the data.
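Putting the pieces together, a run with the extra sensors enabled can be launched roughly as follows; MONITOR is the Phoronix Test Suite environment variable mentioned above, and the test profile name matches the one used in this article:

```shell
# Poll GPU usage, GPU temperature, and AC system power draw
# alongside the CPU usage recorded by the test profile itself.
MONITOR=gpu.usage,gpu.temp,sys.power phoronix-test-suite benchmark video-cpu-usage
```

The sensor names are comma-separated, so additional sensors can be appended to the same variable without any other configuration changes.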