Adobe's Flash Video Acceleration For Linux Works Well

Written by Michael Larabel in Software on 2 December 2010 at 04:00 AM EST.

As you can see immediately, the Intel Atom CPU usage was significantly lower while playing back this 1080p H.264 video trailer on YouTube when using the Flash Player 10.2 Beta, which leverages NVIDIA's Video Decode and Presentation API for Unix (VDPAU). Under Flash 10.1, the average CPU usage was 50% across the entire recorded time slot (we let the recording run for four minutes, which left the system some idle time at the end so we could watch its recovery), or about 80% while the video was actually playing. Under the new Flash beta for Linux, the average CPU usage was just 9.5%, and it only spiked above that level four times over the course of testing. Playing back this video full-screen on the 10.2 beta was seamless and worked extremely well.
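For readers wanting to reproduce this kind of measurement themselves, below is a minimal Python sketch, not the monitoring tool used for this article, that samples overall CPU utilization from /proc/stat at one-second intervals and reports the average over a four-minute recording window like the one described above.

```python
#!/usr/bin/env python3
# Minimal sketch: sample aggregate CPU utilization from /proc/stat at a fixed
# interval and report the average over the recording window. Illustrative
# stand-in only, not the tool used for the article's numbers.
import time

def read_cpu_times():
    """Return (busy, total) jiffies from the aggregate 'cpu' line."""
    with open("/proc/stat") as f:
        fields = f.readline().split()[1:]  # drop the 'cpu' label
    values = [int(v) for v in fields]
    idle = values[3] + values[4]           # idle + iowait columns
    total = sum(values)
    return total - idle, total

def record(duration_s=240, interval_s=1.0):
    """Collect per-interval CPU usage percentages for duration_s seconds."""
    samples = []
    prev_busy, prev_total = read_cpu_times()
    for _ in range(int(duration_s / interval_s)):
        time.sleep(interval_s)
        busy, total = read_cpu_times()
        delta_total = total - prev_total
        if delta_total > 0:
            samples.append(100.0 * (busy - prev_busy) / delta_total)
        prev_busy, prev_total = busy, total
    return samples

if __name__ == "__main__":
    samples = record()  # four minutes, matching the article's time slot
    print(f"Average CPU usage: {sum(samples) / len(samples):.1f}%")
```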

With Flash 10.1, the GeForce 9400M (ION) was able to down-clock itself via its PowerMizer technology, since older Flash releases make little real use of the GPU. With VDPAU coming into play, however, the GPU was no longer able to down-clock itself during video playback.
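One way to observe this behavior for yourself is to poll the GPU's current clocks while a video plays. The sketch below shells out to nvidia-settings to read the GPUCurrentClockFreqs attribute; attribute names vary across driver generations, so treat that name as an assumption to verify against `nvidia-settings -q all` on your installed driver.

```python
#!/usr/bin/env python3
# Sketch: poll the GPU's core/memory clocks via nvidia-settings while a Flash
# video plays, to see whether PowerMizer down-clocks the GPU.
# GPUCurrentClockFreqs is assumed from drivers of this era; verify it against
# the attribute list reported by your own driver.
import subprocess
import time

def current_clocks(display=":0"):
    out = subprocess.check_output(
        ["nvidia-settings", "-c", display,
         "-q", "[gpu:0]/GPUCurrentClockFreqs", "-t"],
        text=True,
    ).strip()
    core, mem = out.split(",")  # terse output is "core,memory" in MHz
    return int(core), int(mem)

if __name__ == "__main__":
    for _ in range(30):  # sample once per second for half a minute
        core, mem = current_clocks()
        print(f"core: {core} MHz  memory: {mem} MHz")
        time.sleep(1)
```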

As the NVIDIA GPU stays clocked higher whenever there is video decoding work to handle, the GPU temperature will run a bit warmer with this new Adobe Flash release, though not dramatically so.
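To put a number on "a bit warmer", the same nvidia-settings approach can read the core temperature via the long-standing GPUCoreTemp attribute; this is a sketch under the same assumptions as above.

```python
#!/usr/bin/env python3
# Read the GPU core temperature in degrees Celsius via the GPUCoreTemp
# attribute, which nvidia-settings has exposed across many driver generations.
import subprocess

def gpu_core_temp(display=":0"):
    out = subprocess.check_output(
        ["nvidia-settings", "-c", display,
         "-q", "[gpu:0]/GPUCoreTemp", "-t"],
        text=True,
    )
    return int(out.strip())

if __name__ == "__main__":
    print(f"GPU core temperature: {gpu_core_temp()} C")
```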
