When it comes to video decoding performance, you generally want a realistic "worst case" clip. Low-bitrate clips do not stress the system enough to provide any meaningful data. When I'm shopping for a truck, for example, I don't check whether it can handle a 25-pound bag of spuds if I might be using it to haul around a camper.

And what if we create such a clip, using the lower-bitrate Big Buck Bunny H.264 as the source, or rendering multiple Big Buck Bunny screens from one of the lower-bitrate formats into a new file, specifically for the Phoronix test? It won't be a commercial encode, and the source is free as well, so there should be no licensing problem. Of course, it will need to be hosted somewhere on the internet.
What do you think?
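To make that concrete, here's a minimal sketch of what such a re-encode could look like, assuming ffmpeg with libx264 is available; the filenames and the 25 Mbit figure are just placeholders, not a tested recipe:

    import subprocess

    SRC = "big_buck_bunny_1080p.mov"   # hypothetical local copy of the free source
    OUT = "bbb_hi_bitrate_test.mp4"    # made-up name for the benchmark clip

    # Re-encode the free source at a deliberately high average bitrate with
    # two-pass x264, so the result actually stresses a hardware decoder.
    common = ["ffmpeg", "-y", "-i", SRC, "-an",
              "-c:v", "libx264", "-b:v", "25M"]
    subprocess.run(common + ["-pass", "1", "-f", "null", "/dev/null"], check=True)
    subprocess.run(common + ["-pass", "2", OUT], check=True)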
I should also note that for NVIDIA, at least, even though it officially supports only one stream, it seems to accept additional streams purely based on whether the decoder has enough resources free to accommodate the extra load, up to its maximum capabilities.
My proposal was to combine four video streams of the freely licensed movie into one stream, using demuxing, decoding, quality resizing, and picture-in-picture filters, with a final 16 Mbit two-pass compression.
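As a rough illustration of the idea (a sketch only, assuming a reasonably recent ffmpeg with the xstack filter and libx264; the filenames are placeholders):

    import subprocess

    SRC = "big_buck_bunny_1080p.mp4"  # placeholder path to the freely licensed movie

    # Scale four copies of the clip to quarter size (960x540) and tile them
    # into one 1920x1080 frame, then run a two-pass x264 encode at a
    # 16 Mbit/s average.
    graph = ("[0:v]scale=960:540[a];[1:v]scale=960:540[b];"
             "[2:v]scale=960:540[c];[3:v]scale=960:540[d];"
             "[a][b][c][d]xstack=inputs=4:layout=0_0|960_0|0_540|960_540[v]")

    common = ["ffmpeg", "-y", "-i", SRC, "-i", SRC, "-i", SRC, "-i", SRC,
              "-filter_complex", graph, "-map", "[v]", "-an",
              "-c:v", "libx264", "-b:v", "16M"]
    subprocess.run(common + ["-pass", "1", "-f", "null", "/dev/null"], check=True)
    subprocess.run(common + ["-pass", "2", "bbb_quad_16mbit.mp4"], check=True)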
The other possibility is to generate a huge amount of raw noise in a full HD frame and encode it into any preferred format, setting the encoder's two-pass average bitrate to whatever is desired.
A file with a 40 Mbit peak would be much better, with an average of maybe 30 Mbit for the video.
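Combining the noise idea above with those figures, one possible sketch (again assuming ffmpeg with libx264; the geq expression just fills the luma plane with pseudo-random values, giving near-incompressible frames):

    import subprocess

    # Synthesize noisy 1080p60 video with the geq filter, then encode it
    # two-pass at a 30 Mbit/s average capped at a 40 Mbit/s peak.
    noise = "nullsrc=size=1920x1080:rate=60,geq=random(1)*255:128:128"

    common = ["ffmpeg", "-y", "-f", "lavfi", "-i", noise, "-t", "60", "-an",
              "-c:v", "libx264", "-b:v", "30M",
              "-maxrate", "40M", "-bufsize", "40M"]
    subprocess.run(common + ["-pass", "1", "-f", "null", "/dev/null"], check=True)
    subprocess.run(common + ["-pass", "2", "noise_1080p60.mp4"], check=True)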
Of course, the 720p60 tall ships clip on the free x264 Blu-ray and AVCHD ISO is also a nice test.
Perhaps Michael should also host that in the Phoronix Test Suite tree alongside the other real-life video clips I linked to, of course.
Read the thread for more info and links/thoughts on higher bitrates, etc.
I've been an AMD/ATi fanboy for a long time. I really liked where they were heading in the open-source department too. That said ...
Wow, this does look like a really nice and open solution for a media center/HTPC. Being able to decode 1080p in hardware with open-source drivers is a big plus for me.
I do have a question though, maybe slightly off-topic. AMD doesn't open-source this bit (yet) because of the whole DRM debacle. Now AMD (and NVIDIA?) say they have dedicated ASICs on their dies for this: UVD, PureVideo. Does the Intel video chip have this as well? Or, as I always thought, are UVD and PureVideo just some extra functions, with the driver/chip still doing most of the heavy lifting via shaders?