Last time I checked, the documentation released by AMD lacked any info for the video decoder.
On the other paw, developers seem to think we don't need anything more, at least for some level of video decoding acceleration: they'd do it with shaders. Someone just has to write it.
Edit: Never mind, didn't read to the end. Apparently bridgman did say this in the other thread.
Well, one question remains: how much of the usual (H.264/VC-1) video decoding pipe can be sensibly accelerated with shaders?
I suspect it's safe to assume enough that it shows. Even MC (motion compensation, which afaik is also part of the pipeline) already drops CPU usage quite a bit; doing more of the decoding with shaders should help even more. Numbers will be available when someone writes it.
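For a feel of why MC maps so well to shaders, here's a minimal sketch of integer-pel motion compensation in C. The function name and the flat frame layout are made up for illustration (real H.264 MC also needs sub-pel interpolation and clipping at frame edges), but the key point holds: every output pixel is independent, which is exactly what a GPU likes.

#include <stdint.h>

/* Copy a 16x16 macroblock from the reference frame, displaced by the
 * motion vector (mvx, mvy), into the frame being reconstructed.
 * No edge clipping, for brevity. */
static void mc_copy_16x16(const uint8_t *ref, uint8_t *dst, int stride,
                          int mb_x, int mb_y, int mvx, int mvy)
{
    const uint8_t *src = ref + (mb_y + mvy) * stride + (mb_x + mvx);
    uint8_t *out = dst + mb_y * stride + mb_x;

    for (int y = 0; y < 16; y++) {
        for (int x = 0; x < 16; x++)
            out[x] = src[x];   /* independent per pixel: shader-friendly */
        src += stride;
        out += stride;
    }
}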
I believe that paper lumped bitstream parsing in with entropy decoding.
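For context on why that end of the pipe usually stays on the CPU: it's inherently serial. Here's a sketch of H.264's Exp-Golomb ue(v) decode, used all over the headers; the bit-reader struct and names are assumptions for illustration, and bounds checking is omitted.

#include <stddef.h>
#include <stdint.h>

/* Minimal MSB-first bit reader; layout is illustrative only. */
typedef struct {
    const uint8_t *buf;
    size_t pos;   /* bit position from the start of buf */
} bitreader_t;

static unsigned read_bit(bitreader_t *br)
{
    unsigned bit = (br->buf[br->pos >> 3] >> (7 - (br->pos & 7))) & 1;
    br->pos++;
    return bit;
}

/* Decode one unsigned Exp-Golomb code, ue(v): count leading zero bits,
 * then read that many suffix bits. Each symbol's length depends on its
 * own bits, so this can't be parallelised like per-pixel stages can. */
static unsigned read_ue(bitreader_t *br)
{
    unsigned zeros = 0;
    while (read_bit(br) == 0)
        zeros++;
    unsigned suffix = 0;
    for (unsigned i = 0; i < zeros; i++)
        suffix = (suffix << 1) | read_bit(br);
    return (1u << zeros) - 1 + suffix;
}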
You generally want to pick a point in the pipe and accelerate everything after that, in order to avoid having to push data back and forth between CPU and GPU. Since all of the subsequent steps (scaling, colour space conversion, post-filtering, de-interlacing) are usually done on GPU anyways this all works nicely.
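As a concrete example of one of those post-decode steps, here's a sketch of per-pixel BT.601 full-range YUV-to-RGB conversion in C. In a real player this runs as a fragment shader rather than on the CPU, but the arithmetic is identical, and like scaling and de-interlacing it's embarrassingly parallel.

#include <stdint.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* BT.601 full-range YCbCr -> RGB for a single pixel. Per-pixel and
 * branch-free apart from clamping, so it maps directly onto a shader. */
static void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                       uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = y, d = u - 128, e = v - 128;
    *r = clamp8(c + (int)(1.402    * e));
    *g = clamp8(c - (int)(0.344136 * d) - (int)(0.714136 * e));
    *b = clamp8(c + (int)(1.772    * d));
}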
I'm very disappointed that there is no VDPAU (or similar) support for your GPUs in e.g. XBMC. I just bought an ATI HD card, but it now turns out to have been a waste of money: I can't play H.264 files with any hardware support from the GPU. Fortunately the card was cheap. I have always gone for ATI cards, but in the future I will go for Nvidia if you don't go for open source and let developers take advantage of the potential in the GPU.
DEVREL (10/01/2009 1:16 PM)
No plans to support it now or in the foreseeable future, as there was no interest from the selected ISVs working on such projects.