AMD Open-Sources VCE Video Encode Engine Code
Originally posted by Figueiredo: Christian, could you elaborate a little about the flexibility of these blocks (UVD and VCE)? For example, is VCE only useful for H.264? Could it eventually be used for other codecs, even if only partially?
VCE is a completely separate block from UVD (only marketing sometimes sells them as one). As far as I know it only works with H.264, and this is a hardware limitation. Currently we only expose the "normal" 4:2:0 YUV to H.264 encoding process. But you can, for example, aid encoding by calculating the best motion vectors with shaders (or on the CPU, or by taking them from the source video while transcoding, etc.). In general it's quite flexible regarding which parts of the encode it performs; it could even do only the bitstream encoding and leave the rest elsewhere.
Originally posted by Figueiredo: What about TrueAudio? Is there any plan to support it on the open driver?
I don't even know if that thing is part of the GFX engine or part of the audio codec. So no idea, sorry.
Christian.
Originally posted by agd5f: We are already taking advantage of it.
Originally posted by Deathsimple: Indeed, while releasing UVD was quite painful and took us years to complete, getting VCE out the door was rather simple.
So both hardware design changes and better software processes are helping a lot in getting things open sourced.
Originally posted by Deathsimple: I don't even know if that thing is part of the GFX engine or part of the audio codec. So no idea, sorry.
BTW, now that there's finally Gallium video encoding, will the VA-API state tracker be resurrected? Much more software supports VA-API than OMX.
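As an aside on how that support surfaces to applications: whichever state tracker ends up providing it, a VA-API encode path shows up as an extra entrypoint advertised by the driver. A minimal way to check for it is the `vainfo` tool from libva-utils (the exact profile names shown are whatever the installed driver exposes, not something guaranteed by this thread):

```shell
# vainfo lists every profile/entrypoint pair the VA-API driver advertises.
# A hardware H.264 encode path appears as VAProfileH264* paired with
# VAEntrypointEncSlice; filter for it:
vainfo 2>/dev/null | grep -i 'EncSlice'
```

If nothing is printed, the driver only exposes decode (or post-processing) entrypoints and VA-API applications will fall back to software encoding.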
Originally posted by Deathsimple: VCE is a completely separate block from UVD (only marketing sometimes sells them as one). As far as I know it only works with H.264, and this is a hardware limitation.
Originally posted by agd5f: You can use VCE to encode your own H.264/MPEG videos, or in combination with UVD to transcode videos.
I just wonder if it's possible to encode to H.264 right from the framebuffer, e.g. the way Steam In-Home Streaming is implemented, or the Nvidia feature called ShadowPlay.
Last edited by _SXX_; 04 February 2014, 06:10 PM.
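For what it's worth, later ffmpeg builds grew a VA-API encode path that can do this kind of desktop capture through the hardware encoder. A minimal sketch, assuming an X11 session on display :0.0, an ffmpeg built with VA-API support, and the AMD render node at /dev/dri/renderD128 (all assumptions about a typical setup, not details from this thread):

```shell
# Sketch: grab the X11 desktop and encode to H.264 on the GPU via VA-API.
# Assumes: ffmpeg with VA-API support, display :0.0, a render node at
# /dev/dri/renderD128, and a driver exposing an H.264 encode entrypoint.
ffmpeg -vaapi_device /dev/dri/renderD128 \
       -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0 \
       -vf 'format=nv12,hwupload' \
       -c:v h264_vaapi -qp 24 \
       capture.mp4
```

Note that x11grab still copies frames through system memory before `hwupload` hands them to the encoder; a true zero-copy path from the scanout buffer into VCE, which is roughly what ShadowPlay does, needs driver-side support rather than a generic capture filter.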