AMD Open-Sources VCE Video Encode Engine Code


  • #21
    Originally posted by agd5f View Post
    We are already taking advantage of it.
    Thanks, that's great to hear.



    • #22
      Originally posted by Figueiredo View Post
      Christian, could you elaborate a little about the flexibility of these blocks (UVD and VCE)? For example, is VCE only useful for H.264? Could it eventually be used for other codecs, even if only partially?
      VCE is a completely separate block from UVD (only marketing sometimes sells them as one). As far as I know it only works with H.264, and this is a hardware limitation.

      Currently we only expose the "normal" 4:2:0 YUV to H.264 encoding process. But you can, for example, aid encoding by calculating the best motion vectors with shaders (or on the CPU, or by taking them from the source video while transcoding, etc.). In general it's quite flexible regarding which parts of the encoding it should do; it could even do only the bitstream encoding and leave the rest to be done elsewhere.

      Originally posted by Figueiredo View Post
      What about TrueAudio? Is there any plan to support it on the open driver?
      I don't even know if that thing is part of the GFX engine or part of the audio codec. So no idea, sorry.

      Christian.



      • #23
        Originally posted by oleid View Post
        What software supports encoding via vaapi? I didn't find any so far...
        GStreamer has a VA-API plugin, so it can be easily integrated into any software that uses GStreamer.
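        As a hypothetical illustration (element names are from the gstreamer-vaapi plugin; vaapih264enc was called vaapiencode_h264 in older releases, and availability depends on your VA-API driver actually exposing an encode entry point), a pipeline like this would encode a test pattern to H.264 via VA-API:

        ```shell
        # Hypothetical sketch: H.264 encoding through VA-API with gstreamer-vaapi.
        # Assumes a VA-API driver with encode support is installed.
        gst-launch-1.0 videotestsrc num-buffers=300 \
            ! video/x-raw,format=NV12,width=1280,height=720,framerate=30/1 \
            ! vaapih264enc ! h264parse ! mp4mux ! filesink location=test.mp4
        ```

        Any application that builds a GStreamer pipeline can slot the encoder element in the same way.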



        • #24
          Originally posted by agd5f View Post
          We are already taking advantage of it.
          Indeed, while releasing UVD was quite painful and took us years to complete, getting VCE out of the door was rather simple.

          So both hardware design changes and better software processes are helping a lot in getting things open sourced.



          • #25
            Originally posted by Deathsimple View Post
            Indeed, while releasing UVD was quite painful and took us years to complete, getting VCE out of the door was rather simple.

            So both hardware design changes and better software processes are helping a lot in getting things open sourced.
            So I suppose VCE1 is more likely than UVD2.0 (RS880) and way more likely than UVD1?



            • #26
              Sounds great. Does the OpenMAX state tracker support the H.264 10-bit profiles for decoding? Has the VDPAU API now been patched for the 10-bit profiles?



              • #27
                Originally posted by Deathsimple View Post
                I don't even know if that thing is part of the GFX engine or part of the audio codec. So no idea, sorry.
                SemiAccurate claims it's a separate DSP. So kind of like the programmable Sound Blaster cards.

                BTW, now that there's finally Gallium video encoding, will the VA-API state tracker be resurrected? Much more software supports VA-API than OMX.



                • #28
                  Originally posted by curaga View Post
                  BTW, now that there's finally Gallium video encoding, will the VA-API state tracker be resurrected? Much more software supports VA-API than OMX.
                  Is there some specific app you have in mind? There doesn't seem to be much that supports hardware encoding in general on Linux (with either API). Most people use GStreamer for encoding, which supports both OMX and VA-API. Additionally, OMX has some nice features for implementing more efficient transcoding.



                  • #29
                    Originally posted by Deathsimple View Post
                    VCE is a completely separate block from UVD (only marketing sometimes sells them as one). As far as I know it only works with H.264, and this is a hardware limitation.

                    Currently we only expose the "normal" 4:2:0 YUV to H.264 encoding process. But you can, for example, aid encoding by calculating the best motion vectors with shaders (or on the CPU, or by taking them from the source video while transcoding, etc.). In general it's quite flexible regarding which parts of the encoding it should do; it could even do only the bitstream encoding and leave the rest to be done elsewhere.
                    But if it's flexible enough to let you choose which parts of the encoding to do with VCE and which to do by other means, you should be able to use the chip to help encode other formats. Some stages are common to virtually all formats, IIRC.



                    • #30
                      Originally posted by agd5f View Post
                      You can use VCE to encode to your own h.264/mpeg videos or in combination with UVD to transcode videos.
                      But can this technically be used for game/screen streaming?
                      I just wonder if it's possible to encode to H.264 straight from the framebuffer?

                      E.g. like Steam In-Home Streaming or the Nvidia feature called ShadowPlay.
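
                      For what it's worth, something along these lines seems plausible with GStreamer (a rough sketch, assuming gstreamer-vaapi is installed and the driver exposes encoding; ximagesrc grabs the X11 screen, so this captures the desktop rather than reading a game's framebuffer directly):

                      ```shell
                      # Hypothetical sketch: capture the X11 screen and encode it
                      # to H.264 via VA-API, writing the result to a file.
                      gst-launch-1.0 ximagesrc use-damage=false \
                          ! videoconvert ! video/x-raw,format=NV12 \
                          ! vaapih264enc ! h264parse \
                          ! matroskamux ! filesink location=screen.mkv
                      ```

                      A real streaming setup would swap the filesink for a network sink, but the encoding stage would look the same.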
                      Last edited by _SXX_; 02-04-2014, 05:10 PM.

