VA-API Library 2.14 Released With AV1 Encode Interface


  • #11
    Originally posted by cl333r View Post
    So then I wonder how much slower it is to encode AV1 on the CPU than HEVC?
    My 6-core Coffee Lake encodes 1080p 10-bit HEVC at 5-20 FPS.
    Well, the quick answer is... it's not, for some people. SVT-AV1 encodes on the CPU faster than HEVC, so it's worth trying. AOMENC is a much more efficient encoder, but it's not as zoomy. The issue right now is that you need to do some work yourself, since ffmpeg's presets aren't exactly great.

    So while IMO the speed is there, ffmpeg isn't yet -_-
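    If you want a rough feel for the gap on your own hardware, one quick sketch (file names, presets and CRF values here are placeholders; adjust for whatever your ffmpeg build supports) is to run both encoders on the same clip with the output discarded, so only encode speed gets measured:

    ffmpeg -i clip.mp4 -c:v libx265 -preset medium -crf 23 -f null -
    ffmpeg -i clip.mp4 -c:v libsvtav1 -preset 6 -crf 30 -f null -

    Watch the speed/fps figures ffmpeg prints at the end of each run. On older builds the libsvtav1 wrapper may not accept -crf, in which case its -qp option is the usual fallback.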



    • #12
      Originally posted by Quackdoc View Post
      So while IMO the speed is there, ffmpeg isn't yet -_-
      ffmpeg -i myvid.mp4 -c:v libsvtav1 -b:v 0 -preset 4 myvid.webm
      gives you a pretty good encode-speed/quality tradeoff. I encode 1080p @ 30 fps at roughly realtime on my Zen 2 based 3700X with -preset 4.
      Lower presets give you better quality; higher presets give you more speed / less quality.
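      For what it's worth, a CRF-style run is another option; a minimal sketch, assuming a reasonably recent ffmpeg whose libsvtav1 wrapper accepts -crf (file names and values are placeholders):

      ffmpeg -i myvid.mp4 -c:v libsvtav1 -crf 32 -preset 6 -g 240 myvid.webm

      Lower -crf values mean higher quality and larger files; -g just caps the keyframe interval.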

      SVT-AV1 is a pretty amazing AV1 encoder with a wide range of encoding profiles / quality-vs-performance knobs.

      I think the Intel AV1 hardware encoding is mainly targeted at VOD portals encoding videos "in the cloud".

      For video conferencing we probably need AV1 encoding in hardware on the iGPUs, as most notebooks, where video conferencing is mainly used, don't ship with a dedicated GPU.



      • #13
        All the next gen cards will have AV1 encode with the exception of a few low end AMD cards. What kind of AV1 encode is the question now. It looks like the first round of hardware encoders will be mostly 8 bit but we need to wait for the official announcement to know for sure.



        • #14
          Originally posted by Spacefish View Post

          ffmpeg -i myvid.mp4 -c:v libsvtav1 -b:v 0 -preset 4 myvid.webm
          gives you a pretty good encode-speed/quality tradeoff. I encode 1080p @ 30 fps at roughly realtime on my Zen 2 based 3700X with -preset 4.
          Lower presets give you better quality; higher presets give you more speed / less quality.

          SVT-AV1 is a pretty amazing AV1 encoder with a wide range of encoding profiles / quality-vs-performance knobs.

          I think the Intel AV1 hardware encoding is mainly targeted at VOD portals encoding videos "in the cloud".

          For video conferencing we probably need AV1 encoding in hardware on the iGPUs, as most notebooks, where video conferencing is mainly used, don't ship with a dedicated GPU.
          I find ffmpeg's encoder path gives mediocre performance versus piping into svtav1enc and doing some manual tweaks. But yeah, SVT-AV1 is getting to be a really good encoder despite that; even with ffmpeg it's acceptable (though not as good as HEVC for me personally), and AOMENC has been for a long time, especially when paired with rav1e, which makes up for a lot of aomenc's downsides.
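          As a rough sketch of the piping route (purely illustrative: file names and values are placeholders, and SvtAv1EncApp's option names can differ between versions):

          ffmpeg -i clip.mp4 -pix_fmt yuv420p10le -strict -1 -f yuv4mpegpipe - | SvtAv1EncApp -i stdin --preset 6 --crf 30 -b clip.ivf

          The resulting .ivf can be muxed into a container afterwards, and going direct gives you access to all of SVT-AV1's own knobs instead of only whatever the ffmpeg wrapper exposes.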



          • #15
            Originally posted by MadeUpName View Post
            All the next gen cards will have AV1 encode with the exception of a few low end AMD cards. What kind of AV1 encode is the question now. It looks like the first round of hardware encoders will be mostly 8 bit but we need to wait for the official announcement to know for sure.
            Not even 10-bit encode T.T



            • #16
              Originally posted by MadeUpName View Post
              All the next gen cards will have AV1 encode with the exception of a few low end AMD cards. What kind of AV1 encode is the question now. It looks like the first round of hardware encoders will be mostly 8 bit but we need to wait for the official announcement to know for sure.
              ... Where are you getting that from?


              There's basically no reason to encode 8-bit AV1. Devs thought about removing that option entirely, and I can't even remember why it ended up being left in. Maybe it's too late to remove it?
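              On the software side at least, 10-bit is only a pixel format away; a minimal sketch assuming a libsvtav1-enabled ffmpeg (file names and values are placeholders):

              ffmpeg -i clip.mp4 -c:v libsvtav1 -pix_fmt yuv420p10le -crf 30 -preset 6 clip-10bit.webm

              One of the usual arguments for skipping 8-bit entirely is that 10-bit output tends to show less banding even from 8-bit sources, for very little extra cost.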



              • #17
                Originally posted by brucethemoose View Post

                ... Where are you getting that from?


                There's basically no reason to encode 8-bit AV1. Devs thought about removing that option entirely, and I can't even remember why it ended up being left in. Maybe it's too late to remove it?
                Various leak sites and some things said by different company spokespeople. The main goal of putting AV1 encode in at this point isn't to provide great transcoding of movies. It is to make almost every device able to do low-bitrate video encoding for things like video conferencing.



                • #18
                  Originally posted by MadeUpName View Post

                  Various leak sites and some things said by different company spokespeople. The main goal of putting AV1 encode in at this point isn't to provide great transcoding of movies. It is to make almost every device able to do low-bitrate video encoding for things like video conferencing.
                  Yeah, but is 8-bit really that much easier to implement in an ASIC? And wouldn't it be easier to *just* implement 10-bit encoding, and not 8?


                  It's certainly not advantageous on CPUs, as you're always better off dropping to a lower preset than dropping to 8-bit afaik. I don't think memory usage is even that different.


                  This is not HEVC, where some devices are only capable of decoding 8-bit streams, hence you really need the 8-bit capability there as a baseline.
                  Last edited by brucethemoose; 26 February 2022, 02:17 PM.
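                  Incidentally, once a driver actually implements the new VA-API AV1 encode interface, you can check what your own hardware exposes with vainfo; decode support shows up as a VLD entrypoint next to the AV1 profiles, and encode would appear as a separate encode entrypoint (exact profile and entrypoint names depend on the driver):

                  vainfo | grep -i av1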



                  • #19
                    Will I be able to decode AV1 on YouTube using this library?



                    • #20
                      Originally posted by Quackdoc View Post
                      Highly doubt we will see AV1 encode in consumer GPUs for a while; decode, however, I can see new Intel GPUs having it.
                      What's your source on that? Isn't the current rumor that NVIDIA will launch an AV1 encoder in their next consumer GPU line this year?

