
Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands



    Phoronix: Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands

    On top of Intel Gen12/Xe Graphics bringing other media engine improvements and much better 3D graphics support, another exciting element of the next-generation Intel graphics is now confirmed: GPU-accelerated AV1 video decoding!..

    http://www.phoronix.com/scan.php?pag...1-Decode-Media

  • #2
    AMD! Hurry up!
    Look! Intel is beating you! What if they eventually end up bringing AV1 encode too?! (on top of 4:4:4 and their high-quality encoder)
    ...yeah, while you are stuck with:

    - No 4:4:4
    - H.264 encode slower than HEVC
    - Lowest quality encoder in the market
    - No VP9 encode, and no decode either (except on the latest chips)
    - Not even AV1 decode

    In the end I will be buying Xe if it turns out to fare well against my current AMD card...

    Comment


    • #3
      Traditional graphics companies have used dedicated hardware blocks for video codec encode/decode. I'd wager Intel is more likely to write a decoder that uses the EUs. Nothing stops AMD from implementing a similar encode/decode path for their Compute cores.
      AMD has always had problems with software quality and features. Though, to be fair, it's a hard target for them when they are constantly launching new products (or rebadged older ones). Look at how many GCN versions are out there. At least Nvidia and Intel cleanly demarcate their product lines.

      Comment


      • #4
        So... No H.266 for at least a couple more years. Not surprised at all.

        Comment


        • #5
          AV1 decoding on the AMD GPU in APUs would definitely be a good thing for laptops / low-power PCs... With Firefox slowly enabling VA-API on Linux, and hopefully Chromium following sometime later, this could mean longer battery life by default when watching videos in the browser on mobile devices.

          I don't really see the benefit of having a dedicated encoder block in your GPU. It eats away valuable chip area and is used really seldom. In reality, probably only game streamers use it. Video conferencing software does CPU encoding 99% of the time. Creators typically use CPU-based encoders, as they offer better compression/quality.
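          (For anyone who wants to check what their own hardware actually exposes: a minimal sketch, assuming the `vainfo` tool from libva-utils is installed; exact profile names vary by driver and GPU.)

          ```shell
          # Ask the VA-API driver which decode/encode profiles it advertises.
          # A line pairing VAProfileAV1Profile0 with VAEntrypointVLD means
          # hardware AV1 decode is available to VA-API clients like Firefox.
          if command -v vainfo >/dev/null 2>&1; then
              vainfo 2>/dev/null | grep -i av1 || echo "no AV1 entrypoints advertised"
          else
              echo "vainfo not installed"
          fi
          ```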

          Comment


          • #6
            Originally posted by tildearrow View Post
            AMD! Hurry up! (...)
            Isn't this part of AMD's eternal problem of having worse software (but good or better hw) than its competition?

            Comment


            • #7
              Originally posted by cl333r View Post
              Isn't this part of AMD's eternal problem of having worse software (but good or better hw) than its competition?
              Their video capabilities unfortunately have been rather bad vs. competitors for a decade or so now. When Navi was released, there was some legitimate fuss about degraded h.264 encoding quality vs. even Polaris...

              Comment


              • #8
                Originally posted by aufkrawall View Post
                Their video capabilities unfortunately have been rather bad vs. competitors for a decade or so now. When Navi was released, there was some legitimate fuss about degraded h.264 encoding quality vs. even Polaris...
                Ughhhhh are you kidding me...........

                ...so they nerfed the H.264 encoder even further, instead of improving it. Ugh.
                This only makes me want to move to Intel or even half-baked NVIDIA *sighs*
                Is it THAT hard to improve SOMETHING?!

                Comment


                • #9
                  Originally posted by Spacefish View Post
                  In reality, only Game Streamers probably use it.
                  A notable userbase.

                  Comment


                  • #10
                    Originally posted by Ipkh View Post
                    Traditional graphics companies have used dedicated hardware blocks for video codec encode/decode.
                    Due to the complexity of modern encoders, hardware-based solutions are less and less useful for encoding. Encoding algorithms are gaining flexibility by the day, so a general-purpose processor is actually the best fit for encoding. The only reason you would need a hardware encoder is low latency with a low-complexity (low-quality) encoding configuration.
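                    For illustration, that trade-off shows up directly on ffmpeg's command line (a sketch with hypothetical file names; the VA-API path assumes a working driver and the usual render node):

                    ```shell
                    # Software encode: slow, but full access to x264's rate control
                    # and psy tuning for best quality per bit.
                    ffmpeg -i in.mkv -c:v libx264 -preset slow -crf 20 out_sw.mp4

                    # Hardware encode via VA-API: fast and light on the CPU, but a
                    # fixed-function pipeline with far fewer quality knobs.
                    ffmpeg -vaapi_device /dev/dri/renderD128 -i in.mkv \
                           -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 22 out_hw.mp4
                    ```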

                    Comment
