Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands

  • #11
    Originally posted by tildearrow View Post
    AMD! Hurry up!

    [...]

    - Lowest quality encoder in the market
    I just want to mention that while AMD's H.264 encoder is terrible, AMD's HEVC encoder is fantastic.


    • #12
      Originally posted by bug77 View Post
      So... No H266 for at least a couple more years. Not surprised at all.
      The lead time between a finalized standard and the first VPU (dedicated Video Processing Unit) silicon is typically around 18 months (sometimes a bit less, sometimes a bit more, depending on the details of the standard), although FPGA implementations usually appear much sooner. And it's more like three to four-plus years before such decode units become ubiquitous in shipping SoCs.


      • #13
        Originally posted by lyamc View Post

        I just want to mention that while AMD's H.264 encoder is terrible, AMD's HEVC encoder is fantastic.
        What people seem to miss is that AMD likely crippled the H.264 silicon to make room for HEVC. The fixed-function hardware doesn't always overlap, and just making the chips bigger and bigger to fit more and more codecs over time isn't practical. It's where part of that Intel / Nvidia premium goes.

        So in the same vein, in ~5 years when new hardware has AV1 encode and decode, expect VP9 and HEVC support to diminish or be dropped.


        • #14
          Originally posted by bug77 View Post
          So... No H266 for at least a couple more years. Not surprised at all.
          They can't add support for something that did not exist when the hardware was designed.


          • #15
            Originally posted by Spacefish View Post
            AV1 decoding on the AMD GPU in APUs would definitely be a good thing for laptops / low-power PCs... With Firefox slowly enabling VA-API on Linux, and hopefully Chromium following sometime later, this could mean longer battery life by default when watching videos in the browser on mobile devices.

            I don't really see the benefit of having a dedicated encoder block in your GPU. It eats up valuable chip area and is rarely used. In reality, probably only game streamers use it. Video conferencing software does CPU encoding 99% of the time, and creators typically use CPU-based encoders as they offer better compression/quality.
            I'm not a game streamer or creator, but I do Skype/Zoom/conference calls, screen sharing and recording, sending video over Telegram and other apps that resample/compress it before sending, chopping up recorded videos, etc.
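
            For what it's worth, that kind of workload is exactly where the fixed-function encoder helps: chopping and re-compressing a recording without pegging the CPU. A rough sketch of driving it through FFmpeg's VAAPI path from Python (the render node path and hevc_vaapi availability are assumptions that depend on your GPU and driver, and this isn't a claim about what Skype/Zoom do internally):

            import subprocess

            # Rough sketch: trim and re-encode a screen recording on the GPU's
            # HEVC encoder via FFmpeg's VAAPI path. The render node path and the
            # hevc_vaapi encoder are assumptions; both depend on your driver.
            def trim_and_compress(src, dst, start="00:00:10", duration="00:01:00"):
                subprocess.run([
                    "ffmpeg",
                    "-vaapi_device", "/dev/dri/renderD128",  # typical render node, may differ
                    "-ss", start, "-t", duration,
                    "-i", src,
                    "-vf", "format=nv12,hwupload",  # upload frames to a GPU surface
                    "-c:v", "hevc_vaapi",           # hardware HEVC encode
                    "-c:a", "copy",
                    dst,
                ], check=True)

            trim_and_compress("screen-recording.mkv", "clip.mkv")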


            • #16
              Cool! Will we see AV1 acceleration in RDNA 2 cards?


              • #17
                Originally posted by clementhk View Post

                I'm not a game streamer or creator, but I do Skype/Zoom/conference calls, screen sharing and recording, sending video over Telegram and other apps that resample/compress it before sending, chopping up recorded videos, etc.
                None of those applications even taps into the GPU encoder/decoder. They use software-based encoding and decoding. Not many applications even take advantage of VAAPI.


                • #18
                  Originally posted by LoveRPi View Post

                  None of those applications even taps into the GPU encoder/decoder. They use software-based encoding and decoding. Not many applications even take advantage of VAAPI.
                  Quite wrong; most of them use HW codecs when possible. There are options for that in the settings, on by default, if you bothered to check.
                  And if you use a browser-based solution, those sometimes use HW codecs too (though not necessarily for WebRTC; it's a tricky situation).
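
                  If you want to check what your own machine actually exposes, here's a quick sketch that parses vainfo output (assumes libva-utils is installed; the exact profile and entrypoint lines vary by driver):

                  import subprocess

                  # Minimal check of what the VA-API driver advertises (needs vainfo
                  # from libva-utils). VAEntrypointVLD entries are hardware decode
                  # profiles, VAEntrypointEncSlice entries are hardware encode profiles.
                  out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
                  lines = out.splitlines()
                  decode = [l for l in lines if "VAEntrypointVLD" in l]
                  encode = [l for l in lines if "VAEntrypointEncSlice" in l]
                  print("hardware decode profiles:", len(decode))
                  print("hardware encode profiles:", len(encode))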


                  • #19
                    Originally posted by LoveRPi View Post

                    None of those applications even taps into the GPU encoder/decoder. They use software-based encoding and decoding. Not many applications even take advantage of VAAPI.
                    Almost no software or services use AV1 right now either; are you saying we don't need AV1 encoders/decoders then? If we expect royalty-free AV1 to be used everywhere, we need it not to compromise the user experience, and for now AV1 software encoders are not up to the task. I'm just replying to the comment that only game streamers need an efficient encoder.
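
                    To illustrate where software AV1 stands today: an encode through FFmpeg's libaom-av1 wrapper works, but even at the faster presets it is far slower than a fixed-function block would be. A rough sketch (the flags shown are the commonly used constant-quality ones; tune to taste):

                    import subprocess

                    # Software AV1 encode via FFmpeg's libaom-av1 wrapper.
                    # -cpu-used trades speed for compression efficiency
                    # (0 = slowest/best, 8 = fastest); -crf with -b:v 0 selects
                    # constant-quality mode.
                    subprocess.run([
                        "ffmpeg", "-i", "input.mkv",
                        "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0", "-cpu-used", "4",
                        "-c:a", "copy",
                        "output.mkv",
                    ], check=True)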


                    • #20
                      Originally posted by Orphis View Post

                      Quite wrong; most of them use HW codecs when possible. There are options for that in the settings, on by default, if you bothered to check.
                      And if you use a browser-based solution, those sometimes use HW codecs too (though not necessarily for WebRTC; it's a tricky situation).
                      Nope, you're definitely wrong. Name 5 applications.
