AMD Lands AV1 Decode For Radeon RX 6000 Series In Mesa


  • #41
    Originally posted by agd5f

    It was added to VA-API eventually, just not in time for when we needed it. VA-API is an Intel-managed API.
    Thanks, that makes sense. It's a pity that Intel and Nvidia keep pulling the rug toward their own corners; only AMD seems to care about standards.
    Can you disclose whether Cezanne is going to support AV1 decode? I have committed to a Renoir desktop and I am hoping I can get AV1 with only a CPU upgrade. Thanks.
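    For anyone who wants to verify what their Mesa/VA-API stack actually advertises, here's a minimal sketch against the real libva entry points (vaGetDisplayDRM, vaQueryConfigProfiles). The /dev/dri/renderD128 path is an assumption for a typical single-GPU box, and VAProfileAV1Profile0 needs a recent libva:

    /* Minimal sketch: check whether the VA-API driver advertises AV1 decode.
       Assumed render node: /dev/dri/renderD128 (adjust for your system).
       Build: gcc av1check.c -o av1check -lva -lva-drm */
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <va/va.h>
    #include <va/va_drm.h>

    int main(void) {
        int fd = open("/dev/dri/renderD128", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        VADisplay dpy = vaGetDisplayDRM(fd);
        int major, minor;
        if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "vaInitialize failed\n");
            return 1;
        }

        /* Query every profile the driver exposes and look for AV1. */
        int num = vaMaxNumProfiles(dpy);
        VAProfile *profiles = malloc(num * sizeof(*profiles));
        vaQueryConfigProfiles(dpy, profiles, &num);

        int found = 0;
        for (int i = 0; i < num; i++)
            if (profiles[i] == VAProfileAV1Profile0)
                found = 1;

        printf("AV1 decode profile %s\n", found ? "advertised" : "not advertised");

        free(profiles);
        vaTerminate(dpy);
        close(fd);
        return 0;
    }

    (vainfo from libva-utils prints the same profile list if you'd rather not compile anything.)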



    • #42
      Originally posted by microcode

      Well, I don't know that it's ironic exactly, but decoding AV1 is mostly doable on common CPUs, particularly the ones likely to be paired with a brand new discrete GPU, whereas encoding in realtime is essentially off the table right now for most hardware.
      For 1080p@30 fps, sure. For 1080p@60 fps, mostly (as long as you have AVX2 you'll have no problems with it). But something like 4K@60 fps can be a real challenge even for current top CPUs. Still not THAT big of a problem for a powerful desktop, but it's crucial for laptops.

      Also, HW decoding means possible support for DRM, which is crucial for streaming in HD.
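      On the AVX2 point: dav1d's fastest x86 paths are hand-written AVX2 assembly, so it's worth checking the CPU before assuming smooth software playback. A minimal sketch using the GCC/Clang x86 builtins (the builtins are real; whether AVX2 alone is enough for your content is workload-dependent):

      #include <stdio.h>

      int main(void) {
          /* Initialise GCC/Clang's CPU feature detection, then test AVX2. */
          __builtin_cpu_init();
          if (__builtin_cpu_supports("avx2"))
              puts("AVX2 available: dav1d can use its widest x86 SIMD paths");
          else
              puts("No AVX2: software AV1 decode falls back to slower paths");
          return 0;
      }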



      • #43
        So 4K@60 is supported by RDNA 2, but what about HDR? Is 10-bit per channel also supported, or only 8-bit? Does anybody know? I asked Google but Google couldn't help me.



        • #44
          Originally posted by dc_coder_84
          So 4K@60 is supported by RDNA 2, but what about HDR? Is 10-bit per channel also supported, or only 8-bit? Does anybody know? I asked Google but Google couldn't help me.
          Well, the AV1 Main profile dictates that you need to support both 8-bit and 10-bit operation to be classified as a conformant AV1 HW decoder, so no worries.
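          To confirm what the driver reports on a given box, here's a minimal follow-up to the profile sketch above, using the real vaGetConfigAttributes call (same assumed render node; error handling mostly elided):

          #include <fcntl.h>
          #include <stdio.h>
          #include <unistd.h>
          #include <va/va.h>
          #include <va/va_drm.h>

          int main(void) {
              int fd = open("/dev/dri/renderD128", O_RDWR);
              VADisplay dpy = vaGetDisplayDRM(fd);
              int major, minor;
              if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS)
                  return 1;

              /* Ask which render-target formats the AV1 decode config supports. */
              VAConfigAttrib attrib = { .type = VAConfigAttribRTFormat };
              vaGetConfigAttributes(dpy, VAProfileAV1Profile0, VAEntrypointVLD,
                                    &attrib, 1);

              printf("8-bit 4:2:0:  %s\n",
                     (attrib.value & VA_RT_FORMAT_YUV420) ? "yes" : "no");
              printf("10-bit 4:2:0: %s\n",
                     (attrib.value & VA_RT_FORMAT_YUV420_10) ? "yes" : "no");

              vaTerminate(dpy);
              close(fd);
              return 0;
          }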
