Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands


  • #41
    Originally posted by tildearrow View Post
    AMD! Hurry up!
    Look! Intel is beating you! What if they eventually end up bringing AV1 encode too?! (on top of 4:4:4 and their high-quality encoder)
    ...yeah, while you are stuck with:

    - No 4:4:4
    - H.264 encode slower than HEVC
    - Lowest quality encoder in the market
    - No VP9 encode and not even decode (except latest chips)
    - Not even AV1 decode

    At the end I will be buying Xe if it turns out to fare well against my current AMD card...
    Shhh! Don't distract them. They (AMD) need to focus on profits, and that means server CPUs. Once the profits arrive, they can write as much video processing software as anyone wants.

    Comment


    • #42
      Originally posted by starshipeleven View Post
      Does it compress 1080p significantly better? Because if it compresses better only if you have a large resolution then it's situational.
      I don't know what it does to FHD (and neither do you, so you can't proclaim it DoA). Still, by the time the codec becomes more mainstream, so will 4k (even if it still won't be the majority of streams).
      Last edited by bug77; 10 July 2020, 10:24 AM.

      Comment


      • #43
        Originally posted by bug77 View Post
        I don't know what it does to FHD (and neither do you, so you can't proclaim it DoA).
        Let's call it a "wild stab in the dark", given how it went with H.265.

        Still, by the time the codec becomes more mainstream, so will 4k (even if it still won't be the majority of streams).
        4k streams will become mainstream only with much better internet access, even with h266, as the starting media is larger.

        Comment


        • #44
          Originally posted by starshipeleven View Post
          4k streams will become mainstream only with much better internet access, even with h266, as the starting media is larger.
          The way it's done today, a 4k stream only needs double the bit rate of FHD (because if the pixels are smaller, why not crush more details?). If you can make that 1.5x the bit rate while also upping the quality a bit, that will make 4k more accessible.
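The bitrate-versus-pixels arithmetic in that post can be made concrete. A quick sketch (the 8 Mbps FHD baseline and 30 fps are illustrative assumptions, not figures from any streaming service): doubling the bit rate for 4x the pixels already halves the bits spent per pixel, and the proposed 1.5x bit rate cuts it further.

```python
# Illustration of the bits-per-pixel trade-off described above.
# The 8 Mbps FHD baseline and 30 fps are assumed numbers for the example.
def bits_per_pixel(bitrate_mbps, width, height, fps=30):
    """Average bits spent per displayed pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

fhd     = bits_per_pixel(8,  1920, 1080)   # assumed FHD stream
uhd_2x  = bits_per_pixel(16, 3840, 2160)   # 4k at double the bit rate
uhd_15x = bits_per_pixel(12, 3840, 2160)   # 4k at 1.5x the bit rate

# 4k has 4x the pixels, so 2x the bit rate means half the bits per pixel,
# and 1.5x the bit rate means 0.375x the bits per pixel.
print(uhd_2x / fhd)    # 0.5
print(uhd_15x / fhd)   # 0.375
```

This is exactly why the "crush more details" framing matters: the codec has to recover that per-pixel deficit through better compression.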

          Comment


          • #45
            Anyone mentioning bit depth? Even the lowest AV1 profile (Main) mandates 10-bit support (12-bit needs the Professional profile), so this should finally mean decent support for that!

            For those who may not know, dav1d still has no high-bit-depth SIMD optimizations (except on ARM), which is probably why 10-bit isn't being used yet. At best, any software implementation will have half the SIMD throughput, whereas widening a datapath is what hardware does best.
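The "halved SIMD throughput" point comes down to lane counts: 8-bit samples pack twice as many lanes into a fixed-width vector register as the 16-bit containers that 10- and 12-bit samples need in practice. A rough sketch (the 128-bit register width, as in SSE/NEON, is an assumption for the example):

```python
# Sketch of why 10/12-bit video halves SIMD throughput in software:
# samples wider than 8 bits are stored in 16-bit lanes, so a fixed-width
# vector register processes half as many of them per instruction.
REGISTER_BITS = 128  # assumed vector width (e.g. SSE or NEON)

def lanes_per_op(sample_bits):
    # 10- and 12-bit samples round up to 16-bit storage in practice
    storage_bits = 8 if sample_bits <= 8 else 16
    return REGISTER_BITS // storage_bits

print(lanes_per_op(8))   # 16 lanes per instruction for 8-bit video
print(lanes_per_op(10))  # 8 lanes per instruction for 10-bit video
```

Hardware decoders sidestep this by simply building a wider datapath, which is why the per-sample cost barely changes there.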

            Comment


            • #46
              Originally posted by bug77 View Post
              The way it's done today, a 4k stream only needs double the bit rate of FHD (because if the pixels are smaller, why not crush more details?).
              That's a bullshit assumption right there, and it's one of the reasons I said the network won't hold true 4k.

              It may be somewhat correct for PC where it seems like 21'' 4k monitors are a thing, but it is NOT correct for midrange and higher-end 4k TVs.

              Your pixels aren't smaller, your screen is larger with a similar PPI.

              Comment


              • #47
                Originally posted by LoveRPi View Post
                Zoom has its own video decoder/encoder path by default.
                Really? That must be why the framerate is so slow on my mom's 12-year-old computer...

                Comment


                • #48
                  Originally posted by discordian View Post
                  I mean I hope EVC succeeds, but why?
                  I hope it doesn't. I feel like EVC will be a worse H.264, and probably a worse VP8.

                  Comment


                  • #49
                    Originally posted by Ipkh View Post
                    Nothing stops AMD from implementing a similar encode/decode path for their Compute cores.
                    Apathy stops them.

                    Comment


                    • #50
                      Originally posted by Orphis View Post

                      I mean, what do I know, I just work on the video pipeline of one of those softwares and reviewed the patches pertaining to HW acceleration...

                      Looks like you know my daily work better than I do! 🤯

                      Video codec people are a small circle; there are more senators than codec people worth their salt. Any decent internet/network video company knows it needs full control of the encoder-side tunables that run-of-the-mill encoders do not offer.

                      The video pipeline is mostly used for decoding, and any conformant decoder should accept any stream unless it uses an unsupported profile, like HP or SVC. You also run into bitstream quirks with hardware decoders, where sometimes it's less hassle to just handle the stream in software, unless you are Apple, with your own vertically integrated technologies and dedicated programmable controllers.

                      Comment
