Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands


  • andreano
    replied
    Anyone mentioning bit depth? Even the lowest AV1 profile mandates 10-bit support (12-bit only comes with the Professional profile), so this should finally mean decent support for that!

    For those who may not know, dav1d still has no high-bit-depth SIMD optimizations (except on ARM), which is probably why it isn't being used yet. At best, any software implementation will have halved SIMD throughput, whereas widening a signal path is what hardware does best.
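
    If you want to see what your own driver actually exposes, here is a minimal sketch of mine (untested; it assumes libva/libva-drm and a render node at /dev/dri/renderD128) that asks VA-API which AV1 decode profiles are advertised:

    Code:
    /* Minimal sketch: ask the VA-API driver which AV1 decode profiles it exposes.
     * Assumes libva + libva-drm and a render node at /dev/dri/renderD128.
     * Build (roughly): gcc av1_probe.c -o av1_probe -lva -lva-drm
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <va/va.h>
    #include <va/va_drm.h>

    int main(void)
    {
        int fd = open("/dev/dri/renderD128", O_RDWR); /* adjust the node if needed */
        if (fd < 0) { perror("open"); return 1; }

        VADisplay dpy = vaGetDisplayDRM(fd);
        int major, minor;
        if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "vaInitialize failed\n");
            return 1;
        }

        int num_profiles = vaMaxNumProfiles(dpy);
        VAProfile profiles[num_profiles];
        vaQueryConfigProfiles(dpy, profiles, &num_profiles);

        for (int i = 0; i < num_profiles; i++) {
            if (profiles[i] != VAProfileAV1Profile0 && profiles[i] != VAProfileAV1Profile1)
                continue;
            /* VAEntrypointVLD means fixed-function decode is exposed for this profile. */
            int num_ep = vaMaxNumEntrypoints(dpy);
            VAEntrypoint eps[num_ep];
            vaQueryConfigEntrypoints(dpy, profiles[i], eps, &num_ep);
            for (int j = 0; j < num_ep; j++)
                if (eps[j] == VAEntrypointVLD)
                    printf("AV1 profile %d: hardware decode advertised\n", (int)profiles[i]);
        }

        vaTerminate(dpy);
        close(fd);
        return 0;
    }

    vainfo from libva-utils prints the same profile/entrypoint list if you'd rather not compile anything.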



  • bug77
    replied
    Originally posted by starshipeleven View Post
    4k streams will become mainstream only with much better internet access, even with h266, as the starting media is larger.
    The way it's done today, a 4k stream only needs double the bit rate of FHD (because if the pixels are smaller, why not crush more details?). If you can make that 1.5x the bit rate while also upping the quality a bit, that will make 4k more accessible.
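
    To put rough numbers on that (a quick sketch of mine; the 8 Mbps FHD baseline and 30 fps are assumed values for illustration, not anything a streaming service publishes): at double the bit rate, a 4k stream already gets only half the bits per pixel of FHD, so the newer codec only has to close that gap.

    Code:
    /* Bits-per-pixel comparison behind the "double the bit rate for 4x the pixels"
     * rule of thumb. The 8 Mbps FHD baseline and 30 fps are assumed values.
     */
    #include <stdio.h>

    int main(void)
    {
        const double fhd_pixels = 1920.0 * 1080.0;
        const double uhd_pixels = 3840.0 * 2160.0; /* 4x the pixels of FHD */
        const double fps = 30.0;
        const double fhd_mbps = 8.0; /* assumed FHD stream bit rate */

        const double rates[] = { 2.0, 1.5 }; /* today's rule of thumb vs. the hoped-for case */

        printf("FHD       : %.3f bits/pixel\n", fhd_mbps * 1e6 / (fhd_pixels * fps));
        for (int i = 0; i < 2; i++)
            printf("4k @ %.1fx : %.3f bits/pixel\n", rates[i],
                   rates[i] * fhd_mbps * 1e6 / (uhd_pixels * fps));
        return 0;
    }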



  • starshipeleven
    replied
    Originally posted by bug77 View Post
    I don't know what it does to FHD (and neither do you, so you can't proclaim it DoA).
    Let's call it a "wild stab in the dark", given how it went with h265.

    Still, by the time the codec becomes more mainstream, so will 4k (even if it still won't be the majority of streams).
    4k streams will become mainstream only with much better internet access, even with h266, as the starting media is larger.



  • bug77
    replied
    Originally posted by starshipeleven View Post
    Does it compress 1080p significantly better? Because if it compresses better only if you have a large resolution then it's situational.
    I don't know what it does to FHD (and neither do you, so you can't proclaim it DoA). Still, by the time the codec becomes more mainstream, so will 4k (even if it still won't be the majority of streams).
    Last edited by bug77; 10 July 2020, 10:24 AM.



  • edwaleni
    replied
    Originally posted by tildearrow View Post
    AMD! Hurry up!
    Look! Intel is beating you! What if they eventually end up bringing AV1 encode too?! (on top of 4:4:4 and their high-quality encoder)
    ...yeah, while you are stuck in:

    - No 4:4:4
    - H.264 encode slower than HEVC
    - Lowest quality encoder in the market
    - No VP9 encode and not even decode (except latest chips)
    - Not even AV1 decode

    At the end I will be buying Xe if it turns out to fare well against my current AMD card...
    Shhh! Don't distract them. They (AMD) need to focus on profits, and those are in server CPUs. Once the profits arrive, they can write as much video processing software as anyone wants.



  • vladpetric
    replied
    Originally posted by cl333r View Post
    Isn't this part of AMD's eternal problem of having worse software (but good or better hw) than its competition?
    Some things don't change in ~20 years (back when they were ATI). So yes, eternal problem indeed.



  • starshipeleven
    replied
    Originally posted by bug77 View Post
    What do you mean "very useful"?
    Does it compress 1080p significantly better? Because if it compresses better only if you have a large resolution then it's situational.

    I'm thinking here about Netflix or Amazon being able to stream 4k without crushing details into oblivion.
    Even with better compression, a 4k stream is very heavy for most of the world's network infrastructure.



  • Orphis
    replied
    Originally posted by bug77 View Post
    On the other hand, seeing how "widespread" AV1 is today, we don't need to worry about h266 for a while. It will all come down to a few years of Michael benchmarking improvements of h266 encoders.
    Quite the opposite! You need to worry about it (and the upcoming AV2) like yesterday. There is a huge latency between the time a new codec is standardized and the time it arrives in silicon and becomes usable by most low power devices. If you wait too long, you'll never be able to meet the market demand.



  • bug77
    replied
    Originally posted by starshipeleven View Post
    Unless it is very useful for 1080p (and I doubt it), I really doubt it will see much use. I mean, yeah, they will add it to their infrastructure in some capacity, but who is watching an 8k stream anyway? Even with better compression it's far too large for most internet infrastructure.
    What do you mean "very useful"? What is more useful for a content provider than being able to lower the bit rate without affecting quality, thus lowering their bills?
    I'm thinking here about Netflix or Amazon being able to stream 4k without crushing details into oblivion.

    On the other hand, seeing how "widespread" AV1 is today, we don't need to worry about h266 for a while. It will all come down to a few years of Michael benchmarking improvements of h266 encoders.



  • Orphis
    replied
    Originally posted by LoveRPi View Post

    Zoom has its own video decoder / encoder path by default.
    Skype has its own video decoder / encoder path by default.
    Chrome is the only one that uses GPU decode acceleration, and it uses its own video encode path.
    I mean, what do I know? I just work on the video pipeline of one of those applications and reviewed the patches pertaining to HW acceleration...

    Looks like you know my daily work better than I do! 🤯
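
    For what it's worth, the application-side check for a hardware AV1 decode path looks roughly like this (a sketch against FFmpeg's libavcodec, not a claim about Zoom's, Skype's, or Chrome's actual code):

    Code:
    /* Sketch: probe whether a hardware AV1 decode path is available via FFmpeg's
     * libavcodec before falling back to a software decoder. Assumes FFmpeg >= 4.0.
     * Build (roughly): gcc av1_hw.c -o av1_hw -lavcodec -lavutil
     */
    #include <stdio.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/hwcontext.h>

    int main(void)
    {
        const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_AV1);
        if (!dec) {
            printf("this libavcodec has no AV1 decoder at all\n");
            return 0;
        }

        for (int i = 0;; i++) {
            const AVCodecHWConfig *cfg = avcodec_get_hw_config(dec, i);
            if (!cfg)
                break; /* ran out of hardware configurations */
            if (cfg->methods & AV_CODEC_HW_CONFIG_METHOD_HW_DEVICE_CTX)
                printf("AV1 hardware decode possible via %s\n",
                       av_hwdevice_get_type_name(cfg->device_type));
        }
        return 0;
    }

    Whether a given application then actually takes the hardware path is a separate product decision, which is rather the point being made here.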

