Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands


  • tildearrow
    replied
    Originally posted by starshipeleven View Post
To be fair, a 12-year-old computer isn't going to have much in the way of hardware encoding acceleration.
    ...with a card from 2016.

  • starshipeleven
    replied
    Originally posted by tildearrow View Post
    Really? This must be why the framerate is so slow on my mom's 12-year-old computer...
    To be fair, a 12-year-old computer isn't going to have much in the way of hardware encoding acceleration.

  • andreano
    replied
    Originally posted by discordian View Post
    I'd guess that cameras will pick one format, and TVs will follow. I'd further guess that this will be H.266 or EVC, as hardware for real-time encoding is a design criterion for those.
    Network effects and path dependence: I agree about the principles, but cameras, meaning hardware encoders in price-sensitive consumer devices, are hardly the forefront of development. I would rather bet that software will always come first and more or less decide the game. This is where browsers come in.

    VVC vs EVC in terms of risk: One is a superset of an ongoing licensing fiasco; the other is more defensively built than even AV1.

  • tildearrow
    replied
    Originally posted by mikkl View Post


    Just to make it clear, Tiger Lake-U is going to support the Main profile of AV1, which means 8/10-bit 4:2:0. This might change with RKL-S next year or future Gen12 versions, but TGL-U won't support 12-bit 4:4:4, unlike VP9. However, the Main profile will be enough for the likes of YouTube; this is a potentially massive feature for any mobile device.
    I can understand that for now, considering AV1 is still fresh...

    What I can't understand is the lack of 4:4:4 H.264 (at least!) encoding on AMD cards...
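    For anyone curious which of these profiles their own machine actually exposes, here is a minimal sketch that asks VA-API whether AV1 Main-profile decode is available. It assumes a libva new enough to define VAProfileAV1Profile0 and a render node at /dev/dri/renderD128; both are assumptions that vary per system. (vainfo from libva-utils reports the same information.)

    ```c
    /* Minimal sketch: probe VA-API for AV1 Main-profile decode support.
     * Assumes libva with AV1 definitions and /dev/dri/renderD128.
     * Build: cc av1_probe.c -o av1_probe -lva -lva-drm
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <va/va.h>
    #include <va/va_drm.h>

    int main(void) {
        int fd = open("/dev/dri/renderD128", O_RDWR);
        if (fd < 0) { perror("open render node"); return 1; }

        VADisplay dpy = vaGetDisplayDRM(fd);
        int major = 0, minor = 0;
        if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "vaInitialize failed\n");
            close(fd);
            return 1;
        }

        /* List the profiles the driver advertises, look for AV1 Main
         * (8/10-bit 4:2:0), then confirm a decode (VLD) entrypoint. */
        int nprof = vaMaxNumProfiles(dpy);
        VAProfile *profiles = malloc(nprof * sizeof *profiles);
        if (profiles && vaQueryConfigProfiles(dpy, profiles, &nprof) == VA_STATUS_SUCCESS) {
            for (int i = 0; i < nprof; i++) {
                if (profiles[i] != VAProfileAV1Profile0)
                    continue;
                int nep = vaMaxNumEntrypoints(dpy);
                VAEntrypoint *eps = malloc(nep * sizeof *eps);
                if (eps && vaQueryConfigEntrypoints(dpy, VAProfileAV1Profile0,
                                                    eps, &nep) == VA_STATUS_SUCCESS) {
                    for (int j = 0; j < nep; j++)
                        if (eps[j] == VAEntrypointVLD)
                            puts("AV1 Main profile: hardware decode available");
                }
                free(eps);
            }
        }
        free(profiles);
        vaTerminate(dpy);
        close(fd);
        return 0;
    }
    ```

    On Gen12/Tiger Lake with a current media-driver build the AV1 entry should show up; on older generations it simply won't be listed.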

  • mikkl
    replied
    Originally posted by tildearrow View Post
    AMD! Hurry up!
    Look! Intel is beating you! What if they eventually end up bringing AV1 encode too?! (on top of 4:4:4 and their high-quality encoder)
    ...yeah, while you are stuck with:

    Just to make it clear, Tiger Lake-U is going to support the Main profile of AV1, which means 8/10-bit 4:2:0. This might change with RKL-S next year or future Gen12 versions, but TGL-U won't support 12-bit 4:4:4, unlike VP9. However, the Main profile will be enough for the likes of YouTube; this is a potentially massive feature for any mobile device.

  • LoveRPi
    replied
    Originally posted by Orphis View Post

    I mean, what do I know, I just work on the video pipeline of one of those programs and reviewed the patches pertaining to HW acceleration...

    Looks like you know my daily work better than I do! 🤯
    Video codec people are a small circle; there are more senators than codec people worth their salt. Any decent internet/network video company knows it needs full control of tunables on the encoder side that run-of-the-mill encoders do not offer.

    The video pipeline is mostly used for decoding, and any conformant decoder should accept any stream unless it uses a profile that isn't supported, like High Profile or SVC. You also hit enough bitstream problems with hardware decoders that it's sometimes less hassle just to handle them in software, unless you are Apple, with your own vertically integrated technologies and dedicated programmable controllers.
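    The "less hassle to handle it in software" point maps to a pattern most Linux players use. Below is a minimal sketch of it with FFmpeg's libavcodec/libavutil, assuming a recent FFmpeg; the try-VA-API-then-fall-back structure is the point, not the error handling.

    ```c
    /* Minimal sketch: try hardware (VA-API) decode, fall back to software.
     * Build: cc hw_fallback.c $(pkg-config --cflags --libs libavcodec libavutil)
     */
    #include <stdio.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/hwcontext.h>

    /* get_format callback: prefer the VA-API surface format if the decoder
     * offers it, otherwise take the first (software) format in the list. */
    static enum AVPixelFormat pick_fmt(AVCodecContext *ctx,
                                       const enum AVPixelFormat *fmts) {
        for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
            if (*p == AV_PIX_FMT_VAAPI)
                return *p;
        return fmts[0];
    }

    static AVCodecContext *open_decoder(const AVCodec *codec) {
        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        if (!ctx) return NULL;

        /* If a VA-API device can't be created (old GPU, missing driver,
         * unsupported profile), hw_device_ctx stays unset and the decoder
         * simply runs in software. */
        AVBufferRef *hw = NULL;
        if (av_hwdevice_ctx_create(&hw, AV_HWDEVICE_TYPE_VAAPI, NULL, NULL, 0) == 0) {
            ctx->hw_device_ctx = av_buffer_ref(hw);
            ctx->get_format = pick_fmt;
            av_buffer_unref(&hw);
        }

        if (avcodec_open2(ctx, codec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }

    int main(void) {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_AV1);
        if (!codec) { fprintf(stderr, "FFmpeg built without an AV1 decoder\n"); return 1; }

        AVCodecContext *ctx = open_decoder(codec);
        if (!ctx) return 1;
        puts(ctx->hw_device_ctx ? "decode path: VA-API hardware"
                                : "decode path: software");
        avcodec_free_context(&ctx);
        return 0;
    }
    ```

    A real player would also reopen the codec without hw_device_ctx when the hardware rejects a problematic bitstream mid-stream, which is exactly the hassle described above.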

  • MadeUpName
    replied
    Originally posted by Ipkh View Post
    Nothing stops AMD from implementing a similar encode/decode path for their compute cores.
    Apathy stops them.

  • tildearrow
    replied
    Originally posted by discordian View Post
    I mean I hope EVC succeeds, but why?
    I hope it doesn't. I feel like EVC will be a worse H.264, and probably a worse VP8.

  • tildearrow
    replied
    Originally posted by LoveRPi View Post
    Zoom has its own video decoder/encoder path by default.
    Really? This must be why the framerate is so slow on my mom's 12-year-old computer...

  • starshipeleven
    replied
    Originally posted by bug77 View Post
    The way it's done today, a 4K stream only needs double the bit rate of FHD (because if the pixels are smaller, why not crush more detail?).
    That's a bullshit assumption right there, and it's one of the reasons I said the network won't hold true 4K.

    It may be somewhat correct for PCs, where 21'' 4K monitors seem to be a thing, but it is NOT correct for midrange and higher-end 4K TVs.

    Your pixels aren't smaller; your screen is larger with a similar PPI.
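    To put a number on "only double the bitrate", here is a back-of-the-envelope check; the 8 and 16 Mbit/s figures and 30 fps are illustrative assumptions, not measured stream rates:

    ```c
    /* Back-of-the-envelope: if 4K gets only double the FHD bitrate, each
     * pixel gets half the bits. Rates and framerate are assumed, not measured. */
    #include <stdio.h>

    int main(void) {
        const double fps     = 30.0;
        const double fhd_px  = 1920.0 * 1080.0;  /* 2,073,600 pixels          */
        const double uhd_px  = 3840.0 * 2160.0;  /* 8,294,400 pixels: 4x FHD  */
        const double fhd_bps = 8e6;              /* assumed FHD stream        */
        const double uhd_bps = 16e6;             /* "only double" 4K stream   */

        printf("FHD: %.4f bits per pixel per frame\n", fhd_bps / (fhd_px * fps));
        printf("4K : %.4f bits per pixel per frame\n", uhd_bps / (uhd_px * fps));
        /* Prints ~0.1286 vs ~0.0643: the 4K stream spends half as many bits
         * per pixel, which is exactly the "crushed detail" being argued about. */
        return 0;
    }
    ```

    Codec efficiency does improve somewhat at higher resolutions, but not enough to make double the bitrate cover four times the pixels at the same quality.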
