Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands
Originally posted by tildearrow
Really? This must be why the framerate is so slow on my mom's 12-year-old computer...
Originally posted by discordian
I'd guess that cameras will pick one format, and TVs will follow. I'd further guess that this will be H.266 or EVC, as hardware for realtime encoding is a design criterion for those.
VVC vs EVC in terms of risk: one is a superset of an ongoing licensing fiasco; the other is built even more defensively than AV1.
Originally posted by mikkl
Just to make it clear: Tiger Lake-U is going to support the Main profile of AV1, which means 8/10-bit 4:2:0. This might change with RKL-S next year or future Gen12 versions, but TGL-U won't support 12-bit 4:4:4, unlike VP9. However, the Main profile will be enough for the likes of YouTube; this is a potentially massive feature for any mobile device.
What I can't understand is the lack of 4:4:4 H.264 (at least!) encoding on AMD cards...
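For a sense of what the subsampling difference above actually means, here is a minimal sketch of uncompressed frame sizes. The function name and the 16-bit-storage assumption for >8-bit depths are illustrative, not from any particular decoder:

```python
# Sketch: raw frame sizes for the chroma formats discussed above.
# 4:2:0 stores the two chroma planes at quarter resolution (halved in each
# dimension); 4:4:4 stores them at full resolution, tripling the sample count
# relative to luma alone.

def frame_bytes(width, height, bit_depth, subsampling):
    """Approximate bytes per uncompressed frame (luma + two chroma planes)."""
    luma = width * height
    if subsampling == "4:2:0":
        chroma = 2 * (width // 2) * (height // 2)  # two quarter-size planes
    elif subsampling == "4:4:4":
        chroma = 2 * width * height                # two full-size planes
    else:
        raise ValueError(subsampling)
    bytes_per_sample = 1 if bit_depth == 8 else 2  # assume 16-bit storage above 8-bit
    return (luma + chroma) * bytes_per_sample

# 4K frame at 10-bit: AV1 Main profile (4:2:0) vs the 4:4:4 TGL-U lacks
main_profile = frame_bytes(3840, 2160, 10, "4:2:0")
full_chroma  = frame_bytes(3840, 2160, 10, "4:4:4")
print(main_profile, full_chroma, full_chroma / main_profile)
```

By this rough math a 4:4:4 frame carries exactly twice the raw data of a 4:2:0 frame at the same depth, which is part of why hardware vendors ship the 4:2:0 path first.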
Originally posted by tildearrow
AMD! Hurry up!
Look! Intel is beating you! What if they eventually end up bringing AV1 encode too?! (on top of 4:4:4 and their high-quality encoder)
...yeah, while you are stuck in:
Originally posted by mikkl
Just to make it clear: Tiger Lake-U is going to support the Main profile of AV1, which means 8/10-bit 4:2:0. This might change with RKL-S next year or future Gen12 versions, but TGL-U won't support 12-bit 4:4:4, unlike VP9. However, the Main profile will be enough for the likes of YouTube; this is a potentially massive feature for any mobile device.
Originally posted by Orphis
I mean, what do I know, I just work on the video pipeline of one of those applications and reviewed the patches pertaining to HW acceleration...
Looks like you know my daily work better than I do! 🤯
Originally posted by Ipkh
Nothing stops AMD from implementing a similar encode/decode path on their compute cores.
Originally posted by discordian
I mean, I hope EVC succeeds, but why?
Originally posted by LoveRPi
Zoom has its own video decoder/encoder path by default.
Originally posted by bug77
The way it's done today, a 4K stream only needs double the bitrate of FHD (because if the pixels are smaller, why not crush more detail?).
That may be somewhat correct for PCs, where 21'' 4K monitors seem to be a thing, but it is NOT correct for midrange and higher-end 4K TVs.
Your pixels aren't smaller; your screen is larger with a similar PPI.
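The bitrate arithmetic behind this exchange is easy to sketch. 4K has four times the pixels of FHD, so "only double the bitrate" means each 4K pixel gets half the bits. The 8 Mbps / 16 Mbps figures below are hypothetical round numbers, not from any real streaming service:

```python
# Sketch of the bitrate-vs-resolution arithmetic discussed above.

def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits spent on each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

# Hypothetical figures: an 8 Mbps FHD stream vs a 4K stream at
# "double the bitrate" (16 Mbps), both at 30 fps.
fhd = bits_per_pixel(8_000_000, 1920, 1080, 30)
uhd = bits_per_pixel(16_000_000, 3840, 2160, 30)
print(fhd / uhd)  # 2.0 -- each 4K pixel gets half the bits of an FHD pixel
```

Whether that halving is acceptable is exactly the point of contention: on a small high-PPI monitor the smaller pixels can hide it, but on a large TV with similar PPI it is simply less information per unit of screen area.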