Intel Gen12/Xe Graphics Have AV1 Accelerated Decode - Linux Support Lands
Originally posted by starshipeleven:
"Does it compress 1080p significantly better? Because if it only compresses better at high resolutions, then it's situational."

Last edited by bug77; 10 July 2020, 10:24 AM.
Originally posted by bug77:
"I don't know what it does to FHD (and neither do you, so you can't proclaim it DoA)."

Still, by the time the codec becomes more mainstream, so will 4K (even if it still won't be the majority of streams).
Originally posted by starshipeleven:
"4K streams will become mainstream only with much better internet access, even with H.266, as the starting media is larger."
Anyone mentioning bit depth? The lowest AV1 profile mandates 10-bit (or is it 12-bit?) support, so this should finally mean decent support for that!
For those who may not know, dav1d still has no optimizations for high bit depth (except on ARM), which is probably why it isn't being used yet. At best, any software implementation will have half the SIMD throughput, whereas widening a signal path is what hardware does best.
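To illustrate the halved-throughput point, here is a hypothetical back-of-the-envelope sketch (not dav1d's actual code): 10-bit samples don't fit in 8-bit SIMD lanes, so they are typically widened to 16-bit lanes, which halves the number of samples processed per vector instruction. The 128-bit register width below is illustrative (e.g. SSE2 or NEON).

```python
# Back-of-the-envelope illustration only: lanes per 128-bit SIMD register.
# Not taken from any codebase; register width chosen for the example.
REG_BITS = 128

lanes_8bit = REG_BITS // 8    # 8-bit content: 16 samples per operation
lanes_16bit = REG_BITS // 16  # 10-bit content widened to 16-bit lanes: 8 samples

print(lanes_8bit, lanes_16bit)       # 16 8
print(lanes_8bit / lanes_16bit)      # 2.0 -> half the per-instruction throughput
```

The same ratio holds for 256-bit AVX2 registers (32 vs 16 lanes), which is why a high-bit-depth software path pays roughly 2x per operation regardless of vector width.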
Originally posted by bug77:
"The way it's done today, a 4K stream only needs double the bit rate of FHD (because if the pixels are smaller, why not crush more details?)."

That may be somewhat correct for PCs, where 21'' 4K monitors seem to be a thing, but it is NOT correct for midrange and higher-end 4K TVs. Your pixels aren't smaller; your screen is larger with a similar PPI.
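To make the trade-off concrete, a hypothetical bit of arithmetic (the 2x bitrate figure comes from the quote above, not from any encoder documentation): 4K UHD has four times the pixels of 1080p, so doubling the bitrate halves the bits spent per pixel.

```python
# Illustrative arithmetic only. Pixel counts are the standard resolutions;
# the 2x bitrate ratio is the claim from the comment, not a measured value.
fhd_pixels = 1920 * 1080      # 2,073,600
uhd_pixels = 3840 * 2160      # 8,294,400 -> exactly 4x FHD

bitrate_ratio = 2.0           # "only needs double the bit rate"
bits_per_pixel_ratio = bitrate_ratio * fhd_pixels / uhd_pixels
print(bits_per_pixel_ratio)   # 0.5 -> each 4K pixel gets half the bits of FHD
```

Whether that halving is visible depends on viewing distance and screen size, which is exactly the PC-monitor vs. TV distinction made above.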
Originally posted by Orphis:
"I mean, what do I know, I just work on the video pipeline of one of those applications and reviewed the patches pertaining to HW acceleration..."

Looks like you know my daily work better than I do! 🤯