AMD Lands AV1 Decode For Radeon RX 6000 Series In Mesa


  • flashmozzg
    replied
    Originally posted by microcode View Post

    Well, I don't know that it's ironic exactly, but decoding AV1 is mostly doable on common CPUs, particularly the ones likely to be paired with a brand new discrete GPU, whereas encoding in realtime is essentially off the table right now for most hardware.
    For 1080p@30 fps, sure. For 1080p@60 fps, mostly (as long as you have AVX2 you'll have no problems with it). But something like 4K@60 fps can be a real challenge even for current top CPUs. Still not THAT big of a problem for a powerful desktop, but it's crucial for laptops.

    Also, HW decoding means possible support for DRM which is crucial for streaming in HD.
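
    The resolution gap above is easy to quantify: raw pixel throughput alone makes 4K@60 four times the work of 1080p@60, before AV1's per-pixel complexity is even considered. A quick sketch (resolutions assumed from the post above):

```python
# Rough pixel-throughput comparison; real decode cost also depends on
# bitstream complexity, not just raw pixel rate.
def pixel_rate(width, height, fps):
    return width * height * fps

r_1080p60 = pixel_rate(1920, 1080, 60)
r_4k60 = pixel_rate(3840, 2160, 60)
print(r_4k60 / r_1080p60)  # 4.0
```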

  • Drago
    replied
    Originally posted by agd5f View Post

    It was added to VA-API eventually, just not in time for when we needed it. VA-API is an Intel managed API.
    Thanks, that makes sense. It's a pity that Intel and Nvidia are pulling everything into their own corners. Only AMD seems to care about standards.
    Can you disclose whether Cezanne is going to support AV1 decode? I have committed to a Renoir desktop and am hoping I can get AV1 with only a CPU upgrade. Thanks.
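
    One way to check whether a given stack actually exposes AV1 decode through VA-API is to look for `VAProfileAV1Profile0` with a `VAEntrypointVLD` entrypoint in `vainfo` output. A minimal sketch of such a check - the sample output below is fabricated for illustration:

```python
# Hypothetical check on vainfo output for an AV1 decode entrypoint.
sample_vainfo = """\
vainfo: Supported profile and entrypoints
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileAV1Profile0            : VAEntrypointVLD
"""

def supports_av1_decode(vainfo_output):
    # AV1 decode shows up as profile AV1Profile0 with the VLD entrypoint.
    return any("VAProfileAV1Profile0" in line and "VAEntrypointVLD" in line
               for line in vainfo_output.splitlines())

print(supports_av1_decode(sample_vainfo))  # True
```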

  • Nille_kungen
    replied
    Does this work with Tizonia?

  • OneTimeShot
    replied
    Originally posted by piotrj3 View Post
    Thing is, normal users tend to be on the rather "fast" side (x265) or the really fast side (hardware encoders in cameras, x264, NVENC, etc.); the fact that a phone costs 20-40 cents more makes pretty much no difference, while streaming/sharing companies tend to be on the slow side (plus totally royalty-free).
    Normal users don't care about any of this. They take whatever file comes out of their camera and upload it to YouTube. If they get any choices, in my experience they'll usually pick MPEG4 or AVI. We're well past the point where codec choice matters to the average user - that's why the codec wars are actually already over...

  • DanL
    replied
    Originally posted by pal666 View Post
    i'm pretty sure it's used by default video player on gnome
    So again, it's irrelevant.

  • chithanh
    replied
    Originally posted by pal666 View Post
    i'm pretty sure it's used by default video player on gnome
    But not many people use that for watching videos. Most AV1 content that people are going to encounter is streamed via the browser. And Firefox switched away from gstreamer a while back.

  • piotrj3
    replied
    Originally posted by brad0 View Post

    tl;dr: rambling nonsense. Too many assumptions. You're wrong.
    That is what forums are for: rambling, discussion, assumptions, maybe predictions. Saying "you're wrong" without a single argument, a link, or anything else makes your post useless.

    I have actually tested tons of hardware encoders, from Intel QSV to Nvidia NVENC. The common thread is that from an efficiency point of view they all suck: even the best Nvidia Turing NVENC is, at best, on par with x264 medium... In 2020 that is not impressive at all. You might consider some hybrid approach (a bit of software, a bit of hardware encoding), but the only tools I think seriously employ a hybrid approach with decent efficiency are Adobe's, and there benchmarks showed Intel gaining maybe 10-20% more performance than it should have - and that assumes the full software path is as efficient as the hybrid path, which I doubt.

    Hardware encoding is mostly for low-power devices or for keeping system resource usage low. AV1 encoding is neither.
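
    The matched-quality comparison described above can be made concrete by interpolating each encoder's bitrate at a target quality score - the idea behind BD-rate. A sketch; every (bitrate, VMAF) point below is invented for illustration, not a measurement:

```python
# Interpolate the bitrate an encoder needs to hit a target quality score.
def bitrate_at_quality(points, target):
    """Piecewise-linear interpolation of bitrate at a target quality."""
    pts = sorted(points, key=lambda p: p[1])
    for (b0, q0), (b1, q1) in zip(pts, pts[1:]):
        if q0 <= target <= q1:
            t = (target - q0) / (q1 - q0)
            return b0 + t * (b1 - b0)
    raise ValueError("target quality outside measured range")

# Invented sample points: (bitrate_kbps, VMAF score).
x264_medium = [(2000, 88.0), (4000, 93.0), (8000, 96.0)]
hw_encoder  = [(2000, 84.0), (4000, 90.0), (8000, 94.5)]

b_sw = bitrate_at_quality(x264_medium, 93.0)   # 4000.0
b_hw = bitrate_at_quality(hw_encoder, 93.0)    # ~6666.7
print(f"hardware encoder needs {b_hw / b_sw:.2f}x the bitrate at VMAF 93")
```

    With these made-up numbers the hardware encoder needs about 1.67x the bitrate to match x264 medium; a real comparison would sweep many clips and quality targets.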

  • brad0
    replied
    Originally posted by piotrj3 View Post
    The point is that normal users honestly don't care about AV1 much. From my perspective - an average streamer or any normal home user - x265 is as royalty-free as AV1, and for personal use you are free to use HEVC hardware encoders as well. The problem starts when you want to develop commercial products; that is the moment you have to start paying royalties. Besides, x264/x265 are actually general-purpose encoders and, thanks mostly to the VideoLAN developers, largely represent normal users. Sure, HEVC has some patents, but again they don't concern you.

    The thing is, most users try to reach a tolerable encoding time with the best efficiency possible. That is what you want when you stream, and what you want when you encode your own videos so they aren't too big. But at best you will watch your own video maybe 100 times (and share it with a few friends), and even at 100 views, an encoding time of 10 hours instead of 1 hour for a 5-10% difference in size is just not worth it. So encoding complexity does concern the home user a lot.

    For AV1, encoding is way too slow - even fully software x265 is miles faster - while the potential gains are hard to justify unless you really share your video an insane number of times. That is because AV1 represents the interests of people who do share video an insane number of times, and do it commercially, so for them royalty fees suddenly are an issue.

    It is hard to design a fully flexible codec that is great at both fast and slow presets. Normal users tend to be on the rather "fast" side (x265) or the really fast side (hardware encoders in cameras, x264, NVENC, etc.); the fact that a phone costs 20-40 cents more makes pretty much no difference, while streaming/sharing companies tend to be on the slow side (plus totally royalty-free). Also, AV1 has to dodge all the x264/x265 and other patents, while H.264/HEVC can mostly take advantage of them for speed.

    At least AV1 has been designed to have acceptable decoding times, but again that is mostly a concern of the big companies, so that no one complains the video stutters.

    There is a similar story with x264/x265 vs. VP9: sure, VP9 sounds free and great, but if you do any in-house testing of quality/encoding time/size, you realize you should never use VP9 at home.
    tl;dr: rambling nonsense. Too many assumptions. You're wrong.

  • piotrj3
    replied
    Originally posted by brad0 View Post

    Sure, if you redefine "normal users" to mean something it doesn't, and only within specific parameters. That's not how video codecs work. This is the same kind of misguided argument I hear from fools who try to defend crappy cable and its limited upstream capacity with "only 1% of 1% upload anything, so no one needs more". That's not how things work. No clue.
    The point is that normal users honestly don't care about AV1 much. From my perspective - an average streamer or any normal home user - x265 is as royalty-free as AV1, and for personal use you are free to use HEVC hardware encoders as well. The problem starts when you want to develop commercial products; that is the moment you have to start paying royalties. Besides, x264/x265 are actually general-purpose encoders and, thanks mostly to the VideoLAN developers, largely represent normal users. Sure, HEVC has some patents, but again they don't concern you.

    The thing is, most users try to reach a tolerable encoding time with the best efficiency possible. That is what you want when you stream, and what you want when you encode your own videos so they aren't too big. But at best you will watch your own video maybe 100 times (and share it with a few friends), and even at 100 views, an encoding time of 10 hours instead of 1 hour for a 5-10% difference in size is just not worth it. So encoding complexity does concern the home user a lot.

    For AV1, encoding is way too slow - even fully software x265 is miles faster - while the potential gains are hard to justify unless you really share your video an insane number of times. That is because AV1 represents the interests of people who do share video an insane number of times, and do it commercially, so for them royalty fees suddenly are an issue.

    It is hard to design a fully flexible codec that is great at both fast and slow presets. Normal users tend to be on the rather "fast" side (x265) or the really fast side (hardware encoders in cameras, x264, NVENC, etc.); the fact that a phone costs 20-40 cents more makes pretty much no difference, while streaming/sharing companies tend to be on the slow side (plus totally royalty-free). Also, AV1 has to dodge all the x264/x265 and other patents, while H.264/HEVC can mostly take advantage of them for speed.

    At least AV1 has been designed to have acceptable decoding times, but again that is mostly a concern of the big companies, so that no one complains the video stutters.

    There is a similar story with x264/x265 vs. VP9: sure, VP9 sounds free and great, but if you do any in-house testing of quality/encoding time/size, you realize you should never use VP9 at home.
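
    The 1-hour-vs-10-hours point above is easy to put into numbers: compare the extra encoding time against the transfer time the smaller file saves. All figures below are illustrative assumptions:

```python
# Back-of-envelope: extra encode time vs. transfer time saved by a smaller file.
fast_encode_hours = 1.0
slow_encode_hours = 10.0
size_saving = 0.075          # assume the slow encode is ~7.5% smaller
file_gb = 2.0
views = 100                  # total times the video is watched or shared
transfer_gb_per_hour = 10.0  # assumed effective transfer rate

extra_encode_hours = slow_encode_hours - fast_encode_hours  # 9.0
transfer_hours_saved = views * file_gb * size_saving / transfer_gb_per_hour
print(extra_encode_hours, transfer_hours_saved)
```

    Under these assumptions the slow encode costs 9 extra hours to save about 1.5 hours of transfer - consistent with the claim that encoding complexity matters to home users.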
    Last edited by piotrj3; 18 November 2020, 02:09 AM.

  • zanny
    replied
    Originally posted by microcode View Post

    Well, I don't know that it's ironic exactly, but decoding AV1 is mostly doable on common CPUs, particularly the ones likely to be paired with a brand new discrete GPU, whereas encoding in realtime is essentially off the table right now for most hardware.
    Which might give you a hint that the silicon cost of a viable, actually useful AV1 hardware encoder is going to be huge. Hardware encoders are getting harder to justify graduating to these newer codecs, because the fixed-function logic required to make the resulting bitstream even remotely comparable in encode efficiency to software implementations gets more and more complex. I honestly think we are already at the point where a discrete PCI-E hardware encoder expansion card just for AV1 makes more sense than integrating it into the VCEs of modern GPUs.

    As it is, most hardware h265 implementations are, at their best, only comparable to x264 at a fairly high CRF, especially when targeting anything above 1080p/60 fps. Given AV1's substantially increased complexity, a hardware encoder whose compression efficiency is only comparable to 15-year-old software tech is largely a waste of time. But I have a hard time believing anyone will spend that much discrete silicon on video coding to make a truly compelling encoder for consumer hardware any time soon.
