AMD Lands VCN 3.0 Video Encode Support For Navi 2 / Sienna Cichlid

  • #11
    Originally posted by tildearrow View Post

    Where is 4:4:4?
    That too, added that: https://en.wikipedia.org/wiki/Video_Core_Next plus some other stuff I knew.



    • #12
      Originally posted by Laughing1 View Post

      That too, added that: https://en.wikipedia.org/wiki/Video_Core_Next plus some other stuff I knew.
      How do you know this? Do you work for AMD?



      • #13
        Originally posted by tildearrow View Post

        How do you know this? Do you work for AMD?
        Nah, I mostly just copied stuff that is already known and put a "?" on what might change.



        • #14
          Originally posted by tildearrow View Post
          Then what's the point of bumping the number if you did absolutely nothing?!
          We probably did do something... but the open source drivers are not how we announce things.

          Originally posted by tildearrow View Post
          Come on, could AMD speak on this matter? Why is documentation about new versions of VCE/VCN SO ELUSIVE?! D:
          Why would you expect documentation before we even announce the product? We're trying to get core driver support upstream before launch, and that means we need to reveal as little as possible about any new features in the initial code.

          You know the old saying about how "the art of taxation consists in so plucking the goose as to obtain the largest possible amount of feathers with the smallest possible amount of hissing"? It's the same here: we try to provide the largest possible amount of support with the smallest possible exposure of the new product.

          We are the Linux driver team, not the marketing team.
          Last edited by bridgman; 19 June 2020, 02:18 AM.



          • #15
            AMD graphics seem to be aimed at gamers, not people trying to get work done. A fair enough choice. Just keep it in mind next time you upgrade your graphics card. If you're trying to get work done, you're going to be using a blob no matter whose card you buy, so get one that actually does what you need to get done.



            • #16
              Originally posted by MadeUpName View Post
              AMD graphics seem to be aimed at gamers, not people trying to get work done. A fair enough choice. Just keep it in mind next time you upgrade your graphics card. If you're trying to get work done, you're going to be using a blob no matter whose card you buy, so get one that actually does what you need to get done.
              What about people like me who like watching movies exactly as much as playing games?
              What GPU should I buy when I want to enjoy both?
              And I care too much about my privacy and my time to use a blob.
              I intentionally chose AMD over Nvidia because of its high-quality open source drivers. I don't want to waste my time with a blob.
              Is it really that hard for them, beyond gaming, to also support watching movies at the best possible quality and performance, i.e. hardware decoding of the best codecs (H.264/H.265, VP9, AV1) at resolutions up to 8K and frame rates up to 120 FPS, possibly with HDR too?

              I don't know about others but I'm tired of hearing that this card doesn't support SR-IOV for virtualization because only the pro ones do or that this card doesn't support decoding high quality movies because it's mainly a gaming GPU.

              In my opinion, if it has the horsepower for 3D gaming, then it should be able to easily decode any 2D movie.
              They just need to work a bit on video playback too.
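              Back of the envelope, the raw throughput an 8K/120 FPS decode block would have to sustain is easy to sketch (my own illustrative figures; 10-bit 4:2:0 output is an assumption, not a spec for any real VCN block):

```python
# Rough raw-output throughput a hardware video decoder must sustain.
# Illustrative only; format assumptions (10-bit 4:2:0) are the author's.

def decode_throughput(width, height, fps, bits=10, samples_per_pixel=1.5):
    # samples_per_pixel = 1.5 corresponds to 4:2:0 subsampling:
    # one luma sample plus a quarter each of Cb and Cr per pixel.
    pixels_per_sec = width * height * fps
    bytes_per_sec = pixels_per_sec * samples_per_pixel * bits / 8
    return pixels_per_sec, int(bytes_per_sec)

# 8K (7680x4320) at 120 FPS, 10-bit 4:2:0
px, by = decode_throughput(7680, 4320, 120)
print(f"{px / 1e9:.2f} Gpixel/s, {by / 1e9:.2f} GB/s of decoded output")
```

              That works out to roughly 4 Gpixel/s (about 7.5 GB/s of raw frames), so decode capability at that tier is a real silicon cost, not a free checkbox.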



              • #17
                Originally posted by tildearrow View Post
                Then what's the point of bumping the number if you did absolutely nothing?!

                Ugh, AMD, what a disappointment. Seriously, Intel and NVIDIA have 4:4:4 encode and you still don't?! And your encoder is still the worst of them all?
                And why do you have to cripple encoding support in every new iteration of your encoding block? I mean, VCE 3.x slowed down H.264 encoding just to allow HEVC encoding to exist...
                Neither NVIDIA nor Intel ever had to do that.
                And when you did implement 4:4:4, it was only for I-frames (so not P-frames) and therefore too expensive in bandwidth...
                And even worse, it only lasted for the lifetime of VCE 2.0... after that, 4:4:4 encoding support went away!
                And it was so undocumented that I think only the PlayStation 4 ever used it, for wireless display!
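                For context on the bandwidth complaint: 4:4:4 carries twice the raw samples of 4:2:0, which is presumably why I-frame-only 4:4:4 was the cheap way out. A quick sketch (sample counts only; 8-bit samples and 1080p are picked arbitrarily for illustration):

```python
# Raw samples per frame under the common chroma subsampling modes.
# 8-bit (one byte per sample); 1080p chosen purely as an example.

def raw_frame_samples(width, height, subsampling):
    # Luma is always one sample per pixel; chroma varies with subsampling.
    chroma_per_pixel = {"4:2:0": 0.5, "4:2:2": 1.0, "4:4:4": 2.0}[subsampling]
    return int(width * height * (1 + chroma_per_pixel))

for mode in ("4:2:0", "4:2:2", "4:4:4"):
    print(mode, raw_frame_samples(1920, 1080, mode))  # 4:4:4 is 2x 4:2:0
```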

                Yeah, I have typed too many "and"s, but the truth is that it annoys me so much when AMD brings out a new version of their video encoder FOR NO FREAKING REASON AT ALL.

                Like why is your encoder so horrible? Do you even have a team for it or do you rely upon external IP for it? Really?

                I mean I know you may have the best open-source support (besides Intel), but sadly NVIDIA and Intel are still winning the hardware encoding race with their 8K HEVC 12-bit and even lossless predictive 4:4:4 support!
                Yeah I mean they may not have AV1 yet or whatever, but at least they have 4:4:4!

                And you know why I want 4:4:4 encoding?
                Because I record my screen at times, and I want pixel-perfect color when just recording the desktop.
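                To show what I mean: here's a toy round-trip (my own illustration, using full-range BT.601 coefficients) of what sharing one chroma sample between adjacent pixels does to a red/blue edge like you get all over a desktop UI:

```python
# Toy round-trip showing why chroma subsampling smears sharp desktop
# edges: two adjacent pixels (pure red, pure blue) are forced to share
# one averaged chroma pair, roughly what a 4:2:0/4:2:2 encoder does.
# Coefficients are full-range BT.601; the numbers are purely illustrative.

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return tuple(min(255, max(0, round(v))) for v in (r, g, b))

y1, cb1, cr1 = rgb_to_ycbcr(255, 0, 0)   # red pixel
y2, cb2, cr2 = rgb_to_ycbcr(0, 0, 255)   # blue pixel

# 4:4:4 keeps per-pixel chroma, so the round-trip comes back exact:
print(ycbcr_to_rgb(y1, cb1, cr1), ycbcr_to_rgb(y2, cb2, cr2))

# Subsampled: both pixels share the averaged chroma pair.
cb, cr = (cb1 + cb2) / 2, (cr1 + cr2) / 2
print(ycbcr_to_rgb(y1, cb, cr))   # no longer pure red
print(ycbcr_to_rgb(y2, cb, cr))   # no longer pure blue
```

                With per-pixel (4:4:4) chroma the colors survive; with shared chroma both pixels drift toward magenta, which is exactly the fringing you see on subsampled screen recordings.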

                Come on, could AMD speak on this matter? Why is documentation about new versions of VCE/VCN SO ELUSIVE?! D:
                The answer should be obvious: AMD has clearly decided that the only way to fight Intel successfully is to offer more cores at any given price point than Intel does. The problem is that there are few tasks outside of HPC that can really utilize high core counts; compiling code, 3D rendering and video encoding are pretty much the only non-HPC tasks that benefit from them.

                Because of this, AMD is not going to cannibalize its CPU sales by releasing GPUs that can do the job faster for a fraction of the cost. In fact, the only reason AMD even continues offering hardware encoding is that NVIDIA has invested so much of Intel's money* into hardware encoders.

                I would argue Intel feels the same way; they don't want to cannibalize their CPU sales either, but NVIDIA has forced their hand.

                *For those that don't know: back in the late 2000s, NVIDIA tried to enter the x86 CPU market but Intel stopped them. NVIDIA claimed that the license that let them make chipsets also allowed them to make CPUs; Intel said no, NVIDIA sued, and Intel paid them $1.5 billion over 5 years to settle the suit. NVIDIA turned around and invested that money in HPC, developing CUDA and eventually NVENC.

                The only one of the 3 that really has any incentive to bring a hardware encoder to market is NVIDIA, though Intel's QSV is quite good.

                For me the big question is who will bring a hardware AV1 encoder to market first and if I had to bet money I would say NVIDIA.



                • #18
                  Originally posted by MadeUpName View Post
                  AMD graphics seem to be aimed at gamers, not people trying to get work done. A fair enough choice. Just keep it in mind next time you upgrade your graphics card. If you're trying to get work done, you're going to be using a blob no matter whose card you buy, so get one that actually does what you need to get done.
                  Not many people get work done on GPUs. Contrary to popular belief fueled by the media, GPGPU, whether CUDA or OpenCL, is not that widespread, even though most tech sites make machine learning look like something even your grandma is using. Also, most people don't encode videos all day either. So AMD is fine. Stop whining.



                  • #19
                    Originally posted by sophisticles View Post
                    For me the big question is who will bring a hardware AV1 encoder to market first and if I had to bet money I would say NVIDIA.
                    I'd say Intel would be the first to implement a hardware AV1 decoder/encoder on the x86 platform.

                    They already support VP8/VP9, hence my speculation.



                    • #20
                      Originally posted by tildearrow View Post

                      I'd say Intel would be the first to implement a hardware AV1 decoder/encoder on the x86 platform.

                      They already support VP8/VP9, hence my speculation.
                      Yeah, but Intel barely supports VP8/VP9 encoding. Until just recently it was only available via VAAPI on Linux, and even that was a chore to get working properly, if at all (I'm talking about VP9). I've seen recent builds of ffmpeg on Linux that claim to support QSV VP9, but I never got it to work on a Kaby Lake CPU that supposedly supports hardware VP9 encoding.

                      Additionally, Intel, along with Netflix, seems to be heavily invested in SVT-AV1, so I don't have much hope that Intel will bring a hardware AV1 encoder to market any time soon.

