VDPAU Video Playback For The Radeon RX Vega On Linux


  • #21
    Originally posted by M@GOid View Post
    Didn't Nvidia already abandoned VDPAU? VAAPI already works with AMD cards, so people should look for that instead.
    You should in practice use vdpau with AMD currently for decode.
    ffmpeg vaapi decode leaks mem.
    mpv recently removed vaapi dec + opengl presentation (they want egl) so you have to use --vo=vaapi which isn't as feature full. It also uses ffmpeg so leaks.
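    A minimal sketch of the two decode paths described above. The option names are real mpv flags, but whether each path works depends on your driver stack and mpv build, and the file name is just a placeholder:

    ```shell
    # VDPAU decode and presentation (the path recommended above for AMD):
    mpv --hwdec=vdpau --vo=vdpau video.mkv

    # VA-API decode with the dedicated VA-API output driver
    # (the fallback left after mpv dropped the VA-API + OpenGL path):
    mpv --hwdec=vaapi --vo=vaapi video.mkv
    ```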

    Comment


    • #22
      Originally posted by Nille_kungen View Post
      Seems interesting, but is there any video player that supports, or plans to support, OMX decode?
      Last edited by Masush5; 15 August 2017, 08:12 PM.

      Comment


      • #23
        Originally posted by 89c51 View Post

        Is there any ETA for VP9 support? It was promised a long time ago by AMD - with a firmware update - but there has been no news since.
        Would enabling this via a firmware update even be possible? Doesn't video decode hardware use fixed-function blocks?

        Comment


        • #24
          Originally posted by M@GOid View Post

          So the GeForce 10xx series does not have video decode acceleration on Linux? Wow.
          Sort of. As I said before, Nvidia ended VDPAU development at the end of 2015, before the Pascal cards launched. They never made an official announcement, but the engineer who currently leads Nvidia's driver development gave a presentation at one of the GPU technology conferences in 2016 saying they were going to replace VDPAU with NVDECODE, in order to push CUDA for video decoding and to have a single, high-performance API across multiple platforms.

          Above 1080p, VDPAU is pretty much non-functional: it throws errors and offloads decoding to the CPU, because the VDPAU software was never properly finished for 4K content, so the decoders can't recognize and decode UHD streams. Oddly enough, Nvidia has continued to add VDPAU feature sets E and F to the drivers - or so the release notes say - despite no active VDPAU development. I've collected vdpauinfo reports from all the recent cards, and none of them does proper VDPAU decoding above 1080p with any driver.

          Nobody has implemented NVDECODE in media players yet, but its predecessor, CUVID - which is just one part of the NVDECODE functions - is enabled in mpv and can be used with the latest FFmpeg 3.x builds. From what I've read and from asking around, the VLC devs don't want to implement it at this time; they are focusing on VLC for Windows and Android and think VDPAU is good enough. And since distros like Ubuntu aren't shipping the 3.x builds of FFmpeg, support won't arrive until they move up to the newer versions.
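          For reference, a sketch of how the CUVID path mentioned above can be exercised. These are real FFmpeg/mpv option names, but they assume an FFmpeg 3.x build compiled with CUVID support plus Nvidia's proprietary driver stack, and the input file is a placeholder:

          ```shell
          # Decode H.264 through the CUVID decoder in FFmpeg, discarding the output:
          ffmpeg -c:v h264_cuvid -i input.mkv -f null -

          # Select the CUVID hwdec backend in mpv:
          mpv --hwdec=cuvid input.mkv
          ```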

          Comment


          • #25
            Michael, can you also post vainfo output please? That would show what VA-API functionality is exposed.

            And like others wondering above, what's going on with VPx? Is it a firmware shortcoming, or does the hardware actually have no VPx decoder?
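            For anyone wanting to check this locally, both query tools are standard utilities; a VP9 hardware profile would show up as a VAProfileVP9* entry in vainfo's output, assuming the Mesa driver exposes it:

            ```shell
            # List VA-API profiles and entrypoints exposed by the driver:
            vainfo

            # List VDPAU decoder capabilities per codec/level for comparison:
            vdpauinfo
            ```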

            Comment


            • #26
              AMD uses the video decoding functions and GPU shaders to decode VP9, a hybrid process. There is no hardware decoder on their ASIC at this time.

              Comment


              • #27
                Originally posted by TheLexMachine View Post
                AMD uses the video decoding functions and GPU shaders to decode VP9, a hybrid process. There is no hardware decoder on their ASIC at this time.
                Why not? Intel has supported VPx in hardware for several years already. And does libva / Mesa support that hybrid process?

                Comment


                • #28
                  Originally posted by shmerl View Post

                  Why not? Intel has supported VPx in hardware for several years already. And does libva / Mesa support that hybrid process?
                  Incorrect. Intel has been using hybrid decoding for years, because Google had no reference decoder hardware design until 2014, which made hardware support difficult as it takes time to integrate it into GPU designs. VP9 was released in 2012. It wasn't until Kaby Lake that they got full hardware decoding of VP9.

                  Comment


                  • #29
                    Originally posted by TheLexMachine View Post

                    Incorrect. Intel has been using hybrid decoding for years, because Google had no reference decoder hardware design until 2014, which made hardware support difficult as it takes time to integrate it into GPU designs. VP9 was released in 2012. It wasn't until Kaby Lake that they got full hardware decoding of VP9.
                    So what stopped AMD from doing it in Vega? Was it designed before Kaby Lake?

                    Comment


                    • #30
                      Originally posted by shmerl View Post

                      So what stopped AMD from doing it in Vega? Was it designed before Kaby Lake?
                      Ask Sir Bridgeman. I think it was probably the fact that AMD was in a bit of a crisis and had to focus on the important things. Technically speaking, AMD sold all the video decoder stuff to Broadcom back in 2006, so they lost the personnel and R&D that ATI had built up over the years. Had they kept it, I think they would have a much better IP portfolio and the resources to handle today's multimedia landscape. Back then, ATI cards were very much considered the go-to choice for XP Media Center PCs, and they had the whole CableCARD thing going for them too, at least until AMD cut that off.

                      Comment
