Hi10P Support Proposed For VDPAU


  • #16
    So is this something that uses shaders / stream processors to do the decoding? I thought that was less efficient than just using the CPU?

    I'm not trying to be funny here; it's just that hi10p encodes interest me, and my understanding was that no hardware anywhere could do hi10p decoding yet: ARM, AMD, NVIDIA, mobile, Qualcomm, Apple, supercomputers, whatever.



    • #17
      This code passes it to the hw, shaders are not used.



      • #18
        Originally posted by curaga View Post
        This code passes it to the hw, shaders are not used.
        So what on the hw decodes it?



        • #19
          I mentioned this on the #mpv-player Freenode channel earlier today, and JEEB (one of the admins on the CCCP forums) had this to say:
          <aphirst> have you guys already discussed this today? http://lists.freedesktop.org/archive...er/047084.html
          <aphirst> the claim is the possibility to decode Hi10P using VDPAU on AMD graphics hardware, thanks to some state tracker stuff implemented for the sake of OpenMAX
          * sL1pKn07 is now known as sL1p|OuT
          <JEEB> aphirst, isn't that just basically adding support to the framework to note Hi10P stuff as such?
          <JEEB> so that if some kind of hardware that supports it comes around it could be handled, and otherwise just thrown away
          <JEEB> decoding in this case most probably means "parse" in this case methinks
          <JEEB> yeah, seems so
          <JEEB> I would be very surprised if anything on the end-user level of things would have Hi10P decoding support
          <JEEB> because the only hardware I actually know that can handle it costs a LOT
          <JEEB> (and intra only in many cases, as it's meant purely for the "pro" use cases of high bit rate 10bit H.264)
          <aphirst> so you'd say that this, in fact, isn't going to give any tangible benefit
          <aphirst> (to 'typical' hardware owners, now)
          <JEEB> that patch has absolutely nothing to do with actual picture decoding
          <JEEB> it does let stuff actually distinguish between Hi10P and the other normally used H.264 profiles, though
          <aphirst> rightyho
          <aphirst> that clears that up for me
          <aphirst> it smelled a lot of 'too good to be true' but i sure as hell couldnt tell for myself
          * Oleg_ (~mythtv@pool-71-183-189-68.nycmny.east.verizon.net) has joined
          <aphirst> thanks mang
          <Oleg_> we need to type ./update if we wanna sync our local git copy with the latest updates?
          <JEEB> basically it seems like there was already a return value for Hi10P, but before now such stuff would have gotten output as "unknown stuff"
          <JEEB> and now it actually outputs "this is Hi10P AVC"
          <JEEB> and the second patch then adds the Hi10P value into the list of things "this is AVC, yo" somewhere else
          <JEEB> basically unknown -> Hi10P AVC
          <aphirst> what effect would this have on the way players fallback to software decoding of this content?
          <aphirst> does it simplify the logic anywhere?
          <aphirst> is it completely transparent?
          <JEEB> depends, but it does let whatever is calling this stuff get some extra info
          <JEEB> I have no idea what APIs these are that got changed
          <JEEB> and most players already check the profile on their side to begin with methinks ^^;
          <aphirst> so basically
          <aphirst> wow_its_fucking_nothing.jpg
          <JEEB> yes and no
          <JEEB> it's a useful change, since now the API gives some extra info
          <JEEB> it's a non-useful change for end users tho :P
          <aphirst> but its not a "groundbreaking" one
          <aphirst> well, extra info is good
          tl;dr he reckons nothing currently indicates that this actually involves offloading Hi10P decoding to current consumer GPU hardware

          What exactly _does_ this do, then?
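          For what it's worth, the "unknown -> Hi10P AVC" change JEEB describes is plain profile identification, the kind of mapping any H.264 parser does from the profile_idc field in the sequence parameter set. A minimal standalone sketch in C (this is an illustration, not the actual Mesa/VDPAU code; the function name is made up):

          ```c
          #include <stdio.h>

          /* Map an H.264 profile_idc (from the sequence parameter set) to a
           * human-readable name.  110 is High 10, i.e. Hi10P.  Per the IRC log
           * above, the patch's effect is essentially this: streams that used
           * to fall into the "unknown" bucket are now labelled as High 10. */
          static const char *h264_profile_name(int profile_idc)
          {
              switch (profile_idc) {
              case 66:  return "Baseline";
              case 77:  return "Main";
              case 88:  return "Extended";
              case 100: return "High";
              case 110: return "High 10 (Hi10P)";
              case 122: return "High 4:2:2";
              case 244: return "High 4:4:4 Predictive";
              default:  return "unknown";
              }
          }

          int main(void)
          {
              printf("profile_idc 110 -> %s\n", h264_profile_name(110));
              printf("profile_idc 100 -> %s\n", h264_profile_name(100));
              return 0;
          }
          ```

          Note that naming the profile says nothing about whether the hardware can decode it, which is exactly JEEB's point.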



          • #20
            Originally posted by johnc View Post
            So what on the hw decodes it?
            A chip hard-wired to decode video in particular formats.



            • #21
              Originally posted by archibald View Post
              A chip hard-wired to decode video in particular formats.
              So the cards have an ASIC for hi10p decoding? Why was this never advertised or exposed in any of the other drivers? (Catalyst, Windows, etc.)



              • #22
                Originally posted by johnc View Post
                So the cards have an ASIC for hi10p decoding? Why was this never advertised or exposed in any of the other drivers? (Catalyst, Windows, etc.)
                No, it's not a separate chip; it's a dedicated decode block integrated on the GPU die. It's called UVD.

                http://en.wikipedia.org/wiki/Unified_Video_Decoder



                • #23
                  Does anyone know the answer to johnc's question:
                  Which AMD GPUs can decode Hi10P videos with their dedicated hardware video decoder (i.e. UVD)?

                  Or am I misunderstanding the article and this is just to put in hooks so when GPUs have hardware video decoders that support Hi10P, VDPAU will be able to make use of it at that point?
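                  One way to settle this on any given card is to ask the driver: VDPAU exposes VdpDecoderQueryCapabilities for exactly this, and a player that finds the stream's profile unsupported falls back to software decoding. Here is a hedged sketch of that decision logic in C, with the actual driver query replaced by a stub (hw_supports_profile and choose_decoder are made-up names for illustration; a real player would query the VDPAU device, and the stub's answers just reflect what consumer hardware is believed to support as of this thread):

                  ```c
                  #include <stdio.h>
                  #include <stdbool.h>

                  /* Stub standing in for a VdpDecoderQueryCapabilities() call:
                   * pretend the driver only advertises the common 8-bit H.264
                   * profiles (Baseline/Main/High).  A real player would ask the
                   * VDPAU device instead of hard-coding this. */
                  static bool hw_supports_profile(int profile_idc)
                  {
                      return profile_idc == 66 || profile_idc == 77 ||
                             profile_idc == 100;
                  }

                  /* Pick a decode path: hardware if the driver reports support
                   * for the stream's profile, otherwise fall back to software. */
                  static const char *choose_decoder(int profile_idc)
                  {
                      return hw_supports_profile(profile_idc) ? "hardware"
                                                              : "software";
                  }

                  int main(void)
                  {
                      printf("High (100):    %s decode\n", choose_decoder(100));
                      printf("High 10 (110): %s decode\n", choose_decoder(110));
                      return 0;
                  }
                  ```

                  The point of the hooks being discussed is that the driver can now give an honest answer for Hi10P instead of reporting it as unknown; whether any shipping UVD actually says "yes" is the open question in this thread.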



                  • #24
                    You guys aren't getting what johnc is asking, which is: since when do these dedicated decoders handle hi10p, and why isn't that advertised? Because common knowledge is that hi10p isn't supported.

                    So again: why isn't hi10p support advertised anywhere? And where is a list of exactly which hardware supports hi10p?



                    • #25
                      The "fun" part of this news is that according to an nvidia engineer here: http://www.nvnews.net/vbulletin/show...hp?t=164684#12
                      it is not supported by the decoding hardware of nvidia cards.

                      I also wonder why an ASIC capability would be present on Radeon cards without ever being advertised. A while ago I was specifically searching for this in order to select hardware capable of it, but I could not find anything.



                      • #26
                        Originally posted by Gusar View Post
                        You guys aren't getting what johnc is asking. It's since when do these dedicated decoders decode hi10p and why isn't it advertised? Because common knowledge is that hi10p isn't supported.

                        So again: Why isn't hi10p support advertised anywhere? And where a list of which hardware exactly supports hi10p?
                        In the OpenMAX thread, AMD engineers (agd5f & Deathsimple) said that it should work, meaning that everything is done in the dedicated engine (UVD) and not in shaders or software. I hope this clears it up. I'm eager to see it in action!
                        I think the marketing department never really understood that anime fans would kill for this feature...



                        • #27
                          Originally posted by HokTar View Post
                          I hope this clears it up.
                          No, it doesn't. It doesn't say which exact hardware supports it. Can't be all of it, because if HD4000 or HD5000 supported it, it would for sure be known by now. If support is there, it must be fairly new. And that's a big "if", I'm skeptical that there's support on *any* hardware, because if AMD added hi10p but didn't mention it to anyone, that's quite high levels of stupidity.



                          • #28
                            Originally posted by Gusar View Post
                            No, it doesn't. It doesn't say which exact hardware supports it. Can't be all of it, because if HD4000 or HD5000 supported it, it would for sure be known by now. If support is there, it must be fairly new. And that's a big "if", I'm skeptical that there's support on *any* hardware, because if AMD added hi10p but didn't mention it to anyone, that's quite high levels of stupidity.
                            Guessing:
                            It could be a limitation of DXVA on Windows. If so, what everybody assumed was a hardware limitation was in fact just a software limitation.


                            The only known fact is that nvidia's hardware does not support hi10p.


                            Looking at http://www.amd.com/us/Documents/UVD3_whitepaper.pdf it does mention 10 bit colour processing. Considering this, and the fact that ATI was famous for their video support, I would not be very surprised if all versions of UVD turn out to support 10 bit hardware decode.



                            • #29
                              I guess we'll just have to wait for somebody to try it.



                              • #30
                                Originally posted by Gusar View Post
                                And that's a big "if", I'm skeptical that there's support on *any* hardware, because if AMD added hi10p but didn't mention it to anyone, that's quite high levels of stupidity.
                                Never underestimate the stupidity of a corporation. Remember how the hw also supported H.264 level 5 but the software didn't allow it? Or how Nvidia's cards could do four screens in Mosaic mode, but the drivers now limit it to three.

