
Thread: Hi10P Support Proposed For VDPAU

  1. #11
    Join Date
    Mar 2011
    Posts
    92


    Quote Originally Posted by fritsch View Post
For those playing with it: I forgot to squash a commit before posting, which resulted in a build error. It should work now: http://sprunge.us/WacV
Thanks. It compiled fine after a trivial fix: using ":" instead of ";" in the second part of the first patch. However, it does not seem to get picked up by XBMC, which still uses "ff-h264" for decoding, at least in my case.

  2. #12
    Join Date
    Feb 2012
    Location
    Barcelona, Spain
    Posts
    322


Good news, because I have more than 400 GB of anime...

  3. #13
    Join Date
    Jun 2006
    Location
    Portugal
    Posts
    542


    Quote Originally Posted by ryszardzonk View Post
Thanks. It compiled fine after a trivial fix: using ":" instead of ";" in the second part of the first patch. However, it does not seem to get picked up by XBMC, which still uses "ff-h264" for decoding, at least in my case.
I heard somewhere that XBMC forces all 10-bit content to software decoding, since hardware decoding never worked for it (until now, it seems).

But yeah, I can't believe I'm writing this, but if it works well I might end up trading the NVIDIA GPU in my media center PC for an AMD one, since right now I need a very heavy overclock (+1 GHz) to play 10-bit 1080p content smoothly (it's not a very recent CPU).

  4. #14
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,286


    Quote Originally Posted by AdamW View Post
    So help me out here - gotta admit I'm not really au fait with all this 'state tracker' stuff. Executive summary: does this mean we'll get trouble-free HW-accelerated decoding of hi10p in XBMC running on top of...open drivers? Proprietary drivers? In the fairly near term? What would be the oldest HW that would support this decoding? Thanks!
    Open drivers, near term, all AMD cards with UVD 2.2 and up:
    http://en.wikipedia.org/wiki/Unified_Video_Decoder

Most of the HD 4000 series, and everything from HD 5000 and up.

  5. #15
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,286


    (forgot to mention: my hw list was for accelerated decoding in general. I don't know if all that hw is 10-bit capable.)

  6. #16
    Join Date
    May 2011
    Posts
    1,599


    So is this something that uses shaders / stream processors to do the decoding? I thought that was less efficient than just using the CPU?

    I'm not trying to be a funny guy here, it's just that hi10p encodes interest me and my understanding is that there wasn't any hardware in the universe yet that could do hi10p decoding. ARM, AMD, NVIDIA, mobile, Qualcomm, Apple, supercomputers, whatever.

  7. #17
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,286


    This code passes it to the hw, shaders are not used.

  8. #18
    Join Date
    May 2011
    Posts
    1,599


    Quote Originally Posted by curaga View Post
    This code passes it to the hw, shaders are not used.
    So what on the hw decodes it?

  9. #19
    Join Date
    Jul 2009
    Location
    United Kingdom
    Posts
    39


I mentioned this on the #mpv-player Freenode channel earlier today, and JEEB (one of the admins on the CCCP forums) had this to say:
    <aphirst> have you guys already discussed this today? http://lists.freedesktop.org/archive...er/047084.html
    <aphirst> the claim is the possibility to decode Hi10P using VDPAU on AMD graphics hardware, thanks to some state tracker stuff implemented for the sake of OpenMAX
    <JEEB> aphirst, isn't that just basically adding support to the framework to note Hi10P stuff as such?
    <JEEB> so that if some kind of hardware that supports it comes around it could be handled, and otherwise just thrown away
    <JEEB> decoding in this case most probably means "parse" in this case methinks
    <JEEB> yeah, seems so
    <JEEB> I would be very surprised if anything on the end-user level of things would have Hi10P decoding support
    <JEEB> because the only hardware I actually know that can handle it costs a LOT
    <JEEB> (and intra only in many cases, as it's meant purely for the "pro" use cases of high bit rate 10bit H.264)
    <aphirst> so you'd say that this, in fact, isn't going to give any tangible benefit
    <aphirst> (to 'typical' hardware owners, now)
    <JEEB> that patch has absolutely nothing to do with actual picture decoding
    <JEEB> it does let stuff actually distinguish between Hi10P and the other normally used H.264 profiles, though
    <aphirst> rightyho
    <aphirst> that clears that up for me
    <aphirst> it smelled a lot of 'too good to be true' but i sure as hell couldnt tell for myself
    <aphirst> thanks mang
    <JEEB> basically it seems like there was already a return value for Hi10P, but before now such stuff would have gotten output as "unknown stuff"
    <JEEB> and now it actually outputs "this is Hi10P AVC"
    <JEEB> and the second patch then adds the Hi10P value into the list of things "this is AVC, yo" somewhere else
    <JEEB> basically unknown -> Hi10P AVC
    <aphirst> what effect would this have on the way players fallback to software decoding of this content?
    <aphirst> does it simplify the logic anywhere?
    <aphirst> is it completely transparent?
    <JEEB> depends, but it does let whatever is calling this stuff get some extra info
    <JEEB> I have no idea what APIs these are that got changed
    <JEEB> and most players already check the profile on their side to begin with methinks ^^;
    <aphirst> so basically
    <aphirst> wow_its_fucking_nothing.jpg
    <JEEB> yes and no
    <JEEB> it's a useful change, since now the API gives some extra info
    <JEEB> it's a non-useful change for end users tho :P
    <aphirst> but its not a "groundbreaking" one
    <aphirst> well, extra info is good
    tl;dr he reckons nothing currently indicates that this actually involves offloading Hi10P decoding to current consumer GPU hardware

    What exactly _does_ this do, then?

  10. #20
    Join Date
    Jul 2010
    Posts
    449


    Quote Originally Posted by johnc View Post
    So what on the hw decodes it?
    A chip hard-wired to decode video in particular formats.
