XBMC Gains Crystal HD 1080p Decoding Support


  • #51
    Well, I would say it only partly holds for UVD, as there are too many files which do NOT work correctly. "Full" is really something else.

    • #52
      Originally posted by gbeauche View Post
      IIRC, only UVD has full video decode at VLD level. Shaders based acceleration was at MoComp level, wasn't it? Why AMD doesn't make them available as source code, or even through some precompiled binary under Linux? I have yet to see a GPU based H.264 decoder working with shaders only. Intel is trying though I doubt we will get decent performance.
      MC maps most easily to shaders, so it's the lowest-hanging fruit, but you can do more than that with the 3D engine. Gallium already has a shader-based implementation for video decode. The r300g gallium driver (for r3xx-r5xx chips) is coming along nicely; the video decode support might even be working on it at this point.
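
      To make the "MC maps to shaders" point concrete, here's a rough, hypothetical plain-C sketch (illustrative only, not AMD code): motion compensation is basically "fetch a block from the reference frame at a motion-vector offset", which is exactly a texture read, hence the natural fit for shaders.

      Code:
      /* Illustrative only: copy one predicted block from the reference
       * frame. On a GPU, the inner loads become texture samples. */
      static void mc_copy_block(const unsigned char *ref, int ref_stride,
                                unsigned char *dst, int dst_stride,
                                int x, int y, int mv_x, int mv_y,
                                int bw, int bh)
      {
          /* source block = destination position shifted by the motion vector */
          const unsigned char *src = ref + (y + mv_y) * ref_stride + (x + mv_x);
          unsigned char *out = dst + y * dst_stride + x;
          int row, col;

          for (row = 0; row < bh; row++)
              for (col = 0; col < bw; col++)
                  out[row * dst_stride + col] = src[row * ref_stride + col];
      }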

      As I said before, UVD is tied up in DRM stuff, so we can't release any code for it until it's been reviewed to see what we can safely release. In the meantime, fglrx now supports UVD via VA-API.
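
      For anyone who wants to check what their VA-API driver actually exposes, a minimal sketch along these lines (standard libva calls, error handling trimmed) reports whether H.264 decode is offered at the VLD entrypoint:

      Code:
      /* Probe for H.264 decode at the VLD entrypoint via libva.
       * Build with: gcc probe.c -lva -lva-x11 -lX11 */
      #include <stdio.h>
      #include <stdlib.h>
      #include <X11/Xlib.h>
      #include <va/va.h>
      #include <va/va_x11.h>

      int main(void)
      {
          Display *x11 = XOpenDisplay(NULL);
          if (!x11) return 1;

          VADisplay dpy = vaGetDisplay(x11);
          int major, minor, num, i;

          if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS)
              return 1;

          num = vaMaxNumEntrypoints(dpy);
          VAEntrypoint *ep = malloc(num * sizeof(*ep));

          /* Which entrypoints does the driver offer for H.264 High? */
          vaQueryConfigEntrypoints(dpy, VAProfileH264High, ep, &num);
          for (i = 0; i < num; i++)
              if (ep[i] == VAEntrypointVLD)
                  printf("H.264 decode at VLD level is available\n");

          free(ep);
          vaTerminate(dpy);
          XCloseDisplay(x11);
          return 0;
      }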

      Originally posted by gbeauche View Post
      The advantage of using UVD is that we have full decode at VLD level and much more efficient power usage. Otherwise, the gain is not very interesting. e.g. instead of using 90W CPUs, we would still use a certain portion of it + 130W GPU with a shaders approach.
      UVD still uses power. I'm not sure how much more efficient compared to the 3D engine. You could lower the clocks on the 3D engine to the minimum needed to hit your decode targets which would save power. Using shaders would still save power overall compared with doing the decode on the CPU only.

      • #53
        > Asus O!Play

        It appears that the Asus O!Play only has composite out, no s-video,
        no component, no DVI, HDMI, or DiiVA. :-(

        ----------

        There is obviously some advantage to using UVD or ATI wouldn't have
        bothered designing and building it.

        Possible advantage #1 - power consumption

        According to

        the "Broadcom Hardware Decoder has a TDP of ~2 watts"

        How much power does ATI's UVD need? Something roughly similar? (Say,
        under 5 Watts?)

        Does decoding using shaders have any hope of approaching that power
        level? (I suspect not.)

        Possible advantage #2 - quality

        Does decoding using shaders result in the same visual quality as
        using a proper hardware decoder such as UVD or Crystal HD?

        If we go the Broadcom hardware decoder route we can use anyone's GPU.
        So ATI needs to make it possible to decode using UVD and FLOSS software
        or lose business. And no, a binary-only driver is not acceptable.

        So either find a way to document UVD1/2 or get a UVD3 out the door with the
        useless decryption stuff separated out and powered off when not in use to
        make it easy to document the decoder without getting a chair thrown at you.

        • #54
          Originally posted by Dieter View Post
          > Asus O!Play

          It appears that the Asus O!Play only has composite out, no s-video,
          no component, no DVI, HDMI, or DiiVA. :-(


          Input:

          DC Power In
          1xUSB 2.0 Port
          1xUSB 2.0 / eSATA Combo Port
          RJ-45 LAN Port

          Output:
          Composite Video
          Composite Audio L/R
          S/PDIF Out
          HDMI 1.3

          Last edited by deanjo; 05 January 2010, 04:46 PM.

          • #55
            Originally posted by Dieter View Post
            If we go the Broadcom hardware decoder route we can use anyone's GPU.
            So ATI needs to make it possible to decode using UVD and FLOSS software
            or lose business. And no, a binary-only driver is not acceptable.
            I'm not really sure how we lose business by not opening up UVD. If you plan to use the broadcom chip, you still need a GPU to drive your display. AFAIK, there aren't really any other open source options out there at the moment. OTOH, we would really lose business if opening UVD somehow allowed bluray, etc. to be hacked.

            Using the 3D engine may not be quite as power efficient as dedicated hw (I don't know the actual numbers off hand), but it's still better than the CPU alone and provides an open source solution. Quality should be the same either way and shaders also give you the possibility of using alternative algorithms should you so choose.

            • #56
              >> Asus O!Play
              > HDMI 1.3

              Ah, it has HDMI. I stand corrected, thanks.
              That's what I get for looking at a crappy web photo.
              Unfortunately I need s-video as composite gives dot-crawl.

              • #57
                > I'm not really sure how we lose business by not opening up UVD.
                > If you plan to use the broadcom chip, you still need a GPU to drive your display.

                ATI has been documenting 3D, but there are plenty of people that do
                not need or want 3D.

                With the broadcom chip all I need is a framebuffer, correct?
                And pretty much any GPU has a FLOSS driver that can do
                framebuffer, correct?
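
                (For what it's worth, "just a framebuffer" really is this simple on the software side; a rough sketch using the stock Linux fbdev interface, nothing vendor-specific assumed:)

                Code:
                /* Map /dev/fb0 and clear it to black: plain CPU writes,
                 * no 3D engine and no vendor blob involved. */
                #include <fcntl.h>
                #include <linux/fb.h>
                #include <string.h>
                #include <sys/ioctl.h>
                #include <sys/mman.h>
                #include <unistd.h>

                int main(void)
                {
                    int fd = open("/dev/fb0", O_RDWR);
                    struct fb_fix_screeninfo finfo;
                    unsigned char *fb;

                    if (fd < 0)
                        return 1;
                    if (ioctl(fd, FBIOGET_FSCREENINFO, &finfo) < 0)
                        return 1;

                    fb = mmap(NULL, finfo.smem_len, PROT_READ | PROT_WRITE,
                              MAP_SHARED, fd, 0);
                    if (fb == MAP_FAILED)
                        return 1;

                    memset(fb, 0, finfo.smem_len);  /* black screen */
                    munmap(fb, finfo.smem_len);
                    close(fd);
                    return 0;
                }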

                The obvious solution is to separate the digital restrictions mangling
                from the video decoding in UVD3, and document the video decoding.

                • #58
                  Originally posted by Dieter View Post
                  > I'm not really sure how we lose business by not opening up UVD.
                  > If you plan to use the broadcom chip, you still need a GPU to drive your display.

                  ATI has been documenting 3D, but there are plenty of people that do
                  not need or want 3D.
                  You may not want OpenGL, but newer chips only have a 3D engine and it's used for everything (2D, 3D, compute, buffer moves, etc.), so you need it for anything beyond a software-only framebuffer.

                  Originally posted by Dieter View Post
                  With the broadcom chip all I need is a framebuffer, correct?
                  And pretty much any GPU has a FLOSS driver that can do
                  framebuffer, correct?
                  If the broadcom chip only accelerates decode, you still need driver code to handle rendering (scaling, colorspace conversion) and post-processing (filters, brightness, sharpness, gamma, etc.), in which case you still need the 3D engine for that part.
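
                  To give a feel for the render-side work that stays on the GPU either way, here's an illustrative per-pixel YCbCr-to-RGB conversion (standard BT.601 coefficients, written in plain C; in practice this math runs as a shader on the 3D engine):

                  Code:
                  /* Convert one BT.601 YCbCr pixel to 8-bit RGB.
                   * On real hardware this runs per-pixel in a shader. */
                  static void yuv_to_rgb(unsigned char y, unsigned char u,
                                         unsigned char v, unsigned char *r,
                                         unsigned char *g, unsigned char *b)
                  {
                      double cb = u - 128.0, cr = v - 128.0;
                      double rf = y + 1.402    * cr;
                      double gf = y - 0.344136 * cb - 0.714136 * cr;
                      double bf = y + 1.772    * cb;

                      /* clamp to the displayable 0..255 range */
                      *r = rf < 0 ? 0 : rf > 255 ? 255 : (unsigned char)rf;
                      *g = gf < 0 ? 0 : gf > 255 ? 255 : (unsigned char)gf;
                      *b = bf < 0 ? 0 : bf > 255 ? 255 : (unsigned char)bf;
                  }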

                  Originally posted by Dieter View Post
                  The obvious solution is to separate the digital restrictions mangling
                  from the video decoding in UVD3, and document the video decoding.
                  Right. However, chip design starts years in advance of release, so a lot of this stuff was started years ago. We are attempting to make future hardware friendlier to open source, but it may take a while to actually see the results.

                  • #59
                    Originally posted by agd5f View Post
                    I'm not really sure how we lose business by not opening up UVD. If you plan to use the broadcom chip, you still need a GPU to drive your display. AFAIK, there aren't really any other open source options out there at the moment. OTOH, we would really lose business if opening UVD somehow allowed bluray, etc. to be hacked.
                    Near as I can tell, the idea was to make Intel the open source HD decoding solution, or did I miss something? Seems rather pointless to buy an AMD w/UVD and pair it with Broadcom.

                    Sadly, I'm sure AMD would be slammed by all their nasty DRM agreements. AACS is broken, BD+ is broken, HDCP is broken, BluRay can barely get more broken, but the movie business is in denial.

                    • #60
                      Originally posted by Kjella View Post
                      Near as I can tell, the idea was to make Intel that open source HD decoding solution, or did I miss something? Seems rather pointless to buy an AMD w/UVD and pair it with Broadcom.
                      How is Intel's open decoding solution any different from AMD's open decoding solution? Both use shaders. And if you are using the broadcom for decoding, then it doesn't matter whose GPU you use, so why would Intel be any better choice than AMD? The AMD chip would give you better 3D in addition.
