XBMC Gains Crystal HD 1080p Decoding Support

  • #41
    Originally posted by DeepDayze View Post
    I doubt that... most likely you'd have to go to Broadcom's site to get the firmware file and drop it into /lib/firmware.
    It's not hard to automate downloading the firmware.
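Something like the following would do it. This is only a sketch: the firmware URL and filename below are placeholders, not real Broadcom locations, and you'd need root to write into /lib/firmware.

```python
# Hypothetical sketch of automating the firmware install.
# FIRMWARE_URL is a placeholder -- substitute the real Broadcom download.
import os
import urllib.request

FIRMWARE_URL = "https://example.com/path/to/crystalhd-fw.bin"  # placeholder
FIRMWARE_DIR = "/lib/firmware"

def firmware_target(url, dest_dir=FIRMWARE_DIR):
    """Derive the install path from the URL's basename."""
    return os.path.join(dest_dir, os.path.basename(url))

def install_firmware(url=FIRMWARE_URL):
    """Download the firmware into /lib/firmware unless it is already there."""
    target = firmware_target(url)
    if not os.path.exists(target):
        urllib.request.urlretrieve(url, target)
    return target
```

A distro package could run this (or the equivalent wget one-liner) in a post-install hook, since the firmware itself is redistributable but not open source.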



    • #42
      Hi, I can answer questions about the firmware status. I'm involved with Broadcom in pushing the source code to GPL/LGPL. The intent is that the driver is GPLv2, the library is LGPLv2.1, and the firmware is closed source but redistributable. Since the userland library loads the firmware, the firmware lives in userspace, not kernel space, so there is no licensing issue between the kernel and the firmware.

      Clarification of the licensing and of the ability to redistribute the firmware is coming and will appear on the Broadcom site.



      • #43
        Originally posted by Jorophose View Post
        @deanjo: Asus O!Play has like no info on it. The only one that gets any love is the WD TV & WDTV Live.
        Actually there is a lot of info on the O!Play. People are already running Optware on it, installing ipkgs, and so on. There is even an O!Play hacking channel on Freenode.



        • #44
          I'm thinking about buying one of these cards on eBay, such as this or this one.

          Now, I have a few questions for anyone who has already seen/used one of these:
          Are the items there the real deal (judging by the pictures)?
          Are these cards full size miniPCI-e (29mm x 70mm) or half size miniPCI-e (29mm x 50mm)?

          I'm thinking about putting one of these inside an Intel Atom D945GSEJT motherboard. It has DVI, it has S/PDIF (I can make an optical header myself), and with this baby I hope it will handle 1080p H.264 easily.



          • #45
            Originally posted by BlackStar View Post
            I've heard that Broadcom has an HD decoder that needs ~200mW to work. No idea if this is used on this card, but if it is then this card shouldn't consume more than 1W.
            IIRC, this solution is indeed around 1 W. BTW, it's the Poulsbo that has a 200 mW HD decoder built in. ;-)

            Originally posted by tettamanti View Post
            And the interface is extremely simple: pass compressed data via ioctl, read decoded (YUV) data out; all the complexity in handled on the card.
            NV12.
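To spell that out: NV12 is a 4:2:0 format like planar YUV (I420), but the U and V samples are interleaved in a single half-resolution chroma plane after the Y plane. A quick sketch of the buffer layout (the dimensions in the usage note are just an example):

```python
# NV12 layout: full-resolution Y plane, then one interleaved UV plane
# at half resolution in each dimension (4:2:0 chroma subsampling).
def nv12_layout(width, height):
    y_size = width * height                      # one byte per luma sample
    uv_size = (width // 2) * (height // 2) * 2   # interleaved U/V byte pairs
    return {
        "y_offset": 0,
        "uv_offset": y_size,        # UV plane starts right after Y
        "total": y_size + uv_size,  # = width * height * 3 / 2
    }
```

For a 1920x1080 frame that works out to 1920 * 1080 * 3 / 2 bytes per decoded frame, which is what you'd be reading back from the card.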

            Originally posted by nightmorph View Post
            Yeah, but mini PCIe to PCIe adapter cards don't exist, except from one company that charges $100 for them.
            http://www.hwtools.net/Adapter/MP1.html -- 25 USD for such an adapter. However, this is only useful to developers; it makes for a 45 USD solution at best. Real users will get a better experience with a cheap NVIDIA card, which even supports MPEG-4 Part 2 decoding.



            • #46
              To clarify, AMD has released enough documentation to support video decode using the 3D engine on all of our GPUs. On pre-r6xx chips, the 3D engine was always used, as those chips did not have the UVD block. Even on cards with the UVD block, the 3D engine can be used. UVD is trickier due to its tie-in with DRM support for other OSes. A radeon-specific driver would need to be written for a video decode API (VA-API, VDPAU, XvMC, etc.); alternatively, the Gallium video decode work could be used. I'm happy to advise anyone interested in adding support for this, but most of our time in the near future will be taken up with getting code and documentation for Evergreen (HD5xxx) hardware out.



              • #47
                Does decoding H.264 using 3D shaders cost much more power than using the "official" way?



                • #48
                  I'm shocked that Broadcom released any specs at all :/
                  Intel must be requesting this. They really want to kill NVIDIA off in netbooks (and everywhere else).



                  • #49
                    Originally posted by Jorophose View Post
                    I've heard that mini PCIE is just pciex1. Anyone want to try?
                    Electrically, mini PCI-E is actually PCI-E x1 plus a USB port (ExpressCard, I think, uses the same combo, but don't quote me on this). On some netbooks an open mini PCI-E connector will have only USB or only PCI-E hooked up.

                    Most mini PCI-E cards use only one of the interfaces, not both. Most Wi-Fi cards use the PCI-E interface, and most 3G/WiMAX cards use the USB interface. There are exceptions, though.

                    For example, the Intel WiFi/WiMAX 5350 uses both the PCI-E and the USB interface at the same time.

                    There are cheap (<$15) converters from mini PCI-E to PCI-E x1 that do not support the USB interface, as they are essentially a bunch of wires.



                    • #50
                      Originally posted by agd5f View Post
                      To clarify, AMD has released enough documentation to support video decode using the 3D engine on all of our GPUs. On pre-r6xx chips, the 3D engine was always used as those chips did not have the UVD block.
                      IIRC, only UVD offers full video decode at the VLD level; shader-based acceleration was at the MoComp level, wasn't it? Why doesn't AMD make it available as source code, or even as a precompiled binary under Linux? I have yet to see a GPU-based H.264 decoder working with shaders only. Intel is trying, though I doubt we will get decent performance.

                      The advantage of using UVD is that we get full decode at the VLD level and much better power efficiency. Otherwise, the gain is not very interesting: e.g. instead of a 90 W CPU doing all the work, we would still use a good portion of the CPU plus a 130 W GPU with a shader-based approach.
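As a back-of-the-envelope illustration: only the 90 W and 130 W TDP figures come from above; the utilization fractions and the fixed decode-block draw below are made-up placeholders, just to show why a busy 3D engine dwarfs a dedicated decoder.

```python
# Illustrative only: TDPs from the post above, utilization fractions
# and the UVD block's fixed draw are invented placeholders.
CPU_TDP = 90.0   # W
GPU_TDP = 130.0  # W

def decode_power(cpu_frac, gpu_frac, fixed=0.0):
    """Estimated draw: fractions of CPU/GPU TDP plus a fixed decode block."""
    return CPU_TDP * cpu_frac + GPU_TDP * gpu_frac + fixed

# UVD path: near-idle CPU and GPU plus a small dedicated decode block
uvd = decode_power(cpu_frac=0.1, gpu_frac=0.05, fixed=3.0)
# Shader path: a chunk of CPU for entropy decoding plus a busy 3D engine
shaders = decode_power(cpu_frac=0.3, gpu_frac=0.5)
```

Whatever the exact fractions, the shader path ends up an order of magnitude worse, which is the point about power efficiency above.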

