
Will AMD's XvBA Beat Out NVIDIA's VDPAU?


  • #31
    Originally posted by Silent Storm View Post
    I didn't read all the messages, but I've seen an "nVidia's head start" argument. Unfortunately, ATI had it first, literally years ago, in the Radeon 8500 era. It was called VideoShaders and VideoSoap. Did anyone hear of it? It's unlikely, because nobody except ATI tech demos used it. It was a CPU-independent video acceleration and post-processing pipeline, and it worked very well.
    I said that. I should have clarified myself more: what I meant was that Nvidia had a head start when it came to hardware decoding on UNIX-like operating systems.



    • #32
      OT:
      Am I the only one who read the blog entry and noticed this: "The Poulsbo (US15W) video driver may be Open Source'd by Q4 2009"??

      well that's even better!



      • #33
        @bridgman

        If XvBA works so well, why didn't ATI ship the headers needed to use it? Some programming examples wouldn't hurt either...



        • #34
          Originally posted by thefirstm View Post
          I said that. I should have clarified myself more: what I meant was that Nvidia had a head start when it came to hardware decoding on UNIX-like operating systems.
          ATI already had GPU acceleration on Linux in the 9600 and X1600 era, but they dropped it to rewrite it from scratch. It was not literally HW decoding, but it accelerated most of the video pipeline and enabled 720p playback over Xv with reasonable CPUs like high-end Athlon XPs and first-generation Athlon 64s.

          Where you are right is on the adoption and marketing side of it, which by no means weakens your argument. Even though the technical side is a bit different, the real situation is still reflected by what you said.

          I wanted to point out that being better and being first doesn't always bring victory, but instead of writing that directly, I worked the whole idea into the post.



          • #35
            Originally posted by deanjo View Post
            Take the one that is most mature and complete and run with it.
            That's exactly why I chose and now stick to VA API.

            Originally posted by deanjo View Post
            One thing the tests also did not reveal is if cpu's stayed in their lowest power state or not, if any frames were dropped. Without a timeline graph the results or at least a frame count and played frame count the results are very incomplete.
            The initial testing used the -frames 1000 option, but I changed the procedure when I noticed that only the Poulsbo dropped a few frames with the Riddick video (and only for this video). Dynamic frequency scaling was also enabled, so the CPUs were at their lowest frequency: 800 MHz for the Atom CPU, 1 GHz for the Phenom CPU. Then I redid the test in "performance" mode to get meaningful measurements with respect to Xv.
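
            A minimal sketch, assuming the standard Linux cpufreq sysfs layout (untested): it reads cpu0's governor and current frequency, which is one way to verify whether the CPU really stayed in its lowest power state during a playback run; switching to "performance" mode amounts to writing that word into scaling_governor as root.

            #include <stdio.h>

            /* Print one cpufreq attribute from sysfs; sysfs values already end with '\n'. */
            static void print_attr(const char *label, const char *path)
            {
                char buf[64];
                FILE *f = fopen(path, "r");
                if (f && fgets(buf, sizeof(buf), f))
                    printf("%s: %s", label, buf);
                if (f)
                    fclose(f);
            }

            int main(void)
            {
                print_attr("governor",
                           "/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor");
                print_attr("current kHz",
                           "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq");
                return 0;
            }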

            Originally posted by Kano View Post
            The Benchmarks are really funny. Did somebody notice that he used a "Mobility Radeon HD 4870 - 550 MHz, 1 GB" together with a "Phenom 8450"?
            Do you know about MXM-to-PCIe adapters? Yes, I had also tested a GTX 280M in the same box, but the results were not conclusive enough since the GPU was not running at its highest frequency. So I didn't publish the figures, though you can get them in the tarball. They won't represent the real G92 core capabilities, though.



            • #36
              Originally posted by gbeauche View Post
              That's exactly why I chose and now stick to VA API.
              I'd rather consider VDPAU to be the most mature API. It is widely supported by applications, works well (on NVidia GPUs) and is very well-documented. Currently it's the only video decoding API which has an implementation that "just works" *now* and without major fuss.

              VA-API on the other hand is badly documented, if at all, and seems to be missing some of the functionality provided by VDPAU.
              Last edited by greg; 07 July 2009, 07:48 AM.
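
              As an illustration of how small the VDPAU entry point is, here is a minimal probe, assuming libvdpau and its X11 headers are installed (the file name and build line are only illustrative): it creates a device on the default display and prints the driver's information string, fetching the function through get_proc_address as the API requires. Something like gcc vdpau_probe.c -lvdpau -lX11 should build it.

              #include <stdio.h>
              #include <X11/Xlib.h>
              #include <vdpau/vdpau.h>
              #include <vdpau/vdpau_x11.h>

              int main(void)
              {
                  Display *dpy = XOpenDisplay(NULL);
                  if (!dpy)
                      return 1;

                  /* vdp_device_create_x11 is the only exported entry point; all
                     other functions are obtained through get_proc_address. */
                  VdpDevice device;
                  VdpGetProcAddress *get_proc_address;
                  if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &device,
                                            &get_proc_address) != VDP_STATUS_OK)
                      return 1;

                  VdpGetInformationString *get_info;
                  if (get_proc_address(device, VDP_FUNC_ID_GET_INFORMATION_STRING,
                                       (void **)&get_info) != VDP_STATUS_OK)
                      return 1;

                  const char *info;
                  if (get_info(&info) == VDP_STATUS_OK)
                      printf("VDPAU driver: %s\n", info);

                  XCloseDisplay(dpy);
                  return 0;
              }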



              • #37
                Originally posted by greg View Post
                I'd rather consider VDPAU to be the most mature API. It is widely supported by applications, works well (on NVidia GPUs, no idea about S3) and is very well-documented. Currently it's the only video decoding API which has an implementation that "just works" *now* and without major fuss.

                VA-API on the other hand is badly documented, if at all, and seems to be missing some of the functionality provided by VDPAU.
                S3 doesn't use VDPAU.

                The question was about mature and complete. VA API is just that:
                - complete: supports more codecs and video encode acceleration
                - mature: well, it has been around for a long time, though implementations were not public. There are at least 4 (if not 5) "native" implementations, i.e. real drivers, not counting my bridges.

                Now, as I said, application support is weaker due to the initial lack of drivers, but it's as trivial to add as VDPAU support. So this can change quite easily.
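
                For comparison, a sketch in the same spirit, assuming libva with its X11 backend is installed (file name and link flags are only illustrative): it initializes VA-API on the default display, prints the vendor string, and asks the driver how many profiles it exposes. Something like gcc vaapi_probe.c -lva -lva-x11 -lX11 should build it.

                #include <stdio.h>
                #include <stdlib.h>
                #include <X11/Xlib.h>
                #include <va/va.h>
                #include <va/va_x11.h>

                int main(void)
                {
                    Display *x11 = XOpenDisplay(NULL);
                    if (!x11)
                        return 1;

                    VADisplay va = vaGetDisplay(x11);
                    int major, minor;
                    if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS)
                        return 1;
                    printf("VA-API %d.%d, vendor: %s\n",
                           major, minor, vaQueryVendorString(va));

                    /* Ask the driver which decode/encode profiles it advertises. */
                    int num = vaMaxNumProfiles(va);
                    VAProfile *profiles = malloc(num * sizeof(*profiles));
                    if (profiles &&
                        vaQueryConfigProfiles(va, profiles, &num) == VA_STATUS_SUCCESS)
                        printf("%d profiles reported by the driver\n", num);

                    free(profiles);
                    vaTerminate(va);
                    XCloseDisplay(x11);
                    return 0;
                }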



                • #38
                  GEM vs TTM

                  XvBA vs X-Video vs VDPAU vs VA-API

                  ...

                  When will this API/subsystem nightmare end? Please make a unified API for hardware video decoding, this is a pain in the ass...



                  • #39
                    Originally posted by gbeauche View Post
                    S3 doesn't use VDPAU.
                    Wrong.

                    RELEASE HISTORY

                    06/26/2009: Version 14.02.17
                    - Bug Fixes
                    - XRandR support
                    - VDPAU support
                    - KMS Support

                    The S3 Graphics Accelerated Linux Driver Set support:

                    * Linux Kernel 2.6.x
                    * X.Org X11R7.x with H/W 2D acceleration through XAA or EXA
                    * SAMM / MAMM / Xinerama with multiple display
                    * DVI dual-link up to 2560x1600 resolution
                    * 90/180/270 degree display rotation
                    * H/W accelerated direct-rendering OpenGL 3.0 API
                    * H/W accelerated indirect-rendering OpenGL 2.1 API
                    * Composite Desktop with AIGLX / Compiz
                    * Full featured RandR 1.2 function
                    * Kernel mode setting with standalone module
                    * Full H.264, VC-1, WMV9 and MPEG-2 VLD bitstream H/W decoding through VDPAU or VA-API driver

                    This README describes how to install, configure, and use the S3 Graphics
                    Accelerated Linux Driver Set.

                    http://drivers.s3graphics.com/en/dow...N_Linux_EN.txt
                    Last edited by deanjo; 07 July 2009, 08:34 AM. Reason: Highlighted for the blind



                    • #40
                      Originally posted by MostAwesomeDude View Post

                      And no, I don't think anybody on the open-source side wants to do any more split video decoding backends. Let's just do everything on Gallium and be happy with it.

                      ~ C.
                      Apparently there was a SOC2009 idea for VDPAU via Gallium:


                      I don't know if it materialized, but this is _really_ the way to go!

