Will AMD's XvBA Beat Out NVIDIA's VDPAU?


  • #31
    Originally posted by Silent Storm View Post
    I didn't read all the messages, but I've seen the "NVIDIA's head start" argument. Unfortunately, ATI had it first, literally years ago, in the Radeon 8500 era. It was called VideoShaders and VideoSoap. Did anyone hear of it? Probably not, because nothing except ATI's tech demos ever used it. It was a CPU-independent video acceleration and post-processing pipeline, and it worked very well.
    I said that. I should have clarified myself more: what I meant was that NVIDIA had a head start when it came to hardware decoding on UNIX-like operating systems.

    • #32
      OT:
      Am I the only one who read the blog entry and noticed this: "The Poulsbo (US15W) video driver may be Open Source'd by Q4 2009"?

      Well, that's even better!

      • #33
        @bridgman

        If XvBA works so well, why didn't ATI ship the headers needed to use it? Some programming examples wouldn't hurt either...

        • #34
          Originally posted by thefirstm View Post
          I said that. I should have clarified myself more: what I meant was that NVIDIA had a head start when it came to hardware decoding on UNIX-like operating systems.
          ATI had GPU acceleration on Linux before, in the 9600 and X1600 era, but dropped it to rewrite it from scratch. It was not literally HW decoding, but it accelerated most of the video pipeline and enabled 720p playback over Xv with reasonable CPUs like high-end Athlon XPs and first-generation Athlon 64s.

          You are right about the adoption and marketing side of it, and that in no way weakens your argument. Even though the technical history is different, the real situation is still the one your argument reflects.

          I wanted to point out that being better and being first doesn't always bring victory, but instead of writing that directly, I worked the whole idea into the post.

          • #35
            Originally posted by deanjo View Post
            Take the one that is most mature and complete and run with it.
            That's exactly why I chose, and now stick to, VA-API.

            Originally posted by deanjo View Post
            One thing the tests also did not reveal is whether the CPUs stayed in their lowest power state, or whether any frames were dropped. Without a timeline graph, or at least a decoded-frame count and a played-frame count, the results are very incomplete.
            The initial testing used the -frames 1000 option, but I changed the procedure when I noticed that only the Poulsbo dropped a few frames, and only with the Riddick video. Dynamic frequency scaling was also enabled, so the CPUs were at their lowest frequency: 800 MHz for the Atom CPU, 1 GHz for the Phenom CPU. Then I redid the test in "performance" mode to get meaningful measurements with respect to Xv.

            Originally posted by Kano View Post
            The benchmarks are really funny. Did anybody notice that he used a "Mobility Radeon HD 4870 - 550 MHz, 1 GB" together with a "Phenom 8450"?
            Do you know about MXM-to-PCIe adapters? Yes, I had also tested a GTX 280M in the same box, but the results were not conclusive enough since the GPU was not running at its highest frequency. So I didn't publish the figures, though you can get them in the tarball. They wouldn't represent the real G92 core capabilities anyway.

            • #36
              Originally posted by gbeauche View Post
              That's exactly why I chose, and now stick to, VA-API.
              I'd rather consider VDPAU to be the most mature API. It is widely supported by applications, works well (on NVIDIA GPUs) and is very well documented. Currently it's the only video decoding API with an implementation that "just works" *now*, without major fuss.

              VA-API, on the other hand, is badly documented, if documented at all, and seems to be missing some of the functionality provided by VDPAU.
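
              To illustrate the "just works" point, here is a minimal sketch of bootstrapping VDPAU and asking the driver about H.264 decoding (X11 backend assumed, error handling omitted; the file name and build line are just examples, not anything official):

              /* vdpau_probe.c - minimal sketch: create a VDPAU device and ask
               * whether the driver can decode H.264 High profile.
               * Build (assumed): cc vdpau_probe.c -lvdpau -lX11 */
              #include <stdio.h>
              #include <X11/Xlib.h>
              #include <vdpau/vdpau.h>
              #include <vdpau/vdpau_x11.h>

              int main(void)
              {
                  Display *dpy = XOpenDisplay(NULL);
                  VdpDevice device;
                  VdpGetProcAddress *get_proc_address;

                  /* The only directly linked entry point; every other VDPAU
                   * function is fetched through get_proc_address. */
                  vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                        &device, &get_proc_address);

                  VdpDecoderQueryCapabilities *query;
                  get_proc_address(device, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES,
                                   (void **)&query);

                  VdpBool ok;
                  uint32_t max_level, max_mbs, max_width, max_height;
                  query(device, VDP_DECODER_PROFILE_H264_HIGH, &ok,
                        &max_level, &max_mbs, &max_width, &max_height);

                  printf("H.264 High: %s (up to %ux%u)\n",
                         ok ? "supported" : "not supported",
                         max_width, max_height);
                  XCloseDisplay(dpy);
                  return 0;
              }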
              Last edited by greg; 07-07-2009, 07:48 AM.

              • #37
                Originally posted by greg View Post
                I'd rather consider VDPAU to be the most mature API. It is widely supported by applications, works well (on NVIDIA GPUs, no idea about S3) and is very well documented. Currently it's the only video decoding API with an implementation that "just works" *now*, without major fuss.

                VA-API, on the other hand, is badly documented, if documented at all, and seems to be missing some of the functionality provided by VDPAU.
                S3 doesn't use VDPAU.

                The question was about mature and complete. VA-API is just that:
                - complete: it supports more codecs, plus video encode acceleration
                - mature: it has been around for a long time, though implementations were not public. There are at least four (if not five) "native" implementations, i.e. real drivers, not counting my bridges.

                Now, as I said, application support is weaker due to the initial lack of drivers, but it's as trivial to add as VDPAU support. So this can change quite easily.
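
                To give an idea of how trivial the bootstrap is, here is a minimal sketch that opens a VA-API display and asks the driver which decode profiles it advertises (X11 backend assumed, error handling omitted; the file name and build line are just examples, not anything official):

                /* vaapi_probe.c - minimal sketch: initialize VA-API and count
                 * the profiles (MPEG-2, H.264, VC-1, ...) the driver supports.
                 * Build (assumed): cc vaapi_probe.c -lva -lva-x11 -lX11 */
                #include <stdio.h>
                #include <stdlib.h>
                #include <X11/Xlib.h>
                #include <va/va.h>
                #include <va/va_x11.h>

                int main(void)
                {
                    Display *x11 = XOpenDisplay(NULL);
                    VADisplay va = vaGetDisplay(x11);

                    int major, minor;
                    vaInitialize(va, &major, &minor);
                    printf("VA-API %d.%d\n", major, minor);

                    /* Ask the driver which profiles it can decode. */
                    int num = vaMaxNumProfiles(va);
                    VAProfile *profiles = malloc(num * sizeof(*profiles));
                    vaQueryConfigProfiles(va, profiles, &num);
                    printf("%d decode profiles advertised\n", num);

                    free(profiles);
                    vaTerminate(va);
                    XCloseDisplay(x11);
                    return 0;
                }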

                • #38
                  GEM vs TTM

                  XvBA vs X-Video vs VDPAU vs VA-API

                  ...

                  When will this API/subsystem nightmare end? Please make a unified API for hardware video decoding, this is a pain in the ass...

                  • #39
                    Originally posted by gbeauche View Post
                    S3 doesn't use VDPAU.
                    Wrong.

                    RELEASE HISTORY

                    06/26/2009: Version 14.02.17
                    - Bug Fixes
                    - XRandR support
                    - VDPAU support
                    - KMS Support

                    The S3 Graphics Accelerated Linux Driver Set support:

                    * Linux Kernel 2.6.x
                    * X.Org X11R7.x with H/W 2D acceleration through XAA or EXA
                    * SAMM / MAMM / Xinerama with multiple display
                    * DVI dual-link up to 2560x1600 resolution
                    * 90/180/270 degree display rotation
                    * H/W accelerated direct-rendering OpenGL 3.0 API
                    * H/W accelerated indirect-rendering OpenGL 2.1 API
                    * Composite Desktop with AIGLX / Compiz
                    * Full featured RandR 1.2 function
                    * Kernel mode setting with standalone module
                    * Full H.264, VC-1, WMV9 and MPEG-2 VLD bitstream H/W decoding through VDPAU or VA-API driver

                    This README describes how to install, configure, and use the S3 Graphics
                    Accelerated Linux Driver Set.

                    http://drivers.s3graphics.com/en/dow...N_Linux_EN.txt
                    Last edited by deanjo; 07-07-2009, 08:34 AM. Reason: Highlighted for the blind

                    • #40
                      Originally posted by MostAwesomeDude View Post

                      And no, I don't think anybody on the open-source side wants to do any more split video decoding backends. Let's just do everything on Gallium and be happy with it.

                      ~ C.
                      Apparently there was a Summer of Code 2009 idea for VDPAU via Gallium:
                      http://xorg.freedesktop.org/wiki/SummerOfCodeIdeas

                      I don't know if it materialized, but this is _really_ the way to go!

                      • #41
                        Originally posted by Silent Storm View Post
                        It was called VideoShaders and VideoSoap. Did anyone hear of it? Probably not, because nothing except ATI's tech demos ever used it. It was a CPU-independent video acceleration and post-processing pipeline, and it worked very well.
                        Since the GeForce FX (maybe even before, I'm not sure), there has also been some hardware video decoding acceleration through XvMC, and it remained available until the GeForce 8, which got VDPAU. As far as I remember (that was a long time ago), MPEG-2 decoding was almost entirely offloaded to the GPU: on an Athlon 2500+ I had less than 10% CPU activity. Other codecs could also have been accelerated if someone had ever written code to use the motion compensation hardware, but no one ever did :/

                        • #42
                          Once again Phoronix and its lame AMD articles....

                          • #43
                            Originally posted by phhusson View Post
                            Since the GeForce FX (maybe even before, I'm not sure), there has also been some hardware video decoding acceleration through XvMC, and it remained available until the GeForce 8, which got VDPAU. As far as I remember (that was a long time ago), MPEG-2 decoding was almost entirely offloaded to the GPU: on an Athlon 2500+ I had less than 10% CPU activity. Other codecs could also have been accelerated if someone had ever written code to use the motion compensation hardware, but no one ever did :/
                            You are right that the GeForce FX was the first family to come with the ShaderFX and CinemaFX (or CineFX) engines, but the Radeon 8500 family is older than the FX family, IIRC. Also, the Rage64Pro had a hardware DVD decoder, which Creative made some fortune from by selling it as an add-on card (though theirs was different from ATI's anyway).

                            Whatever, this is not a flame war. My point was that ATI is consistently the first company to innovate, and it consistently fails because it is unable to market and use those innovations correctly.

                            • #44
                              Originally posted by bulletxt View Post
                              Once again Phoronix and its lame AMD articles....
                              +1

                              This "In the future there will be robots" kind of news is getting old. By the time XvBA is oficially published there would be a new X.org version and, guess what? It won't be supported by the Catalyst driver so we'll be stuck with the open source drivers.

                              • #45
                                Originally posted by Silent Storm View Post
                                Also, the Rage64Pro had a hardware DVD decoder, which Creative made some fortune from by selling it as an add-on card (though theirs was different from ATI's anyway).
                                Creative didn't use a Rage64-based card for its DVD decoder card. It was a clone of the Sigma Designs Hollywood Plus.
