AMD's UVD2-based XvBA Finally Does Something On Linux


  • Originally posted by Qaridarium
    In fact, Dirac is stronger/better than H.264: a lower data rate at the same quality, or better quality at the same data rate.
    I don't think any sane person can agree on this one...



    • Originally posted by Qaridarium
      The first Dirac acceleration on the open-source side will doom UVD.
      There is already Dirac acceleration but this codec is way inferior to H.264.



      • Originally posted by gbeauche
        There is already Dirac acceleration but this codec is way inferior to H.264.
        Exactly, I don't think there has ever been a single comparison that has shown Dirac to be superior to H.264 in any way, shape or form.






        • Gosh, the fanboys from both NVIDIA and ATI are jumping up and down here. Get over it, people. Or at least back up claims with proper data.
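
A minimal sketch of what "backing up claims with proper data" could look like for the Dirac vs. H.264 argument above: decode each codec's output back to raw frames and measure objective quality against the original source. The frame size and file names below are assumptions for illustration, not anything taken from this thread.

```python
# Hedged sketch: compare two lossy encodes against a reference frame using PSNR.
# Frame dimensions and the raw .yuv file names are hypothetical.
import numpy as np

WIDTH, HEIGHT = 1920, 1080  # assumed frame size


def load_luma(path: str) -> np.ndarray:
    """Load the Y (luma) plane of one raw 8-bit YUV420 frame."""
    data = np.fromfile(path, dtype=np.uint8, count=WIDTH * HEIGHT)
    return data.reshape(HEIGHT, WIDTH).astype(np.float64)


def psnr(ref: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((ref - test) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)


if __name__ == "__main__":
    reference = load_luma("reference_frame.yuv")        # hypothetical file
    h264_frame = load_luma("decoded_h264_frame.yuv")    # hypothetical file
    dirac_frame = load_luma("decoded_dirac_frame.yuv")  # hypothetical file
    print(f"H.264 PSNR: {psnr(reference, h264_frame):.2f} dB")
    print(f"Dirac PSNR: {psnr(reference, dirac_frame):.2f} dB")
```

Run over many frames at matched bitrates, numbers like these (or SSIM) would settle the "same data rate, same quality" claim one way or the other.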



          • Originally posted by Qaridarium
            Irony on: 'Oh yes, and I'm the King of China.' Irony off.

            Software engineering is not the same thing as an overheating chip that destroys customers' notebooks.

            Your field is software; NVIDIA's failure was in the electronics, and you are not well versed in that area.

            In software you fail too: Apple should kick you for the Mac OS X 10.6 OpenGL fullscreen bug.
            Where do you think those fan speeds are kept? It ain't in the software.

            1900XT overheats and dies, hmm, sounds familiar, doesn't it?
            Last edited by deanjo; 04 November 2009, 04:24 AM.
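
For what it's worth, on Linux the fan tachometer readings that the firmware manages can at least be read back through the standard hwmon sysfs interface. This is only a hedged sketch, assuming the machine exposes such sensors; it illustrates the point above that software merely observes the fan speeds rather than owning the fan curve.

```python
# Hedged sketch: read fan RPM values exposed by the kernel via sysfs hwmon.
# The fan curve itself lives in firmware (SMC/VBIOS); this code only reads it back.
from pathlib import Path


def read_fan_speeds() -> dict:
    """Return {sensor label: RPM} for every fan input the kernel exposes."""
    speeds = {}
    for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
        name_file = hwmon / "name"
        name = name_file.read_text().strip() if name_file.exists() else hwmon.name
        for fan in hwmon.glob("fan*_input"):
            try:
                rpm = int(fan.read_text().strip())
            except (OSError, ValueError):
                continue  # sensor present but not readable right now
            speeds[f"{name}/{fan.name}"] = rpm
    return speeds


if __name__ == "__main__":
    for sensor, rpm in read_fan_speeds().items():
        print(f"{sensor}: {rpm} RPM")
```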



            • Originally posted by Qaridarium
              Your argumentation is a FAIL....

              Your sources, internal Apple notes in which NVIDIA is doomed, are no sources at all, because no one can check whether they are true; no one can see what really goes on inside Apple...

              No one can check your lie..


              This stupid dog should eat you!
              3 GHz quad-core machine, 8 GB RAM, ATI X1900 video card. When playing for longer periods at a high resolution setting, the video starts to get corrupted. This is temperature related. If I escape to the menu (no 3D graphics) for 10 seconds, it will go away (for a while). I downloaded the SMC fan...






              You fail, my friend, you fail (just like the X1900s)



              • Originally posted by Qaridarium
                You fail.. X1900 + Mac... that means a standard PC has no problem with an X1900.
                Same bloody card, with EFI support. You use a Mac as an example, I'll post one right back at ya. In fact, those cards were manufactured by ATI as well, not even by a 3rd-party partner. Genuine ATI. Not Sparkle, not XFX, not Sapphire....



                • Originally posted by Qaridarium
                  http://en.wikipedia.org/wiki/GeForce_8_Series#Problems

                  "Some chips of the GeForce 8 series (concretely those from the G84 and G86 series) may suffer from an overheating problem. NVIDIA states this issue should not affect many chips,[43] whereas others assert that all of the chips in these series are potentially affected.[43] NVIDIA CEO Jen-Hsun Huang and CFO Marvin Burkett were involved in a lawsuit filed on September 9, 2008 alleging that their knowledge of the flaw, and their intent to hide it, resulted in NVIDIA losing 31% on the stock markets.[44] As a result of this problem, some OEMs[which?] have started using other chipset makers like ATI and are no longer using NVIDIA chips in their newer models."
                  OK, I love Wikipedia quotes, but really, what are you trying to prove? I never said that bumpgate never happened; in fact, I fully acknowledged it. Ask yourself this: why did Apple switch to NVIDIA chipsets for the CPUs that NVIDIA can make chipsets for? I'm simply stating that ATI has had their share of hardware issues as well, and over a long series of Macs more ATI cards have been replaced than NVIDIA cards. This still remains true despite the bumpgate fiasco.



                  • Originally posted by Qaridarium
                    NVIDIA fails on the technical side too:

                    It's in German, but NVIDIA has a big VRAM bug error in the chip. FAIL FAIL FAIL!
                    The VRAM bug has been fixed in the drivers since the 160 series, sorry to burst your bubble.



                    • Originally posted by Qaridarium
                      I cannot find any criticism of AMD cards on Wikipedia: http://en.wikipedia.org/wiki/Radeon_R700

                      http://en.wikipedia.org/wiki/Evergreen_%28GPU_family%29



                      I cannot find any criticism of AMD cards on Wikipedia.

                      On Wikipedia, only NVIDIA fails, fails, fails.

                      I can surely add those in if you want.

                      Such as this article:

                      Some ATI-based graphics cards made by Diamond were reported to fail at higher than expected rates


                      Last edited by deanjo; 04 November 2009, 04:58 AM.

