AMD's UVD2-based XvBA Finally Does Something On Linux

  • Learn to read: it's a 4550! 80 W passively cooled, that's of course something that only Q manages.



    • Q's point that modern systems don't really need video acceleration except to save power is a good one. So is Kano's that fglrx sucks for home users. I'm not sure even AMD would dispute that, it's the whole reason they are supporting the OSS drivers.



      • Oh, and I never understood AMD's insistence on doing monthly driver releases either. This isn't just about Linux, but also the Windows drivers. Why not just do them quarterly, with perhaps an extra release when new hardware ships or something major changes? It just doesn't seem to be a problem for any other hardware company on earth, except for AMD, which insists terrible things would happen if they changed their process. It makes you wonder how much money and how many resources they spend on QA and on releasing so many drivers, and what they could have done in the driver instead if they had refocused some of that energy.



        • Originally posted by smitty3268 View Post
          Q's point that modern systems don't really need video acceleration except to save power is a good one. So is Kano's that fglrx sucks for home users. I'm not sure even AMD would dispute that, it's the whole reason they are supporting the OSS drivers.
          I disagree. I think the reason that AMD supports the OSS drivers is to encourage open-source development and innovation, and to get a one-up on Nvidia. Their open-source driver is... where?



          • Nvidia has got legacy drivers, just not the oldest series (71.xx) for new X servers. OK, that's not nice, but those cards are at least 10 years old. ATI's legacy cards are R300-R500, and those were still being sold in new systems at the time they went legacy; the OSS driver was not really in good shape when those cards were dropped from fglrx. Certainly NV could do more for the OSS drivers, and it is a shame that Fermi cards will not even get modesetting support in nv, but over the next 5 years those cards will most likely be better supported than anything from other companies. As long as you don't have them in a laptop, a replacement should be relatively easy until PCI-E is outdated, so you are not really depending on long-term support.



            • I agree with Kano, 100%. To say the priority is workstation cards or workstation support is to neglect mainstream consumers/customers and throw away an opportunity against Nvidia. Maybe that is not a concern or of any consideration. But there are probably a lot of users who would prefer a driver that encompasses a wide variety of uses, or covers the entire potential of the card.

              Linux users might want a high-performance card, and maybe they dual boot. So an expensive card won't be supported 100%, or the priority won't be there?

              Saying there is OSS support is nice, but it's not a major factor for the average customer! Besides, there's even a report on here showing that the current state of OSS driver support is woefully inferior to fglrx driver performance from something like FOUR YEARS AGO. Most people will use what works and offers optimal performance, whether it's 3D, video playback, whatever. If the priority is Windows and workstations, then admit it, so users can decide whether to get an Nvidia card or dual boot for a while until support improves enough.

              As for the latest claims that the ATI hardware is superior and runs cooler: that's true. That is one reason I wanted a newer ATI card. But this is only an advantage if you're often dual booting into Windows. If the fglrx drivers are shoddy and the OSS drivers are always late and not optimized, then the advantages probably won't be there or won't work fully. I'll hate to do it, but I'll look at the GT 240 since it's cheap, and then wait to see what happens on the ATI side. I also use video a lot, and video issues would be frustrating, as it's probably a chore just to install the drivers in many cases.



              • My card is a LOWER-clocked GT 220; the one tested is an OC card. You must have dreamt that your 4670 needs less power. Where did you see 16 W for your card?

                http://www.tomshardware.com/reviews/...0,2445-16.html



                • Originally posted by Kano View Post
                  Nvidia has got legacy drivers, just not the oldest series (71.xx) for new X servers. OK, that's not nice, but those cards are at least 10 years old. ATI's legacy cards are R300-R500, and those were still being sold in new systems at the time they went legacy; the OSS driver was not really in good shape when those cards were dropped from fglrx. Certainly NV could do more for the OSS drivers, and it is a shame that Fermi cards will not even get modesetting support in nv, but over the next 5 years those cards will most likely be better supported than anything from other companies. As long as you don't have them in a laptop, a replacement should be relatively easy until PCI-E is outdated, so you are not really depending on long-term support.
                  Actually, Kano, I've just recently run 10-year-old nVidia cards on Ubuntu 9.10 with their binary drivers. That kind of support does seem to put ATI well behind in terms of legacy support from binary drivers.

                  For instance, when using an nVidia GeForce2 MX-400, Ubuntu 9.10 and 10.04 Beta will happily install the nVidia blob for you, and that works pretty darn well (as far as that card can, at least).
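Whether the distribution ended up loading the proprietary blob or an OSS driver can be checked from /proc/modules. A minimal sketch on Linux; the module names listed are just common examples, not an exhaustive set:

```python
def gpu_modules(proc_modules_text, known=("nvidia", "nouveau", "fglrx", "radeon")):
    """Return the known GPU kernel modules present in /proc/modules-style text."""
    loaded = {line.split()[0] for line in proc_modules_text.splitlines() if line.strip()}
    return sorted(set(known) & loaded)

# On a live system you would read the real file:
#   with open("/proc/modules") as f:
#       print(gpu_modules(f.read()))
# Example with a fabricated /proc/modules excerpt:
sample = "nvidia 12345678 32 - Live 0x0000\nsnd_hda_intel 40960 4 - Live 0x0000\n"
print(gpu_modules(sample))  # -> ['nvidia']
```

The first whitespace-separated field of each /proc/modules line is the module name, which is all the check needs.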



                  • Originally posted by Qaridarium
                    FOSS really means: do not sell your soul to the devil.

                    Using a closed-source driver or a closed-source OS means selling your soul to the devil.
                    Please tell me you're joking.



                    • Originally posted by Qaridarium
                      "except to save power"

                      this is just wrong. Phoronix tested this: the GPU solution needs MORE power than a modern CPU!

                      you can save power by using the CPU!!!


                      GPU acceleration is a pure lie! an Nvidia marketing lie!
                      Either the test is broken (bad config, bad reference, etc.) or the GPU is broken. The reality is that this is a power-saving solution: the Poulsbo decoder draws around 200 mW. When the overlay is used and the 3D engine is not, the latter is simply shut down.
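To put the power argument in perspective, here is a back-of-envelope comparison. The 0.2 W figure is the Poulsbo decode-block number mentioned above; the 15 W CPU figure is purely an assumed extra package draw for software decode, picked for illustration:

```python
# Rough energy comparison for decoding a 2-hour movie.
HOURS = 2.0
DECODER_W = 0.2    # dedicated video decode engine (Poulsbo figure above)
CPU_EXTRA_W = 15.0 # assumed extra CPU power for software decode (illustrative)

decoder_wh = DECODER_W * HOURS   # energy used by the fixed-function decoder
cpu_wh = CPU_EXTRA_W * HOURS     # energy used decoding in software

print(f"dedicated decoder: {decoder_wh:.1f} Wh")
print(f"CPU software decode: {cpu_wh:.1f} Wh")
print(f"saved per movie: {cpu_wh - decoder_wh:.1f} Wh")
```

Even if the assumed CPU number is off by a factor of two, a fixed-function decoder drawing a few hundred milliwatts wins by a wide margin; whether a given desktop GPU stack actually reaches that state is a separate (driver) question.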

