MythTV Adds Support For NVIDIA VDPAU

  • #41
    Yep, I think you have all the companies right. Some of the volunteering is individual (more than you might think), and some is corporate, but no one company runs the show.

    I guess the point I'm trying to make is that everyone pitches in where they can and we all try to coordinate along the way, in contrast to a normal "managed" development effort where deliverables are identified, effort is estimated, schedules are set, resources are allocated, tasks are doled out, and one or more people oversee the execution against a detailed, published plan.

    That kind of management stuff doesn't seem to go over so well in the open source world. You'd think we were killing kittens or something.

    Anyways, nice talking to you.
    Last edited by bridgman; 01 December 2008, 04:53 AM.
    Test signature



    • #42
      Originally posted by deanjo View Post
      I'm sorry, but there are plenty of bugs in oss both old and new, and they have yet to prove that they will be fixed in any more timely manner than their closed source cousins. Their track record is just as shaky as closed source. The quality of code really has very little to do with the preferred ideals of oss/closed and more with the competence of the coders doing it. Nvidia has constantly proven that they are up to the task.
      Sure you can find bugs in both worlds. The thing is, ultimately I believe you can find more unfixed bugs in closed source drivers/applications. Sure, you can find examples of good closed source apps and bad open source ones, but generally the trend is the other way around. You know the "Many Eyes Make Bugs Shallow" mantra, but I think it's not just that. When you make your code public you generally want to make it as good as you can so you are not embarrassed about it later on. Coding closed source allows some sloppy programming, as no one besides you and your buddies from the office will see the code.

      You say Nvidia should be pressured to change their stance. Why, so it can enjoy the constant hell of X development, lose functionality and put faith in a crew that constantly has to go back to the drawing board to come up with alternative solutions on a monthly basis?
      I think the current state of X is partially nvidia's fault. Releasing a good closed driver that bypasses many X mechanisms made many people stop caring about things like DRI2, for example. Without a good nvidia driver, X might have gained many programmers who right now do not care about it because nvidia works just fine for them.
      A few years ago nvidia was really the only choice for a Linux user. That, or some old radeon 9200. I think many companies at some point followed nvidia's example and provided blobs because it was working so well for nvidia... but you know, so far those companies have failed to provide as good a blob as nvidia does. If not for this bad example, maybe we would have better open source drivers right now.

      There are also long-outstanding bugs on older hardware as well (thinking of S3 and SiS chips from the P2/P3 days that are at least 3 years old), but because nobody really uses them anymore those bugs are left unfixed, probably because of the low priority, so you can't argue that foss guarantees ongoing support either.
      Well, those cards are ancient, so no surprise. How many years has it been since the closed drivers for those were last updated?
      On the other hand, radeon 9200 or even r100 cards still have nice support and get fixes and new stuff, while my geforce 4200 is in the legacy nvidia tree. That geforce is really fine for everything I do on that box and I do not need to upgrade it, but I get almost no support for it.



      • #43
        Originally posted by val-gaav View Post
        Sure you can find bugs in both worlds. The thing is, ultimately I believe you can find more unfixed bugs in closed source drivers/applications. Sure, you can find examples of good closed source apps and bad open source ones, but generally the trend is the other way around.
        100% unproven and without basis.

        You know the "Many Eyes Makes Bugs Shallow" mantra, but I think it's not just that. When you make your code public you generally want to make it as good as you can to not be emarassed about it latter on. Coding as closed source allows some sloppy programming as noone besides you and your buds from the office will see the code.
        There are millions and millions of lines of ugly code in the foss world. There is also pressure on closed source devs, maybe even more, to write good code. A project lead is only going to accept so much crap code before he tells the programmer to hit the road looking for another job. The assumption that companies hire morons for closed code is completely unjustified in the real world.

        I think the current state of X is partially nvidia's fault. Releasing a good closed driver that bypasses many X mechanisms made many people stop caring about things like DRI2, for example. Without a good nvidia driver, X might have gained many programmers who right now do not care about it because nvidia works just fine for them.
        A few years ago nvidia was really the only choice for a Linux user. That, or some old radeon 9200. I think many companies at some point followed nvidia's example and provided blobs because it was working so well for nvidia... but you know, so far those companies have failed to provide as good a blob as nvidia does. If not for this bad example, maybe we would have better open source drivers right now.
        lol, seriously, so Nvidia is responsible for foss development laziness and other companies' inept attempts at bringing a working solution? The "we suck because NV is so good" line is a really weak attempt at justifying the poor state of X. Heck, bridgman has even given examples where, even with all the resources needed for XvMC support being made public, there is still no interest from foss devs in picking it up and implementing it.


        Well, those cards are ancient, so no surprise. How many years has it been since the closed drivers for those were last updated?
        On the other hand, radeon 9200 or even r100 cards still have nice support and get fixes and new stuff, while my geforce 4200 is in the legacy nvidia tree. That geforce is really fine for everything I do on that box and I do not need to upgrade it, but I get almost no support for it.
        So what if the 4200 is in the legacy tree? Its blobs are still regularly updated. Legacy does not mean forgotten or unsupported. Hell, even the original TNT, which is older than your radeons, is still being updated, which is more than can be said of other cards from its era. The last driver available for them was put out a month ago. The point is that the argument that foss drivers ensure ongoing support again "works in theory", but in real life the story is very different.



        • #44
          Originally posted by RobBrownNZ View Post
          Bridgman-
          The 780G point comes from this very site, which said that the AMD UVD stuff would work with UVD2 only. From what I can divine, 780G is UVD(1) and so won't be supported.
          I was also under the impression that it is UVD1 only; wikipedia (yes yes yes) also says so, and I was searching amd.com and could not find it clearly stated.
          Now that might just be me, but it nearly had me buying an nvidia board. Again, not sure what I am going to do...



          • #45
            If we get into the details we just end up confusing everyone, so the amd.com blurb tends to talk about what the chip can do rather than which specific version of UVD (or 3D engine, or display controller, or...) is included.

            The Wikipedia UVD page seems to be just plain wrong -- it says on the page that we use UVD+ in 780 (I have never heard of UVD+) but the link it references for that statement says that 780 uses UVD2.

            If you follow the links you get:

            "The graphics cards with AMD ATi UVD2 are the 3450 3470 3650 3670
            The graphics cards with AMD ATi UVD are the 2400 2600 3850 3870
            The IGP motherboards with AMD ATi UVD2 are the 780G 780GX"

            AFAIK this is wrong as well, but less wrong than the Wikipedia page. My understanding was that:

            - 2300 (rv550), 2400 (rv610), 2600 (rv630), 34xx (rv620), 36xx (rv635), 38xx (rv670) all have UVD1
            - 2900 does not have UVD
            - 3100-3300 (all the variants in 780/790GX family) have UVD2
            - 4xxx have UVD2

            There were incremental improvements along the way in both UVD1 and UVD2 so there are actually more than 2 versions of UVD, but the UVD1/UVD2 split covers the main architectural changes.
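
            Just to make the split easier to eyeball, here is a throwaway C snippet that encodes the mapping above as a lookup table. The family names and UVD assignments are copied from my list; the code itself is purely illustrative and not taken from any driver.

            /* Illustrative lookup of the UVD generation per chip family,
             * as described in the list above. Not derived from driver code. */
            #include <stdio.h>
            #include <string.h>

            struct uvd_entry { const char *family; const char *uvd; };

            static const struct uvd_entry uvd_table[] = {
                { "2300 (rv550)",               "UVD1" },
                { "2400 (rv610)",               "UVD1" },
                { "2600 (rv630)",               "UVD1" },
                { "34xx (rv620)",               "UVD1" },
                { "36xx (rv635)",               "UVD1" },
                { "38xx (rv670)",               "UVD1" },
                { "2900",                       "no UVD" },
                { "3100-3300 (780/790GX IGP)",  "UVD2" },
                { "4xxx",                       "UVD2" },
            };

            /* Return the UVD generation for the first family whose name
             * contains the given chip string, or "unknown" if none match. */
            static const char *uvd_for(const char *chip)
            {
                for (size_t i = 0; i < sizeof(uvd_table) / sizeof(uvd_table[0]); i++)
                    if (strstr(uvd_table[i].family, chip))
                        return uvd_table[i].uvd;
                return "unknown";
            }

            int main(void)
            {
                printf("780G IGP: %s\n", uvd_for("780"));   /* prints UVD2 */
                printf("HD 2900:  %s\n", uvd_for("2900"));  /* prints no UVD */
                return 0;
            }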

            It's possible that someone leaked early info about the 780 and talked about "UVD+" rather than "UVD2", and that the original site was updated after the info was transcribed to Wikipedia. I'm just guessing though, based on the fact that there are two rows of "UVD2" parts, so the last row might have originally said "UVD+". We don't spend time running around correcting leaks and rumours though -- there's too much other work to do first.



            We have not announced anything related to video support with fglrx, and we have no intention of announcing anything that is not ready to use. In the meantime, please let me remind everyone that my comments about UVD2 possibly having a better chance of open source support than UVD1 (because of some internal differences) only relate to open source support and not to anything we might do in fglrx.
            Last edited by bridgman; 01 December 2008, 04:49 PM.
            Test signature



            • #46
              bridgman, that's a great help. Thanks!

              Now, if you could just let us know when the UVD2 stuff in fglrx will be ready for use in MythTV...



              • #47
                Originally posted by RobBrownNZ View Post
                bridgman, that's a great help. Thanks!

                Now, if you could just let us know when the UVD2 stuff in fglrx will be ready for use in MythTV...
                Yeah, that is the million dollar question right now. Well, perhaps not a million dollars, but about $80-$100 for an OK low/mid-end discrete jack-of-all-trades card from either maker.

                Damn you bridgman for being so polite, wise and patient in these forums! I can't even read a thread about an actual working HD implementation from Nvidia without seriously considering getting an ATI card instead. After years of suffering from bad fglrx syndrome, I am pretty sure your boss is not paying you enough.
                Last edited by korpenkraxar; 01 December 2008, 08:38 PM.



                • #48
                  Originally posted by korpenkraxar View Post
                  Yeah, that is the million dollar question right now. Well, perhaps not a million dollars, but about $80-$100 for an OK low/mid-end discrete jack-of-all-trades card from either maker.

                  Damn you bridgman for being so polite, wise and patient in these forums! I can't even read a thread about an actual working HD implementation from Nvidia without seriously considering getting an ATI card instead. After years of suffering from bad fglrx syndrome, I am pretty sure your boss is not paying you enough.
                  or a £50 motherboard...



                  • #49
                    Sorry about mucking up the NVidia thread.
                    Test signature



                    • #50
                      I don't understand the whole attitude of pointing fingers at AMD/ATI. It's like everyone's expecting a complete FOSS driver to be delivered by them, when in fact they've only promised to release documentation, which they've done well on. They never said anything about implementing the open source drivers themselves. fglrx is already fully featured minus the video acceleration part.

                      Now, as far as HW video decode acceleration goes, it's great that nvidia has a working solution. I don't see too much of a problem with a vendor-specific interface, because if and when the "community" decides on a non-proprietary API, they'll probably just provide a wrapper around VDPAU.
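
                      For anyone curious what that vendor-specific interface looks like from the application side, below is a rough, untested C sketch of opening a VDPAU device on the current X display and asking libvdpau for an H.264 decoder. It's written from memory, so treat the resolution and reference-count parameters (and the thin error handling) as illustrative -- this is not MythTV's actual code.

                      /* Rough sketch only -- not MythTV's implementation.
                       * Build (roughly): gcc vdpau_sketch.c -lvdpau -lX11 */
                      #include <stdio.h>
                      #include <vdpau/vdpau.h>
                      #include <vdpau/vdpau_x11.h>

                      int main(void)
                      {
                          Display *dpy = XOpenDisplay(NULL);
                          if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

                          VdpDevice device;
                          VdpGetProcAddress *get_proc_address;

                          /* Entry point exported by libvdpau; the vendor driver
                           * (nvidia's blob in this case) sits behind it. */
                          if (vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                                    &device, &get_proc_address) != VDP_STATUS_OK) {
                              fprintf(stderr, "vdp_device_create_x11 failed\n");
                              return 1;
                          }

                          /* Every other entry point is fetched via get_proc_address. */
                          VdpDecoderCreate *decoder_create = NULL;
                          get_proc_address(device, VDP_FUNC_ID_DECODER_CREATE,
                                           (void **)&decoder_create);
                          if (!decoder_create) { fprintf(stderr, "no decoder entry point\n"); return 1; }

                          VdpDecoder decoder;
                          VdpStatus st = decoder_create(device, VDP_DECODER_PROFILE_H264_HIGH,
                                                        1920, 1080, 16, &decoder);
                          printf("1080p H.264 decoder: %s\n",
                                 st == VDP_STATUS_OK ? "available" : "not available");
                          return 0;
                      }

                      The point is just that everything past device creation is function pointers handed out by the driver, which is why a wrapper for some future common API seems plausible to bolt on top.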

                      HW decode does have one general drawback, though -- you can't do software-driven post-decode processing, at least not yet on any platform AFAIK.

