AMD Catalyst 8.5 For Linux


  • Originally posted by oblivious_maximus View Post
    Yeah sorry, I really was flabbergasted. I think possibly, yes. Maybe I'm mistaken, but wouldn't a majority of the testing done on Debian be applicable to Ubuntu as well? I know Canonical makes a lot of changes but they still get all (nearly all?) their packages from sid, and most of Canonical's changes eventually make it back into Debian. And of course there are more distros based on Debian than just Ubuntu - but there's also starting to be a few distros based on Ubuntu now too. I guess I'd be a lot happier about the situation if you tested on at least one Debian-based distro. If that was Ubuntu I expect it would still be pretty relevant to Debian. IDK though, someone who knows more about the specific differences between Ubuntu and Debian could probably better estimate the value to Debian users of testing on Ubuntu, and vice versa. But like I said, I think it's pretty imperative that AMD is testing on at least one Debian-based distro.
    Agreed. When we were completely workstation focused it wasn't such an issue (according to our info RHEL and SLED make up the overwhelming majority of the workstation market) but now that we're focusing on consumer use as well we have to include a Debian-based system. The user stats seem so overwhelmingly biased towards Ubuntu that we figured Ubuntu testing would actually make more users happy than Debian testing would.

    Originally posted by oblivious_maximus View Post
    Unless Asus and their disappearing-from-the-M3A-bios-memory-remap-option is to blame for my issues, it now seems pretty clear that my HD3650 has been gathering dust for months because AMD has completely ignored an entire, major category of Linux distribution. For example, the bug in authatieventsd.sh where it doesn't point to /var/run/xauth - that's been known since at least 8-3, but with both 8-4 and 8-5 I had to manually patch it. If AMD was testing on a Debian(-based) system, this likely never would have been a bug to begin with (I assume).
    I'll have to go search for your issues and system configuration (I don't keep files on you all) but there's no question that BIOS remapping is still causing a lot of hurt on systems with 4GB or more. At first glance it seems that some BIOSes are saying memory is present in places where it really is not. A couple of people have recently reported good luck limiting the amount of memory used by the kernel to 3.5GB or so; doesn't sound like an ideal solution but it sure seems easier to test than pulling out DIMMs.
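    If anyone wants to try that memory-limit workaround, it is just a kernel boot parameter. Here is a rough sketch for a GRUB legacy setup; the kernel image and root device below are placeholders, so adjust them for your own system:

        # /boot/grub/menu.lst -- append mem= to the end of the existing kernel line
        # (example kernel image and root device; yours will differ)
        kernel /boot/vmlinuz-2.6.24-19-generic root=/dev/sda1 ro quiet mem=3584M

    Rebooting with that in place caps the kernel at roughly 3.5GB without having to pull any DIMMs.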

    Originally posted by oblivious_maximus View Post
    My problem with this statement is this - what about the individual customers who also spend their hard earned cash on your products? Certainly none of us spend in one go what the corporations running SLED and Red Hat do, but we are still your customers, despite the fact that we are an extremely difficult bunch to quantify*.
    Depends what you mean by the question. If you're saying "we should be doing testing on a Debian system now that we're targeting consumer users" then I think we're in violent agreement. If you're saying "no matter what oddball combination of system, distro and patches a customer runs you need to work on their system" that gets tricky because we reach a point of diminishing returns. Top priority is making sure we always run reliably on major out-of-box distros so at least every customer has a starting point and can easily find out what made the difference between working and not-working.

    Originally posted by oblivious_maximus View Post
    It seems to me that Red Hat, SUSE and Debian are the 3 main distros on which the majority of other distros are based. Only testing on 2 out of 3 just doesn't make any sense to me, regardless of what the majority of corporate customers use. It's like a 2-legged tripod... or something, I'm obviously not very good with analogies.
    2-legged tripod is good enough. Nope, we agree.

    Originally posted by oblivious_maximus View Post
    I have problems with that as well. I absolutely, wholeheartedly, unequivocally support your efforts releasing docs and fostering the open drivers - that's the main reason I bought an ATI card I didn't actually require. However until such a time as either of these drivers has something resembling feature parity (notwithstanding the DRM features [not the Direct Rendering Manager] which are likely to remain fglrx-only), fglrx should be aimed at anyone who has purchased your hardware (any which fglrx supports) with the intention of using it on Linux. Anyone with an R600 card and problems with fglrx is basically S.O.L. right now. It would seem doubly so if they use a Debian-based distro.
    I think we will get R600 3D running on open source drivers before we get an installer smart enough to work with every distro variant and patch combination out there. That said, there are only a finite number of open source developers available to help debug system-specific issues and I'm already seeing a lot of users who don't want to run open source drivers because "they have to build them and that's a big pain".

    Originally posted by oblivious_maximus View Post
    I'm tempted to offer up the spare M3A motherboard (AMD770/SB600) I have if it might expedite getting a Debian-based test box up and running (and hopefully finding my issue's remedy). I don't have any of the other necessary parts for a system spare though... If your testing systems are somewhere other than Markham though I don't think it'll happen.
    We have lots of hardware, but thanks. Hardware is not the problem -- good testers are expensive no matter where they are.

    Originally posted by oblivious_maximus View Post
    *I had an idea while writing this post, it's something that would require quite a bit of work though I expect. Anyway here it is: Inside the boxes of ATI cards (channel partners would have to participate also) there could be a slip of paper asking "Do you intend to use this video card under a Linux-based operating system? If so please go to www.amd.com/linuxcounter and let us know to help us make our Linux software as excellent as it can be." At the website would be a short form where customers could enter the product they bought and its serial number(so as to prevent abuse), and the distribution they use. This could help AMD generate some hard data on who's using what and in what numbers. The card-in-the-box isn't strictly necessary either, as long as people know about the site and to submit their info. Anyway, just an idea.
    That's a good idea. There have been some attempts to collect that kind of information, but it's really hard to get the numbers matched up from different sources so the results aren't all that credible. We have a problem reporting tool for Windows (at least we used to) which collected some info about the ATI product and the OS then formatted up an email which could then feed into bug tracking systems, but I think a lot of folks didn't like even that level of "phoning home". Do you think enough people would bother that we would get a representative number ?

    So far the most credible numbers seem to come from third party web sites which track browser and OS type. They don't identify whether the graphics card is ATI or something else, but market share info is available separately and even if we assume we have 100% market share the numbers aren't that high.

    Originally posted by oblivious_maximus View Post
    I hope my frustration hasn't given the wrong impression. Despite it, I'm still a big supporter of AMD (always have been) and the work you cats are doing to provide an open driver, and the interaction/support you, specifically, provide here, bridgman. From what I've seen at nvnews.net while looking for solutions to the less-major issues I have with my Nvidia card, the Nvidia staff there don't do much but make excuses, and sow misinformation. Apart from taking bug reports that is (they're not all bad I guess). Dealing with the same issue driver after driver and reading about many other people doing the same with other issues is a recipe for aggravation though. Anyhow, I think I've posted more than enough here, thanks for being here to hear what we have to say bridgman, I'd be a lot more frustrated were it not for your insights into what goes on behind the scenes.
    Thanks



    • Originally posted by chikazuku View Post
      The open source driver manages 2D and XVideo fine, but 3D is limited to glxgears or it'll bring down the system.
      Dave and Alex pushed some pretty significant fixes for RS4xx recently; have you tried the open source drivers in the last couple of weeks ? You definitely need latest DRM and Mesa, not sure about radeon and X server.

      Here's one of the recent fixes : http://airlied.livejournal.com/59351.html
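      If you want to try the latest bits before your distro packages them, a rough build sketch looks like this (the repository URLs and the Mesa build target below are from memory and may have moved or changed, so treat this as a starting point):

          # fetch and build the current DRM and Mesa trees
          git clone git://anongit.freedesktop.org/mesa/drm
          (cd drm && ./autogen.sh --prefix=/usr && make && sudo make install)

          git clone git://anongit.freedesktop.org/mesa/mesa
          (cd mesa && make linux-dri-x86-64 && sudo make install)

          # quick sanity check: which driver is actually providing 3D ?
          glxinfo | grep -i renderer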
      Last edited by bridgman; 05-30-2008, 11:00 PM.



      • Originally posted by bridgman View Post
        I think we will get R600 3D running on open source drivers before we get an installer smart enough to work with every distro variant and patch combination out there. That said, there are only a finite number of open source developers available to help debug system-specific issues and I'm already seeing a lot of users who don't want to run open source drivers because "they have to build them and that's a big pain".
        Hi bridgman. I just wanted to add that may be true in the short term, but sooner rather than later (hopefully) the open source driver will get built with the xserver. I'm not sure how RPM-based or APT-based distros do it, but with Gentoo it uses an environment variable called VIDEO_CARDS that allows you to set the video drivers you want to install along with the xserver. I'm positive that Ubuntu, Fedora, or SuSE must have some similar mechanism for installing the proper device drivers.
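        For reference, on Gentoo that looks roughly like this (a sketch from memory; the exact driver and package names depend on your hardware and release):

            # /etc/make.conf
            VIDEO_CARDS="radeon radeonhd"

            # rebuild the X server and drivers so the new VIDEO_CARDS setting takes effect
            emerge --ask --newuse xorg-x11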

        All we have to do now is wait for the open source driver to mature. After that there really won't be any point in fglrx. I can understand it as a stopgap, but as long as MS enforces its WHQL certification, fglrx will never be a viable option. Until ATi is able to put its full might behind it, fglrx is effectively worthless in every possible way.

        I personally think they should stop all further development right now, and devote every person that is currently working on it to getting documentation and code out to the open source community. When it all boils down to nothing in the end that really is the only viable choice they actually have. Devoting resources to fglrx is simply delaying the inevitable. And it's a shame too because that delay is only hurting themselves.
        Last edited by duby229; 05-30-2008, 11:17 PM.



        • Originally posted by bridgman View Post
          That's a good idea. There have been some attempts to collect that kind of information, but it's really hard to get the numbers matched up from different sources so the results aren't all that credible. We have a problem reporting tool for Windows (at least we used to) which collected some info about the ATI product and the OS then formatted up an email which could then feed into bug tracking systems, but I think a lot of folks didn't like even that level of "phoning home". Do you think enough people would bother that we would get a representative number ?

          So far the most credible numbers seem to come from third party web sites which track browser and OS type. They don't identify whether the graphics card is ATI or something else, but market share info is available separately and even if we assume we have 100% market share the numbers aren't that high.
          Hi bridgman. Sorry for being a pain. I just wanted to chime in here as well. Thanks for your time and patience...

          I can pretty much promise you beyond the shadow of a doubt that any of the market share numbers you get based on web browser stats and so on are totally incorrect. The problem is that most Linux boxes are servers, and a great deal of them (even newer ones from last year) are using rage128 chips. I'd say that makes up the vast majority of the Linux market share.

          The fact is that a significant linux market exists. But ATi has --not-- leveraged it. This may upset a few people, but I firmly believe that you can easily get away with using a low end radeon on a server board. It may be overkill, but it will have an effect on your bottom line. Especially considering the advances in virtualization technology. AMD will be releasing a new IOMMU that supports virtualizing IO in the near future. That in combination with the proliferation of Linux thin clients in various workstation environments makes a significant market for you to target.

          Additionally, even though desktop Linux makes up a small minority of Linux market share, the number of video cards being used in desktop Linux all added up equates to millions of dollars that are mostly going into nVidia's pockets. It may be small in comparison, but we are still talking about --millions-- of dollars. If you guys took the time to get the drivers in order, and then --specifically-- marketed towards the Linux community I could absolutely guarantee your revenues would increase by a worthwhile margin.



          • Originally posted by duby229 View Post
            Hi bridgman. I just wanted to add that may be true in the short term, but sooner rather than later (hopefully) the open source driver will get built with the xserver. I'm not sure how RPM-based or APT-based distros do it, but with Gentoo it uses an environment variable called VIDEO_CARDS that allows you to set the video drivers you want to install along with the xserver. I'm positive that Ubuntu, Fedora, or SuSE must have some similar mechanism for installing the proper device drivers.
            This is a tough one. Most users (and most distros) try to avoid updating the server if possible, because so many other bits need to match. There's a lot to be said for picking up an entire xorg release, which AFAIK includes all the drivers as well.

            That said, I don't actually remember seeing any update mechanism that allows you to pull down an entire xorg release set. Nuts, something else to go learn
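            To be fair, on the Debian/Ubuntu side the open source drivers already ride along with the distro's xorg packaging, so picking them up is usually just something like the following (package names from memory, they vary by release):

                apt-cache search xserver-xorg-video | grep -i -e ati -e radeon
                sudo apt-get install xserver-xorg-video-ati xserver-xorg-video-radeonhd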

            Originally posted by duby229 View Post
            All we have to do now is wait for the open source driver to mature. After that there really won't be any point in fglrx. I can understand it as a stopgap, but as long as MS enforces its WHQL certification, fglrx will never be a viable option. Until ATi is able to put its full might behind it, fglrx is effectively worthless in every possible way.
            Not sure I understand this. WHQL is a driver quality and conformance test. No test is perfect but in almost every case passing WHQL makes the driver better not worse. WHQL doesn't care much about OpenGL, of course, and we don't run WHQL on the Linux driver builds; I don't think the common code is adversely affected by having to pass WHQL on the Windows drivers though... am I not getting your point ?

            Originally posted by duby229 View Post
            I personally think they should stop all further development right now, and devote every person that is currently working on it to getting documentation and code out to the open source community. When it all boils down to nothing in the end that really is the only viable choice they actually have. Devoting resources to fglrx is simply delaying the inevitable. And it's a shame too because that delay is only hurting themselves.
            Remember what the fglrx driver was initially written for -- workstation systems, generally OEM preloads with pre-installed drivers, where the requirement is maximum performance, ISV certification and specific (sometimes uncommon) OpenGL features. We do see open source drivers being used very widely, but there are three areas where the open source drivers are not going to be able to compete :

            - workstation graphics (this is a highly competitive market; nobody is going to dump their proprietary software into an open source driver)

            - gaming (same thing; proprietary software is a big part of the competitive advantage)

            - legal playback of BluRay etc; again, not possible with an open source driver stack

            For most other scenarios, I expect users will get an open source driver out-of-the-box and never bother to upgrade.



            • Originally posted by duby229 View Post
              Hi bridgman. Sorry for being a pain. I just wanted to chime in here as well. Thanks for your time and patience...
              I thought you chimed in 20 minutes ago

              Originally posted by duby229 View Post
              I can pretty much promise you beyond the shadow of a doubt that any of the market share numbers you get based on web browser stats and so on are totally incorrect. The problem is that most Linux boxes are servers, and a great deal of them (even newer ones from last year) are using rage128 chips. I'd say that makes up the vast majority of the Linux market share.
              Agreed, but server numbers don't really count. A lot of those servers already have Radeon 7000 or ES1000 graphics, and we have always supported those with open source drivers. We have a dedicated team for supporting server OEMs with open source graphics drivers, although they are starting to work on the newer open source drivers after all our server customers are happy

              Originally posted by duby229 View Post
              The fact is that a significant linux market exists. But ATi has --not-- leveraged it. This may upset a few people, but I firmly believe that you can easily get away with using a low end radeon on a server board. It may be overkill, but it will have an effect on your bottom line. Especially considering the advances in virtualization technology. AMD will be releasing a new IOMMU that supports virtualizing IO in the near future. That in combination with the proliferation of Linux thin clients in various workstation environments makes a significant market for you to target.
              Yep. Again, the Radeon 7000 is already in a lot of thin client systems -- I'm constantly surprised how many -- but again most of the thin client developers want to write, or at least modify their own drivers so we tend to either license source or steer them towards the open source drivers.

              Originally posted by duby229 View Post
              Additionally, even though desktop Linux makes up a small minority of Linux market share, the number of video cards being used in desktop Linux all added up equates to millions of dollars that are mostly going into nVidia's pockets. It may be small in comparison, but we are still talking about --millions-- of dollars. If you guys took the time to get the drivers in order, and then --specifically-- marketed towards the Linux community I could absolutely guarantee your revenues would increase by a worthwhile margin.

              Agreed. The problem is that even millions of dollars (unless it's a lot of millions ) doesn't go that far once you start thinking about margins vs. revenues. Most Linux users tend towards low end cards, and don't have a compelling need to upgrade frequently because of the limited gaming options other than running under Wine.

              Which leads me to a Friday night rant :

              "OK, so we had good Windows drivers but everyone wanted Linux drivers. Now we have Linux drivers, and what does everyone want to do with them ? Run Windows apps . Augggh !!!".



              • Originally posted by bridgman View Post
                This is a tough one. Most users (and most distros) try to avoid updating the server if possible, because so many other bits need to match. There's a lot to be said for picking up an entire xorg release, which AFAIK includes all the drivers as well.

                That said, I don't actually remember seeing any update mechanism that allows you to pull down an entire xorg release set. Nuts, something else to go learn
                Once the drivers are mature and they work for that user's hardware he won't have to worry about it. This is the goal that ATi should be working towards. Fully supporting each and every new device on launch day with open drivers. That way the user won't have to worry about it. Software gets updated as part of the regular update cycle. This is of course the ideal, and nobody here is expecting perfection. But you know, some parity would be nice.

                Originally posted by bridgman View Post
                Not sure I understand this. WHQL is a driver quality and conformance test. No test is perfect but in almost every case passing WHQL makes the driver better not worse. WHQL doesn't care much about OpenGL, of course, and we don't run WHQL on the Linux driver builds; I don't think the common code is adversely affected by having to pass WHQL on the Windows drivers though... am I not getting your point ?
                Oh come now. I don't believe that you're that naive. You know exactly what WHQL certification requires. MS has already been convicted of it in the past. It's the very reason why years ago when nVidia tried to open source their drivers they had to retract the code before it got published.

                Originally posted by bridgman View Post
                Remember what the fglrx driver was initially written for -- workstation systems, generally OEM preloads with pre-installed drivers, where the requirement is maximum performance, ISV certification and specific (sometimes uncommon) OpenGL features. We do see open source drivers being used very widely, but there are three areas where the open source drivers are not going to be able to compete :

                - workstation graphics (this is a highly competitive market; nobody is going to dump their proprietary software into an open source driver)

                - gaming (same thing; proprietary software is a big part of the competitive advantage)

                - legal playback of BluRay etc; again, not possible with an open source driver stack

                For most other scenarios, I expect users will get an open source driver out-of-the-box and never bother to upgrade.

                Dedicate more than just 2 guys then. If you want maximum performance give us more than 2 people working on the documentation and drivers. Take the people that are working on fglrx and dedicate them to the open source projects. I'm sure the guys working on DRM could use a bit of help right now. The guys working on radeon and radeonhd could certainly use a hand.

                And saying that nobody is going to dump their project into an open source driver is cocky and arrogant at best. The bottom line is that open source projects produce superior products every single time, a fact that has been proven again and again whenever it has been tested. If anything they should be --encouraged-- to use open source code and projects as often as possible.

                As far as gaming goes, the only limiting factor at this point is driver support. And that is entirely ATi's fault and nobody else's. Gaming on an open source driver could be and --will-- be just as good as and better than what a closed driver will be capable of delivering. Sooner rather than later the closed driver will be surpassed. It's inevitable. It will happen.

                And as far as HDCP goes, screw them... It will be hacked... It's inevitable. And it's gonna be done in such a way as to be completely 100% transparent. I've got more to say on it, but this isn't the time or the place. Suffice to say that it isn't your concern. Don't waste your time.



                • Originally posted by duby229 View Post
                  Once the drivers are mature and they work for that user's hardware he won't have to worry about it. This is the goal that ATi should be working towards. Fully supporting each and every new device on launch day with open drivers. That way the user won't have to worry about it. Software gets updated as part of the regular update cycle. This is of course the ideal, and nobody here is expecting perfection. But you know, some parity would be nice.
                  You might be surprised how close we are to this already. Stay tuned.

                  Originally posted by duby229 View Post
                  Oh come now. I don't believe that you're that naive. You know exactly what WHQL certification requires. MS has already been convicted of it in the past. It's the very reason why years ago when nVidia tried to open source their drivers they had to retract the code before it got published.
                  Sure, but I think you're talking about use of Microsoft IP in general not WHQL itself.

                  Originally posted by duby229 View Post
                  Dedicate more than just 2 guys then. If you want maximum performance give us more than 2 people working on the documentation and drivers. Take the people that are working on fglrx and dedicate them to the open source projects. I'm sure the guys working on DRM could use a bit of help right now. The guys working on radeon and radeonhd could certainly use a hand.
                  We're at 4 now; it just takes time to hire and train. The downside is that while we're bringing in new people and getting them up to speed we aren't working on documentation and drivers, so there's a bit of a balancing thing going on.

                  Most of the work is actually in mesa and drm these days; now that radeonhd has DRI support it can start lighting up the 3D drivers just like radeon.

                  Originally posted by duby229 View Post
                  And saying that nobody is going to dump their project into an open source driver is cocky and arrogant at best. The bottom line is that open source projects produce superior products every single time, a fact that has been proven again and again whenever it has been tested. If anything they should be --encouraged-- to use open source code and projects as often as possible.
                  OK, let me say it differently

                  We are not going to dump our workstation technology into an open source driver. Our competitors are not likely to dump their workstation technology into an open source driver either. In cases where software technology is part of a company's competitive position opening up the code is not an automatic win.

                  I agree that you normally see benefits from open sourcing projects, but you need to consider that there is a selection process happening before the decision is made, ie in general projects which are expected to benefit from open sourcing get open sourced, and projects which are not expected to benefit from open sourcing do not get open sourced. What you typically end up with is a compromise where some bits are opened up and others are not.

                  We looked at the market requirements and concluded that a two driver approach was probably the best starting point. That doesn't stop us from treating parts of the fglrx driver as open source projects -- the bottom end of the kernel driver has been GPL'ed for a long time, and the per-distribution packaging scripts are open sourced as well (big thanks to everyone involved !).

                  Originally posted by duby229 View Post
                  As far as gaming goes, the only limiting factor at this point is driver support. And that is entirely ATi's fault and nobody else's. Gaming on an open source driver could be and --will-- be just as good as and better than what a closed driver will be capable of delivering. Sooner rather than later the closed driver will be surpassed. It's inevitable. It will happen.
                  I'm not so sure about this one. It's not a question of talent -- there are some *very* good developers working in open source -- but I haven't found a single open source developer willing to commit the time and (mind numbing) effort necessary to make a top-notch gaming or workstation driver. If you don't believe me go ask on any of the IRC channels where the top open source devs hang out. They'll all tell you the same thing -- "we *could* do it but it's not likely we'll ever have time".

                  Originally posted by duby229 View Post
                  And as far as HDCP goes, screw them... It will be hacked... It's inevitable. And it's gonna be done in such a way as to be completely 100% transparent. I've got more to say on it, but this isn't the time or the place. Suffice to say that it isn't your concern. Don't waste your time.
                  Sure, I can point you to all kinds of good solutions for hacking HDCP. That's not the point (I can't believe I'm getting into a DRM discussion on Friday night ) -- we will implement legal BluRay playback only because a large customer is willing to buy a huge heap of chips if we do. You as Linux users need to decide what you want the future to look like -- if you want the kind of huge growth that everyone talks about you are going to have to deal with issues like DRM because the user mix will be completely different from what it is today. I'm not sure that's what everyone in the Linux community wants.

                  I don't know how this is going to play out -- for now I am just putting legal playback of protected video (yes, this is a country-specific thing) on the list of functions which so far we only know how to do with a closed-source driver.
                  Last edited by bridgman; 05-31-2008, 01:04 AM.



                  • RPMs still won't build on Fedora 8 x86_64

                    I previously mentioned (probably in the Catalyst 8.4 thread) that I still couldn't get packages for Fedora 8 x86_64 to build, and someone else said they built fine, so I thought perhaps the problem was my system.

                    Today I just did a fresh install of Fedora 8 x86_64, and have verified that Fedora 8 RPMs can't be built from any Catalyst installer from 8.1 through 8.5.

                    For each Catalyst release I tried, I used a command like:

                    ./ati-driver-installer-8-5-x86.x86_64.run --buildpkg Fedora/F8 2>&1 | tee fglrx-8.5-f8.log
                    The logs I captured from all five versions of Catalyst can be seen here.

                    Note: this has nothing to do with whether the drivers work on Fedora 8; I've successfully used Catalyst 8.4 on Fedora 8 for some time now. This is purely a packaging issue.
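                    In case it helps anyone trying to reproduce this, the installer can also list the distro packaging targets it claims to support (the flag name below is from memory, so double-check against the installer's --help output if it doesn't work):

                        ./ati-driver-installer-8-5-x86.x86_64.run --listpkg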



                    • The power just went out for the second time; think I'll either quit for the night or limit myself to short answers



                      • Originally posted by bridgman View Post
                        You might be surprised how close we are to this already. Stay tuned.
                        Well that's cool. I'm always excited to hear about new progress.


                        Originally posted by bridgman View Post
                        Sure, but I think you're talking about use of Microsoft IP in general not WHQL itself.
                        It's unavoidable. I think this is part of MS's plan and ATi is helping them propagate it. Call me paranoid, but the evidence is clear and can't be disputed. MS has been inserting its own code into various Linux projects (like fglrx, mono, and more) and is continuing to do so even today. I don't think MS would have a leg to stand on with their infringement argument without it.


                        Originally posted by bridgman View Post
                        We're at 4 now; it just takes time to hire and train. The downside is that while we're bringing in new people and getting them up to speed we aren't working on documentation and drivers, so there's a bit of a balancing thing going on.

                        Most of the work is actually in mesa and drm these days; now that radeonhd has DRI support it can start lighting up the 3D drivers just like radeon.
                        Well, that's better than I was aware. I still feel fglrx should be killed, and the manpower devoted to it should be shifted to the open source projects.


                        Originally posted by bridgman View Post
                        OK, let me say it differently

                        We are not going to dump our workstation technology into an open source driver. NVidia is not likely to dump their workstation technology into an open source driver. Open source projects work very well but in cases where software technology is part of a company's competitive position opening up the code is not an automatic win.

                        I agree that you normally see benefits from open sourcing projects, but you need to consider that there is a selection process happening before the decision is made, ie in general projects which are expected to benefit from open sourcing get open sourced, and projects which are not expected to benefit from open sourcing do not get open sourced.
                        I disagree. I can't see one instance where a closed driver would provide some kind of benefit. If you can name one instance where a closed driver might be of some benefit then you might persuade me, but I can't think of a single one.

                        The only thing I can think of is that you're thinking that using a closed source driver is somehow going to "hide" something. Come now, I'm not that naive. I may be a noob, but I know full well that the first thing nVidia does when ATi releases a new driver is reverse compile it, and you do the exact same thing with theirs. When ATi releases a new chip the first thing nVidia does is put it under an electron microscope, figuratively and literally, and you guys do the same.


                        Originally posted by bridgman View Post
                        I'm not so sure about this one. It's not a question of talent -- there are some *very* good developers working in open source -- but I haven't found a single open source developer willing to commit the time and (mind numbing) effort necessary to make a top-notch gaming or workstation driver. If you don't believe me go ask on any of the IRC channels where the top open source devs hang out. They'll all tell you the same thing -- "we *could* do it but it's not likely we'll ever have time".
                        And that is entirely ATi's fault, 100%. They aren't actually expected to work for free, are they? They have to have a day job. These guys have families to feed just like everybody else. Pay them enough to make it their day job, and put the guys from fglrx to work on it full time too. It's not magic. It requires some talent yes, but it also requires some funding and incentive. Bottom line it's ATi hardware and if they want to tap into this market then they need to devote the resources required to make it worthwhile.


                        Originally posted by bridgman View Post
                        Sure, I can point you to all kinds of good solutions for hacking HDCP. That's not the point (I can't believe I'm getting into a DRM discussion on Friday night ) -- we will implement legal BluRay playback only because a large customer is willing to buy a huge heap of chips if we do. You as Linux users need to decide what you want the future to look like -- if you want the kind of huge growth that everyone talks about you are going to have to deal with issues like DRM because the user mix will be completely different from what it is today. I'm not sure that's what everyone in the Linux community wants.

                        I don't know how this is going to play out -- for now I am just putting legal playback of protected video (yes, this is a country-specific thing) on the list of functions which so far we only know how to do with a closed-source driver.
                        I'm not much in the mood for debating DRM at the moment, but suffice to say that HDCP itself breaks at least half a dozen laws in the US alone, and by supporting it in your drivers ATi is too. It's simply not worth your time. It will be hacked.

                        I would love someone to try and sue me for using an HDCP hack. I'm a poor person, but I'd take it to court and run with it because I know for a fact beyond the shadow of a doubt that I would win by a significant margin.

                        Bottom line is HDCP really isn't your concern. It is currently being worked on by --far-- more capable open source projects. Your efforts aren't wanted or needed.
                        Last edited by duby229; 05-31-2008, 01:39 AM.



                        • Originally posted by duby229 View Post
                          I disagree. I can't see one instance where a closed driver would provide some kind of benefit. If you can name one instance where a closed driver might be of some benefit then you might persuade me, but I can't think of a single one.
                          I have already offered a couple -- protecting competitive advantage in our workstation and gaming products, and fulfilling legal requirements to protect high-def video content. If you keep saying "those don't count" then I admit I will eventually run out of reasons

                          Originally posted by duby229 View Post
                          The only thing I can think of is that you're thinking that using a closed source driver is somehow going to "hide" something. Come now, I'm not that naive. I may be a noob, but I know full well that the first thing nVidia does when ATi releases a new driver is reverse compile it, and you do the exact same thing with theirs. When ATi releases a new chip the first thing nVidia does is put it under an electron microscope, figuratively and literally, and you guys do the same.
                          Actually, I don't think either of us do that. I do remember the days when memory chips and microprocessors were run through a SEM and blown up to cover the floor of a large room (I may still have some rolled up transparencies down in the basement) but it's not worth the effort. The point is not "stealing ideas", it's that when you put something out in open source your ability to legally protect the IP is impaired.

                          I don't think either of us would live long enough to pick through a disassembled copy of a modern driver. The drivers were over 2M lines of code when I joined ATI and maybe ten times that today. It's just not worth the effort to pick through a code base that size. Heck, it would be more efficient to randomly dial extensions at your competitor and ask each person who answered to explain how the technology worked. I'm sure someone would tell you

                          Originally posted by duby229 View Post
                          And that is entirely ATi's fault, 100%. They aren't actually expected to work for free, are they? They have to have a day job. These guys have families to feed just like everybody else. Pay them enough to make it their day job, and put the guys from fglrx to work on it full time too. It's not magic. It requires some talent yes, but it also requires some funding and incentive. Bottom line it's ATi hardware and if they want to tap into this market then they need to devote the resources required to make it worthwhile.
                          But if we hire enough developers to write a competitive high performance driver, which would mean that we would end up with about 90% of the open source graphics community on our payroll, how is making it open source going to improve anything ? Almost all the people who could make it better would already be working for us

                          You do understand that most of the developers who contribute to fglrx are working on common code and are not dedicated to Linux, right ? Are you asking us to shut down Windows development in order to make a better open source driver, or just have the Linux-specific people work on open source ?

                          Originally posted by duby229 View Post
                          I'm not much in the mood for debating DRM at the moment, but suffice to say that HDCP itself breaks at least half a dozen laws in the US alone, and by supporting it in your drivers ATi is too. It's simply not worth your time. It will be hacked. I would love someone to try and sue me for using an HDCP hack. I'm a poor person, but I'd take it to court and run with it because I know for a fact beyond the shadow of a doubt that I would win by a significant margin.
                          OK, you're losing me here. HDCP is an Intel hardware standard for encrypting the video that goes out to a display. What laws does it break ?

                          Originally posted by duby229 View Post
                          Bottom line is HDCP really isn't your concern. It is currently being worked on by --far-- more capable open source projects. Your efforts aren't wanted or needed.
                          Again, I feel like I'm missing something here. HDCP is implemented in hardware; the software just turns it on and off and handles error conditions. What are the open source projects doing ?
                          Last edited by bridgman; 05-31-2008, 03:06 AM.



                          • Sorry bridgman. Not trying to be a pain or anything, I promise.

                            I guess we'll have to agree to disagree. It's a shame though, because I'm 100% convinced that if you dropped all support for DRM today, and stopped all further development on fglrx and devoted those resources to open source you'd be able to capture the entire Linux market. It wouldn't have any effect at all on your Windows market. Just keep on doing what you do there...

                            I may not know the Linux graphics subsystem very well, but I'm not entirely clueless. I do have at the very minimum at least a fundamental understanding of microarchitecture and fabrication. At the very minimum I understand the fundamental ideas behind modern driver development. I'm no expert by far, but I'm not entirely naive either. I can understand why you publicly deny reverse compiling nVidia's drivers, and looking at the architecture, but come on now... I've seen an application that could take a binary and convert it directly into C. Granted it didn't have any comments, and reading through it was very cryptic, but you could still follow the code.

                            And I do understand that most of fglrx is common code. But Windows drivers don't just run magically on Linux. Take those precious resources and devote them to open source. You'll wind up with a much better product in the end.

                            And again as far as HDCP goes, I really don't want to get into it, but suffice to say that I don't have ready access to the decryption key. I can think of at least half a dozen laws that apply. And because of this every possible effort should be made to disable the DRM in hardware, and to circumvent the DRM in software. Those efforts are already underway. Fortunately the open source ATi drivers, radeon and radeonhd, won't support the DRM in hardware. The only thing left is circumventing the software. And like I said those efforts are already underway. Sooner rather than later all of us will be able to watch and copy DRM'd content entirely with open source drivers and software. And we'll be able to do it 100% transparently.
                            Last edited by duby229; 05-31-2008, 09:16 AM.



                            • I'm guessing you want something like what Intel is doing? Dump money and support and keep everyone happy. Granted it works, and the Linux Intel driver is probably the best out there. However, it would be no surprise to me that once Larrabee hits the streets, Intel would pursue its own Linux program and not be as supportive anymore.



                              • Originally posted by Melcar View Post
                                I'm guessing you want something like what Intel is doing? Dump money and support and keep everyone happy. Granted it works, and the Linux Intel driver is probably the best out there. However, it would be no surprise to me that once Larrabee hits the streets, Intel would pursue its own Linux program and not be as supportive anymore.
                                Not even really to that extent. I don't like Intel's monopolistic practices, but you have to admit they have a much better driver than any of the drivers for ATi. Obviously it works. ATi doesn't even have to go so far as Intel did. Although it would be nice for them to really get involved with open source projects like GEM, or Gallium, or kernel modesetting. It would be cool if ATi officially supported and funded projects like these exclusively.

                                The way I consider it is that right now it is Intel that is shaping the future of the Linux graphics market. nVidia doesn't have a CPU. Unless nVidia comes up with a decent x86 CPU in the next 5 years they will be left in the dust. It's almost certain that nVidia is going to die. And it is Intel that is laying the groundwork for the future of graphics in Linux. Not ATi.

                                We all know ATi has the better hardware, and I personally don't think Larrabee has anything interesting to offer. A bunch of broken-down simplified x86 cores ain't gonna cut it. The ISA just doesn't scale well to multicore. Especially not something so massively parallel. I don't think Intel has now or will have in the near future a graphics architecture capable of competing on performance or functionality. However it is beyond a doubt that Intel will take the Linux market given current trends.

