fglrx sucks...


  • #11
    http://www.csn.ul.ie/~airlied/ says David is an employee of Red Hat, though as I implied in my post, I have no knowledge of whether they are being indirectly financed by ATI/AMD.

    From what I understand from past Phoronix articles, ATI/AMD originally contracted Novell to develop radeonhd under NDA for the new cards while simultaneously releasing specs publicly, though at a later date than the ones available to the devs. The radeon guys then started to implement features using AtomBIOS and the released specs, and did so at a much faster pace.

    Anyway, most of that is idle speculation, but you're definitely right that it seems to be a major waste of resources. I don't see why we need two open-source drivers for the same cards. With that said, there has been some sharing of code, as evidenced by the latest patches to hit radeonhd, which are taken from radeon. Also, it seems that the radeonhd devs have been under some pressure from AMD (http://www.phoronix.com/scan.php?pag...atombios&num=1), possibly due to not providing good returns for the investment, heh.



    • #12
      Originally posted by duby229
      and wasting even more time porting your new proprietary code base from Windows, and then even more time and money trying to work around your flawed, DRM-laden video hardware, it would already be done.
      I agree with you that ATI has wasted time and money funding radeonhd, but hey, it was their first attempt at open source.
      But I don't think they wasted their time polishing fglrx. If there were no fglrx, we would not be able to play any newer games even now, since the OSS graphics stack just sucks right now.
      A friend of mine has a Linux laptop with an Intel chip, and while it's all cool as long as you stick with 2D and simple 3D, it totally breaks down when it comes to gaming. Many games don't even start because this or that patented extension is missing.
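
      To illustrate what I mean (a made-up sketch of mine, not code from any real game): the classic blocker is the patented S3TC texture compression extension. A game that needs it typically checks the GL extension string right after creating its context and refuses to start when the extension is missing, roughly like this (SDL 1.2 for the window/context; the file name, window size and messages are invented, only the extension name is the real one):

      Code:
      /* Made-up sketch: bail out when the patented S3TC texture compression
       * extension is not advertised by the driver.
       * Build with something like: gcc s3tc_check.c -lSDL -lGL */
      #include <stdio.h>
      #include <string.h>
      #include <SDL/SDL.h>
      #include <GL/gl.h>

      int main(void)
      {
          const char *exts;

          if (SDL_Init(SDL_INIT_VIDEO) != 0) {
              fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
              return 1;
          }
          /* A tiny window is enough to get a current GL context. */
          if (SDL_SetVideoMode(64, 64, 0, SDL_OPENGL) == NULL) {
              fprintf(stderr, "No GL context: %s\n", SDL_GetError());
              SDL_Quit();
              return 1;
          }

          exts = (const char *) glGetString(GL_EXTENSIONS);
          if (exts == NULL || strstr(exts, "GL_EXT_texture_compression_s3tc") == NULL) {
              /* This is the point where many games simply refuse to run. */
              fprintf(stderr, "S3TC not available, aborting.\n");
              SDL_Quit();
              return 1;
          }

          printf("S3TC available, the game would keep loading.\n");
          SDL_Quit();
          return 0;
      }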

      Then there is DRM, for which a closed-source stack is a requirement. Although I don't need it, it's up to ATI to decide whether it's worth keeping fglrx just for that.

      My initial idea was to reuse as much of the OSS driver as possible for fglrx, so the latter would get all the improvements like DRI2, KMS and EXA for free. This way more work would be put into the OSS driver, while we would still keep the advantages of a closed-source driver.



      • #13
        Originally posted by madman2k
        I agree with you that ATI has wasted time and money funding radeonhd [...]
        Hm, I can't see why the money is wasted. radeon supports so many old models, and it makes sense to start from a new base for only the new chips. And AFAIK the radeon 3D support was only possible because they could use code from radeonhd. For me both projects make perfect sense. Radeonhd seems more stable to me, maybe even "cleaner"; it runs perfectly here, and I just hope to get 3D for my 780G onboard chip soon...
        And what about the Mac models of graphics cards without AtomBIOS? I am happy that there is radeonhd. And even if radeon adds "cooler features", radeonhd can easily adopt them and everyone is happy?



        • #14
          Originally posted by crumja
          Anyway, most of that is idle speculation, but you're definitely right that it seems to be a major waste of resources. I don't see why we need two open-source drivers for the same cards. With that said, there has been some sharing of code, as evidenced by the latest patches to hit radeonhd, which are taken from radeon. Also, it seems that the radeonhd devs have been under some pressure from AMD (http://www.phoronix.com/scan.php?pag...atombios&num=1), possibly due to not providing good returns for the investment, heh.
          So in the end what we end up with is two radeon drivers going by different names using the same code (and even more so after the mode-setting code gets stripped out and put into the kernel)... So what exactly is the point?
          Last edited by duby229; 27 July 2008, 12:19 PM.



          • #15
            I have to split my responses up a little bit, but I just wanted to thank you so much for bringing up some very good points.

            Originally posted by madman2k
            I agree with you that ATI has wasted time and money funding radeonhd, but hey, it was their first attempt at open source.
            But I don't think they wasted their time polishing fglrx. If there were no fglrx, we would not be able to play any newer games even now, since the OSS graphics stack just sucks right now.
            That is not true at all. Right now the fglrx driver is plagued by instability, across all product lines. Radeon is fairly stable. It works reliably and in some instances has feature support that fglrx doesn't have. The perfect example of that is Dave's recent work on tearless video playback. Another example is seamless VT switching. Yet another example is better Wine compatibility. fglrx does not have a single benefit over radeon. Not even a single one. The only problem with radeon is that ATI has --not-- allocated the required resources to get 3D support up to par, and they still have to work around their flawed video hardware due to DRM.

            A friend of mine has a Linux laptop with an Intel chip, and while it's all cool as long as you stick with 2D and simple 3D, it totally breaks down when it comes to gaming. Many games don't even start because this or that patented extension is missing.
            Again wrong. Intel's open-source graphics driver uses Mesa for its OpenGL implementation. And while, yes, Mesa is not the most modern OpenGL implementation, it is new enough to run every single native Linux game right now. You may have some problems with Wine, but you'd get those same problems using any open-source driver, and that is due entirely to the Wine developers using OpenGL functions that aren't available in Mesa. Blame Wine; it is 100% their fault.

            Then there is DRM, for which a closed-source stack is a requirement. Although I don't need it, it's up to ATI to decide whether it's worth keeping fglrx just for that.
            Sorry, but this is again wrong. Let's take AACS for example. It is already cracked. You do not need any form of hardware-based DRM to play back AACS-restricted content. The only thing we are waiting for now is proper playback support in open-source players like MPlayer, xine-lib or FFmpeg. It's going to happen whether ATI or Hollywood likes it or not... DRM will be cracked, and the DRM hardware will be worked around. It's totally inevitable, and any resource devoted to DRM is a complete waste of time.

            My initial idea was to reuse as much of the OSS driver as possible for fglrx, so the latter would get all the improvements like DRI2, KMS and EXA for free. This way more work would be put into the OSS driver, while we would still keep the advantages of a closed-source driver.
            Which won't happen. I've explained why in my response to your opening post.



            • #16
              Hold on guys, everyone is running off in different directions here. Most of the work done on radeon in the last 6 months has been either fixes and improvements for older chips or new acceleration support that we couldn't add to radeonhd yet because DRI support wasn't there -- and all that code is now in radeonhd as well. Alex works for AMD and Dave works for Red Hat, BTW, not the other way round.

              The big initial win with radeon was last fall, because it already *had* a lot of code implemented for earlier chips and that code could be used with relatively minor changes for 5xx since the 2d block was the same. In hindsight we probably should have split between 5xx and 6xx rather than between 4xx and 5xx, but these are the things you learn over time.

              RealNC, you are completely missing the point with your comments about open source being the enemy. Our competitors are other hardware vendors, not other software developers -- it's just that writing code and making it public makes not only the code but the concepts and methods available to other hardware vendors. Intel is not in the high performance 3d space right now so they don't have as much to protect in that regard as ATI and NVidia. A year or two from now this will all be academic anyways since the open source memory management will have matured (probably with some help from us) enough that this will not be an issue.

              GPL does nothing to protect the underlying IP, just the actual line-by-line code. The only thing that can protect the underlying IP is software patents, and even those are a relatively unsatisfactory solution.

              crumja, when you talk about "underlying improvements in Mesa" are you talking about Gallium ? Mesa 7.1 is pretty much locked and I'm not aware of any underlying changes there other than a lot of vendor-specific improvements for Intel, Radeon and Nouveau.

              Bottom line, I find this whole thing puzzling and borderline offensive. A lot of the work on memory management over the last year or two was driven by Tungsten Graphics and Red Hat, not just Intel -- but you are completely ignoring their contributions in order to make a better argument to bash us with. Are you sure this is the right thing to do ?

              If Linux were the only OS in the world none of these discussions would be happening and we would probably be doing exactly what you expect. We need to make sure that opening up things for Linux does not hurt our ability to compete in those other OSes, where most of our sales are made. You guys ask us to understand your issues and needs and we try really hard to do that; why can't you do the same for us ?
              Last edited by bridgman; 27 July 2008, 01:53 PM.



              • #17
                bridgman: Thanks for clarifying. I thought, though, that the recent news about R500 and RS690 support all featured the radeon driver.

                As for the Mesa changes, I was referring to "Packard had mentioned that many performance improvements will be taking place from low-level chip optimizations to improving the quality of the Mesa project." from http://linux.chinaunix.net/news/2006-10-23/3000.shtml

                One question: since the hardware docs are out there, will the MM in the OSS drivers ever become as good as the one in fglrx?



                • #18
                  crumja: it gets confusing because most of the recent 5xx/690 announcements were related to 3D (Mesa, DRM) and those components are used by both the radeon and radeonhd drivers. The problem was that until recently radeonhd didn't have all the code needed to fully use those 3D components, so most of the development work was done with radeon.

                  The big deal about yesterday's merge to radeonhd master is that it puts both drivers on roughly equal footing when it comes to acceleration support. I think there are still a few differences (accelerated rotation comes to mind) but for the first time we are now able to do ongoing acceleration work in radeonhd almost as easily as in radeon, so this is kind of a big deal.

                  Memory management needs HW docs but the biggest challenge is that doing mm right is just a huge design task, since you are optimizing across a bunch of different clients, usage scenarios and CPU/GPU designs.

                  re: Mesa changes, now I understand. That is a 2006 article so many of those changes are already happening. Some of those are Intel initiatives, but a lot of different groups are contributing to improving the X/DRI framework. One important thing to understand is that when Keith talks about framework improvements those are not necessarily Intel projects. Keith was a leader in X development long before he joined Intel and one of his ongoing contributions is periodically providing a "big picture" view of where all the projects (some Intel, some not) are heading so that they can all stay in sync as much as possible. I doubt that even Keith would want you to think of all those as Intel projects.

                  re: will MM in open drivers ever become as good as in fglrx ? Probably, at least I expect it to get as good as fglrx is today. It's mostly the same issue as with the rest of the 3d stack -- the open source community has some really talented developers so the open stack will absolutely go as far as is possible with a relatively "clean" design. Once you get into performance optimizations for specific market segments though, that's where you start to see differentiation between open and closed source code since the open source devs have little interest in doing that kind of optimization work while it is a big part of what lets one HW vendor differentiate itself from the competition. The problem is that the optimizations are expensive to develop but easy to duplicate in another driver -- which is why there is a business argument for keeping them proprietary.



                  • #19
                    Originally posted by duby229
                    Again wrong. Intel's open-source graphics driver uses Mesa for its OpenGL implementation. And while, yes, Mesa is not the most modern OpenGL implementation, it is new enough to run every single native Linux game right now.
                    Uh, actually Nexuiz is one native game which is troublesome on Mesa. Using open drivers I *can* play Nexuiz on e.g. a Mobility Radeon 9000 (which doesn't support pixel shaders and such) with very low details, and I assume by now Intel hardware may be on the same level of "OpenGL as it used to be eight years ago", but most likely it'll crash once you try to use some of the more advanced stuff done with GLSL. I have yet to see working, *reliable* GLSL (without the occasional segfault) on any open-driver setup. This isn't just a matter of more eye candy but also a matter of speed.

                    fglrx *does* offer something compared to the open drivers: a rather fast, mostly working and feature-complete 3D stack. Too bad the 2D stack is having its share of problems (video shouldn't be rocket science).

                    What I'd *love* to see, of course, are open drivers with fast, reliable and up-to-date 3D support -- the 2D part apparently works rather well already (yay!) on most hardware. Until the day this happens, I'm mostly glad fglrx is still around to give me something usable for 3D.

                    Rest assured, that binary blob won't stay long on my system once the open drivers catch up, though.
                    Last edited by SavageX; 27 July 2008, 03:52 PM.



                    • #20
                      Originally posted by bridgman
                      RealNC, you are completely missing the point with your comments about open source being the enemy. Our competitors are other hardware vendors, not other software developers -- it's just that writing code and making it public makes not only the code but the concepts and methods available to other hardware vendors. Intel is not in the high performance 3d space right now so they don't have as much to protect in that regard as ATI and NVidia. A year or two from now this will all be academic anyways since the open source memory management will have matured (probably with some help from us) enough that this will not be an issue.
                      Hopefully. The original statement claimed that putting the existing but closed MM in the kernel is bad for ATI/AMD. The logical conclusion here is that you actually don't want a good open-source MM to happen.

                      If you want a good MM in-kernel, why not put it in there right now? If you don't want a good MM in-kernel because it's bad for business, why would you allow one to happen (in one year, with help from you)?

                      GPL does nothing to protect the underlying IP, just the actual line-by-line code. The only thing that can protect the underlying IP is software patents, and even those are a relatively unsatisfactory solution.
                      I simply don't believe that ATI and NVidia haven't already completely and thoroughly, 101%, reverse-engineered each other's hardware by now.

                      (Of course no one's going to admit that anyway.)

                      You guys ask us to understand your issues and needs and we try really hard to do that; why can't you do the same for us ?
                      Because we see things like this one: http://youtube.com/watch?v=_ImW0-MgR8I&fmt=18 and want to have this experience too! You're not giving it to us. Intel and NVidia do.

                      Edit:
                      I *do* have all those effects active as we speak... with xf86-video-ati, that is. Yes, on an R580. fglrx is completely unsuitable for getting Compiz Fusion to work well.
                      Last edited by RealNC; 27 July 2008, 04:02 PM.

