
Are AMD HD video card users all using older versions of GNU/Linux?


  • #11
    Originally posted by lordmozilla View Post
    Uhm watch it at 3d support. It only works with AOE and red alert cause they arent actually 3d games.

    Compositing wont work though!
    Oh, I thought they were. That won't work for me then, but thanks for letting me know before I tried it only to be disappointed.
    Last edited by Yfrwlf; 18 May 2009, 11:16 AM.



    • #12
      Originally posted by Yfrwlf View Post
      Supporting specific distros shouldn't be needed, only the core programs like Xorg and the kernel, IMO. You'll never be able to release Linux drivers on CDs, for example, if this doesn't change, and that needs to happen because not all kernels out there are the same version. Stability of the APIs and better planning would mean you could release a simple driver that worked across a wide range of kernels. If packaging standards got their act together, shipping drivers on a CD for users who need them, like those who *aren't* running a bleeding edge kernel, would become a reality and would really help out Linux users and companies alike.
      I think we already have this covered in principle; the installer package includes open source scripts, different for each distro, maintained by people from the individual distros. During installation, the kernel dependent portions of the driver are compiled using the kernel header files on the user's system.
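
      Roughly speaking, that per-distro glue boils down to a kbuild compile against the headers of whatever kernel the user is actually running. A minimal sketch of the idea (the module and file names here are made up for illustration, not the actual fglrx shim); with a one-line Makefile containing "obj-m := headers_demo.o" it builds via:

          make -C /lib/modules/$(uname -r)/build M=$(pwd) modules

          /* headers_demo.c - trivial module compiled against the local kernel headers */
          #include <linux/init.h>
          #include <linux/kernel.h>
          #include <linux/module.h>

          MODULE_LICENSE("GPL");
          MODULE_DESCRIPTION("Trivial module built against the running kernel's headers");

          static int __init headers_demo_init(void)
          {
              printk(KERN_INFO "headers_demo: loaded against the running kernel\n");
              return 0;
          }

          static void __exit headers_demo_exit(void)
          {
              printk(KERN_INFO "headers_demo: unloading\n");
          }

          module_init(headers_demo_init);
          module_exit(headers_demo_exit);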

      There are a number of distros which don't use standard kernel versions, and it's not unusual for a distro to ship kernel code which won't be upstream for another 3-6 months. Fedora is going through a phase like that right now.

      I agree completely that increased standardization between distros would help in a number of areas, but I don't think the lack of standardization today is due to a lack of suitable standards. The developers and distros *want* change and *want* the ability to be different since that lets them deliver what they feel is the best experience for their specific user base. The kernel API changes frequently, not because the kernel devs don't know how to manage a stable API but because they made a deliberate decision to choose faster evolution over API stability. I believe there is agreement that application-level API stability is important but I don't think the same holds for driver-level stability.

      In cases where a stable API is required there are enterprise/LTS distros available which *do* keep a standard kernel and xorg version for several years, backporting changes as needed to support their customers. These enterprise distros are the primary ones used in the professional workstation market, and that's where we have historically focused fglrx support as well.

      There are some architectural changes in the works which will help the situation for Linux and open source drivers, specifically the movement of modesetting and memory management out of xorg and into the kernel driver (drm). This will allow the core graphics functions to evolve in lockstep with the kernel, which is good for bleeding edge distros but introduces new challenges for enterprise/LTS distros. This will also pose a challenge for other OSes which use X but do not use the Linux kernel, such as *BSD and Solaris.
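
      To make "the kernel driver (drm)" a little less abstract: userspace already talks to that kernel component directly through /dev/dri. A minimal sketch, assuming libdrm and its headers are installed (the device path and file name are just illustrative), built with "gcc drm_probe.c -o drm_probe $(pkg-config --cflags --libs libdrm)":

          /* drm_probe.c - ask the kernel which DRM driver sits behind card0 */
          #include <fcntl.h>
          #include <stdio.h>
          #include <unistd.h>
          #include <xf86drm.h>

          int main(void)
          {
              int fd = open("/dev/dri/card0", O_RDWR);   /* usual node; may differ */
              if (fd < 0) {
                  perror("open /dev/dri/card0");
                  return 1;
              }

              drmVersionPtr v = drmGetVersion(fd);       /* identify the kernel-side driver */
              if (v) {
                  printf("kernel DRM driver: %s %d.%d.%d\n", v->name,
                         v->version_major, v->version_minor, v->version_patchlevel);
                  drmFreeVersion(v);
              }

              close(fd);
              return 0;
          }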
      Last edited by bridgman; 18 May 2009, 11:38 AM.
      Test signature



      • #13
        Originally posted by bridgman View Post
        I think we already have this covered in principle; the installer package includes open source scripts, different for each distro, maintained by people from the individual distros. During installation, the kernel dependent portions of the driver are compiled using the kernel header files on the user's system.

        There are a number of distros which don't use standard kernel versions, and it's not unusual for a distro to ship kernel code which won't be upstream for another 3-6 months. Fedora is going through a phase like that right now.

        I agree completely that increased standardization between distros would help in a number of areas, but I don't think the lack of standardization today is due to a lack of suitable standards. The developers and distros *want* change and *want* the ability to be different since that lets them deliver what they feel is the best experience for their specific user base. The kernel API changes frequently, not because the kernel devs don't know how to manage a stable API but because they made a deliberate decision to choose faster evolution over API stability. I believe there is agreement that application-level API stability is important but I don't think the same holds for driver-level stability.

        In cases where a stable API is required there are enterprise/LTS distros available which *do* keep a standard kernel and xorg version for several years, backporting changes as needed to support their customers. These enterprise distros are the primary ones used in the professional workstation market, and that's where we have historically focused fglrx support as well.

        There are some architectural changes in the works which will help the situation for Linux and open source drivers, specifically the movement of modesetting and memory management out of xorg and into the kernel driver (drm). This will allow the core graphics functions to evolve in lockstep with the kernel, which is good for bleeding edge distros but introduces new challenges for enterprise/LTS distros. This will also pose a challenge for other OSes which use X but do not use the Linux kernel, such as *BSD and Solaris.
        I believe that in every case, some sort of standard could be implemented which gave at least basic functionality and basic communication, so that no proprietary code is ever required. If developers really have consciously made such decisions to break things, I hope they realise how much pain they cause down the line, and how they've basically turned Linux into "beta software". When will it have enough features that they finally release a stable version, I wonder, if ever?

        All this really takes the wind out of my sails for being excited about programming for Linux. I'm not releasing a proprietary package for it. Community software that can hardly manage to agree on any standards beyond a binary standard is just really sad to see. It can't even settle on a standard so that icons for programs get created properly in all desktop environments. Sad...just sad... I wish more Linux users were conscious of the need for standards, so that companies couldn't get away with making things proprietary just so they can get attention.
        Last edited by Yfrwlf; 18 May 2009, 03:15 PM.



        • #14
          Just curious, why do you feel the need to create a proprietary package? We have to because we're releasing the code in mostly binary form, but if you're releasing source then the distros recognize that they need to help make your product fit into their packaging. My experience has been that they generally do this very well.
          Test signature



          • #15
            Firstly, we should stop talking about Linux as a product or an operating system. Each distribution is an operating system, which uses a number of open source technologies.

            Moving to another distro means moving to another OS. E.g. Fedora.

            Similarly, if you want stable APIs and less structural change, choose an enterprise OS with long term support. Or choose a distribution known for conservatism, e.g. Slackware or Debian stable. Choosing Ubuntu or Fedora and expecting such is silly.
            Expecting these different products to stay in step with each other is also silly. I want the bleeding edge of Fedora, but someone else wants the conservatism and vanilla packaging of Slackware. Should Slackware or Fedora change their philosophy just to meet some mandated homogeneity?

            Secondly, it's odd that you use OpenGL as something the kernel guys should aspire to. Kernel hacking seems as healthy as ever, and most companies involved in Linux are involved in the kernel space. OpenGL, on the other hand, is about as stagnant as Duke Nukem Forever.

            Standards aren't some magic cure-all. In the case of Linux you have many different companies offering many different (but similar) products. Then you have disparate products that other companies participate in. In such an environment, it's difficult to decide who creates or maintains the architectural standards; the LSB is a pretty decent example. The other problem with standards is that they take a long time to solidify and a short time to become outdated. Imagine if we had ratified TTM for the next five years, and then got stuck with a memory manager that wasn't adequate. The great strengths of Linux-based systems are flexibility and a willingness to break things "when necessary". We can argue all day about necessity, but I feel this is what allows Linux to be the most "current" of open source kernels. Most heavily governed projects suffer a high rate of developer attrition, e.g. old Solaris or MySQL. FreeBSD is doing OK, but it is both highly centralised and quite conservative.

            Of course, it would be nice to have more coordination, but I think the current method of having a flexible environment and coordinating when necessary works much better than an overly homogenised system where we spend more time worrying about standards than technology.



            • #16
              One thing that bothers me about this lack of bleeding edge support is that NVIDIA manages it, or at least their drivers work, for example, with the Fedora 11 that I'm running. They worked with the very unstable (F11) Rawhide a couple of months ago, and they'll probably work very soon with (F12) Rawhide. The NVIDIA drivers are always in the RPM Fusion repos, but AMD's keep dropping out very unpredictably, because they seem to just keep breaking if the distro isn't RHEL, SLE* or *buntu.

              It seems their way of doing things "on top of the stack", rather than AMD's "next to it", is frankly a lot better. I mean, it's a proprietary blob anyway! If you use it, I doubt you really care how invasive it is as long as it works.



              • #17
                Oh, by the way, having been "forced" to run the xorg-x11-drv-ati driver since the Fedora 11 upgrade, video playback with XV looks stunning! Why are the colors with the fglrx driver so "washed out" compared to this free driver? Also, fglrx scaling seems broken (both work fine with GL, though that's a bit heavier on the CPU).



                • #18
                  Originally posted by bridgman View Post
                  Just curious, why do you feel the need to create a proprietary package? We have to because we're releasing the code in mostly binary form, but if you're releasing source then the distros recognize that they need to help make your product fit into their packaging. My experience has been that they generally do this very well.
                  I do believe I said that I wouldn't create a proprietary package, ever. Linux is already somewhat niche, and programming for a niche of a niche is just suicide and silly. Of course I want every Linux user to use my software; that's one of the main points of what I'm trying to communicate. You should be able to release one package for Linux and be done. Right now, that happens to be a plain archived binary, which isn't ideal. The only other solution is Zero Install, which I may choose to support as well, or in place of the straight-up binary. ZI seems to be the furthest along in improving the Great Linux Packaging Mess, but having a universal format adopted by the main package managers themselves would be best, nullifying the need for a second manager.

                  Source packages help interested developers contribute to your project. Binaries help users and developers alike who actually want to quickly and easily run your project. I really don't understand why this is a hard concept to grasp...it's completely obvious. Right now, Windows and Mac users enjoy easily installable binary packages. That's what Linux needs. It's a feature. Do users want to be able to select software from a MUCH bigger repository for use in their cloud software browsers (Add/Remove, Synaptic Package Manager, etc.), a.k.a. the entire Internet, instead of being confined to a small private repository? Yes. Do they also want to be able to easily install software from websites directly? Yes. Do developers want to have to release ONE single package, and have that get used, so that all bugs reported to them are valid and not from some tinkered-with customised version? Yes. So, I want such a feature, and am thus pushing for it.

                  Originally posted by yesterday View Post
                  Firstly, we should stop talking about Linux as a product or an operating system. Each distribution is an operating system, which uses a number of open source technologies.

                  Moving to another distro means moving to another OS. E.g. Fedora.
                  That's the great lie they want to push on you, yes. Linux programs are Linux programs. The main difference between "distros" is that they choose to use different package managers which aren't compatible with a universal Linux packaging standard. But no, it's not a "different OS". Binaries can be run on any Linux system because they're all compatible. The main thing that isn't compatible between Debian and Red Hat is their stupid package format difference, just because they want to be unique and the companies supporting them want to dig their own proprietary trench in order to get converts to come over to their side (the point of being "proprietary" being lock-in). All that is needed is for the community to recognise this deeply damaging issue as a major threat and push for standards. The companies won't do it for Linux users and developers, because they don't want it.

                  Originally posted by yesterday View Post
                  Similarly, if you want stable APIs and less structural change, choose an enterprise OS with long term support. Or choose a distribution known for conservatism, e.g. Slackware or Debian stable. Choosing Ubuntu or Fedora and expecting such is silly.
                  Expecting these different products to stay in step with each other is also silly. I want the bleeding edge of Fedora, but someone else wants the conservatism and vanilla packaging of Slackware. Should Slackware or Fedora change their philosophy just to meet some mandated homogeneity?
                  They're called standards. A web browser uses HTML standards. An FTP client uses FTP standards. FreeDesktop.org, the Linux Foundation, and other groups help create standards so that when you're in KDE and you want to run a GNOME app, for example, you can still do so. It empowers Linux software to communicate by being on the same page at certain points. It does NOT hamper progress, because if someone feels that the standard needs to be broadened, it can be, so that new features can be added. What version of HTML are we up to now, five? OpenGL 3? Those are standards which are quite solid, and they help everyone.

                  Just imagine a world without any standards, where everyone did just go and do their own thing. Let's say there were 500 different website protocols, for instance, instead of having everything built on HTML. Imagine trying to function in such a world: it would be horrible, bloated, and confusing, and nothing would get done. Every web-based program would have to try to be compatible with all the craziness.

                  I'm in no way saying that there shouldn't be different approaches, and competition, for those things are good. However, with most things, there are definite areas which are similar, and those can be shared. Those are points at which standards should be adopted.

                  For example, let's say you have two different file manager programs. Do they use standards? Already, yes, they use a ton of standards, because they interface with files on a hard drive managed by file systems etc. etc., and there are standards every step of the way; that's how computers are built, otherwise you'd be screwed with pineapples. As you move up the stack from hardware to software, you encounter several different standards along the way. So, for the file manager program itself, if you recognise that there are several common needs there, and you can find a way to utilise a common standard or even a common library in order to work together with other similar projects, of course it's in your interest to do so.
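
                  For what it's worth, the freedesktop.org Base Directory spec is exactly that kind of shared standard. A minimal sketch of how any app, file managers included, resolves the per-user data directory the same way (the fallback path comes from the spec; the program itself is just a made-up example):

                      /* xdg_data_home.c - resolve the per-user data dir per the XDG Base Directory spec */
                      #include <stdio.h>
                      #include <stdlib.h>

                      int main(void)
                      {
                          const char *xdg = getenv("XDG_DATA_HOME");
                          char buf[4096];

                          if (xdg && *xdg) {
                              /* explicit override wins */
                              snprintf(buf, sizeof(buf), "%s", xdg);
                          } else {
                              /* spec-mandated fallback: ~/.local/share */
                              const char *home = getenv("HOME");
                              snprintf(buf, sizeof(buf), "%s/.local/share", home ? home : ".");
                          }
                          printf("user data directory: %s\n", buf);
                          return 0;
                      }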

                  If everyone had to re-invent the wheel for everything, NOTHING would EVER get done.

                  Originally posted by yesterday View Post
                  Secondly, it's odd that you use OpenGL as something the kernel guys should aspire to. Kernel hacking seems as healthy as ever, and most companies involved in Linux are involved in the kernel space. OpenGL, on the other hand, is about as stagnant as Duke Nukem Forever.
                  Maybe that's why D3D has been becoming more popular recently, so I sure hope that OGL can compete properly and catch back up if it really is lacking the way you accuse it of being.

                  But guess what? Let's say that OGL, a standard, changed its points of communication, its API, the actual part that is a standard, and implemented OGL 3, 4, 5, 6, 7, 8, 9, 10, etc., and "moved" like you say, and none of those versions happened to be compatible with any other version at all.

                  a) That would be absurd, because there's no reason for it to be completely incompatible like that when all the versions would use nearly the exact same things. The stable parts should have a clear API, while the new OGL calls get added on, and what do you know, that's the way it happens to work (there's a small sketch of this a little further down).

                  b) No one would use the "standard".

                  The reason why: because that's not a standard. You clearly need to understand what a standard is, and how to implement one so that it doesn't impede progress and development. That's what standards are for: not to impede progress, but to allow just the communication part to occur.
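
                  Here's the small sketch mentioned in point a), assuming a system with the GL and GLUT development packages installed (built with "gcc gl_probe.c -o gl_probe -lglut -lGL"): the entry points that have existed since OpenGL 1.0 keep working, and newer functionality is discovered through the version and extension strings instead of by replacing the old API.

                      /* gl_probe.c - report which OpenGL version/extensions the driver exposes */
                      #include <GL/glut.h>
                      #include <stdio.h>

                      int main(int argc, char **argv)
                      {
                          /* glGetString() needs a current context; GLUT is a cheap way to get one */
                          glutInit(&argc, argv);
                          glutCreateWindow("gl_probe");

                          /* valid since OpenGL 1.0; newer versions add calls rather than break these */
                          printf("GL_VERSION:    %s\n", (const char *)glGetString(GL_VERSION));
                          printf("GL_RENDERER:   %s\n", (const char *)glGetString(GL_RENDERER));
                          printf("GL_EXTENSIONS: %s\n", (const char *)glGetString(GL_EXTENSIONS));
                          return 0;
                      }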

                  Let me give you a quick example of that. HTML again. How many programs use it? Hmm, let's see: Google Maps, Google Docs, Yahoo Mail, Slashdot, all sorts of different programs and content, all using the same standard! So wow, you can have diversity and competition while using standards that are quite static. Maybe OGL needs to get its tail into gear, and maybe it was a poor example, but hopefully by now you are getting some idea of the point of standards, because without them, none of the above-mentioned web apps and sites would exist!

                  And the rest of your comment was more of the same, so I'll leave it there. Ultimately this is the on-topic point: there is nothing stopping the creation and adoption of a universal, intelligent Linux binary packaging system; the only reason one doesn't exist is the distro companies' lack of interest in playing nicely with competing companies, because they use their repos as a lure to get users to come to them. It's completely pointless and fragments Linux where there is no reason to do so, in an area which is critical for simplifying binary deployment by closed and open source software projects alike.

                  For now we're stuck with crappy archive files for the most part, which aren't very user-friendly to install at all.

                  Originally posted by blindfrog View Post
                  One thing that bothers me about this lack of bleeding edge support is that NVIDIA manages it, or at least their drivers work, for example, with the Fedora 11 that I'm running. They worked with the very unstable (F11) Rawhide a couple of months ago, and they'll probably work very soon with (F12) Rawhide. The NVIDIA drivers are always in the RPM Fusion repos, but AMD's keep dropping out very unpredictably, because they seem to just keep breaking if the distro isn't RHEL, SLE* or *buntu.

                  It seems their way of doing things "on top of the stack", rather than AMD's "next to it", is frankly a lot better. I mean, it's a proprietary blob anyway! If you use it, I doubt you really care how invasive it is as long as it works.
                  It doesn't even work on the major distros; that's one of the points of this thread. I had to downgrade to kernel 2.6.27 and xorg 1.5 (the Ubuntu Intrepid software stack, I assume), since fglrx won't work in the newer versions yet with my 4850 X2.

                  Yes, Nvidia has always had quick support for the latest stuff. AMD is smaller, so they're behind in the closed source driver, but still ahead in the open one AFAIK. However, the open one doesn't have 3D, so it's a deal breaker after you plop down $1400 for a new system expecting to play some high-end Linux or Wine games.



                  • #19
                    I am not using GNU/Linux.
                    I am using Gentoo. And my card (HD3870) works fine.



                    • #20
                      (re: big post about stable kernel APIs somewhere up the thread)
                      That's a neat idea. You should post that on the LKML; I'd love to see the reaction.
