Have the drm.git kernel modules been abandoned?


  • #31
    Originally posted by pvtcupcakes View Post
    If Windows updated its kernel as often as Linux, they'd have the same problem too. Only it'd be worse, because you'd be relying on vendors to update their drivers.
    Well, cool thing that they don't change their kernel that often, then... Plus, I've had success installing XP drivers on Vista, so there must be something working there, and as far as I can tell they've got a pretty stable and secure kernel by now.

    I just read the article "Ant P." told me to, and I honestly didn't find anything new in there... So we should just ship trillions of drivers with the kernel (or its source) in order to solve compiler problems now?

    I don't know what to do about the "new interface" -> "old interface" compatibility problems, but I guess that's why Linux is so unsupported. If Linux is all about freedom, why aren't companies given the freedom to release binary blobs? (And I mean ones that would work for more than a single kernel version's lifespan...) The end user doesn't really care if things are open source or not...
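
    For context on why a driver built against one kernel generally won't load on another: at build time every module gets stamped with the "vermagic" of the kernel headers it was compiled against, and its license declaration decides which exported kernel symbols it may use. Below is a minimal out-of-tree module sketch (the name hello_drv is made up for illustration, not anything from this thread):

    /*
     * Minimal out-of-tree module sketch; "hello_drv" is a hypothetical name.
     * kbuild stamps the resulting .ko with the vermagic of the kernel headers
     * it was built against, so the binary is tied to that kernel build and
     * must be rebuilt for each new kernel. Without a GPL-compatible
     * MODULE_LICENSE, the module also loses access to EXPORT_SYMBOL_GPL-only
     * kernel symbols.
     */
    #include <linux/init.h>
    #include <linux/module.h>

    MODULE_LICENSE("GPL");
    MODULE_DESCRIPTION("Hypothetical example module");

    static int __init hello_drv_init(void)
    {
        pr_info("hello_drv: loaded\n");
        return 0;
    }

    static void __exit hello_drv_exit(void)
    {
        pr_info("hello_drv: unloaded\n");
    }

    module_init(hello_drv_init);
    module_exit(hello_drv_exit);

    That vermagic check is why a binary blob shipped for one kernel version won't load on the next unless it is rebuilt or wrapped in a per-kernel glue layer, which is roughly what the proprietary GPU drivers do.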


    • #32
      Originally posted by RealNC View Post
      do you see NVidia or AMD being able or willing to get their binary blobs opened and included in the kernel?
      Uh, YES. In fact I *DO* see AMD able, willing, and ACTIVELY DOING so... Did you read the front page news today? http://www.phoronix.com/scan.php?pag...item&px=NzU0Nw

      Somebody else is working on the nvidia stuff since nvidia still has a stick shoved up their a$$, but if AMD can do so, then SO CAN nvidia (no more excuses here); all they need is to be willing.


      • #33
        Originally posted by mdias View Post
        The end user doesn't really care if things are open source or not...
        ...no, but the GPL does...


        • #34
          Originally posted by mdias View Post
          I don't know what to do about the "new interface" -> "old interface" compatibility problems, but I guess that's why Linux is so unsupported. If Linux is all about freedom, why aren't companies given the freedom to release binary blobs? (And I mean ones that would work for more than a single kernel version's lifespan...) The end user doesn't really care if things are open source or not...
          But that's the _point_ of Linux; that the source is available. If the end user doesn't care about the source, then they are free to pick another OS.

          And I certainly wouldn't say "Linux is so unsupported." I'd argue that it supports about as much hardware as a given version of Windows in most cases.

          It's not the lack of a stable internal kernel interface that's preventing more widespread support in certain markets, it's lack of marketshare. For example, Linux has great hardware support in the server and embedded spaces (where it has significant market share), but not as much in things like home-user desktop. Companies will support whatever interface they need to support if the potential revenue warrants it.

          It's generally more of a mindset problem. Most companies are used to dealing with Windows. It takes time to adapt to another way of doing things.


          • #35
            Originally posted by agd5f View Post
            And I certainly wouldn't say "Linux is so unsupported." I'd argue that it supports about as much hardware as a given version of Windows in most cases.
            For mainstream consumer hardware, it's often the other way around.

            Using Linux, every piece of hardware I own is supported out of the box. I just need to manually install the binary GPU drivers for 3D support - but hopefully, that'll change.

            Using Windows, I also need to manually install the binary GPU drivers.
            - On XP, I also need to manually install drivers for my NIC, the SATA controller (much joy unless you still use an IDE HD) and the sound card.
            - On Vista, NIC and SATA work fine, but there is no way to use my sound card. Neither Microsoft nor Creative care to provide drivers (of course the old ones don't work; Windows doesn't have a stable kernel interface either).

            There are similar results for my laptop, and Windows on the GPU-less server doesn't even work.


            For me, Linux has better hardware support on 3 out of 3 computers, even though every part I bought was "designed for Windows".


            • #36
              Originally posted by rohcQaH View Post
              For me, Linux has better hardware support on 3 out of 3 computers, even though every part I bought was "designed for Windows".
              Well, that claim mostly just means "the vendor can only be held responsible if this doesn't work with Windows", nothing more.


              • #37
                Originally posted by lbcoder View Post
                ...no, but the GPL does...
                I never said I'd want to GPL my code if I were to release my device's driver...

                Originally posted by agd5f View Post
                But that's the _point_ of Linux; that the source is available. If the end user doesn't care about the source, then they are free to pick another OS.
                So because a user would like to develop an app/game without giving up the source, he shouldn't target Linux?

                Originally posted by agd5f
                And I certainly wouldn't say "Linux is so unsupported." I'd argue that it supports about as much hardware as a given version of Windows in most cases.
                You're right, I made a poor choice of words; I meant developers don't target Linux so much.

                Originally posted by agd5f
                It's not the lack of a stable internal kernel interface that's preventing more widespread support in certain markets, it's lack of marketshare. For example, Linux has great hardware support in the server and embedded spaces (where it has significant market share), but not as much in things like home-user desktop. Companies will support whatever interface they need to support if the potential revenue warrants it.

                It's generally more of a mindset problem. Most companies are used to dealing with Windows. It takes time to adapt to another way of doing things.
                Linux web servers certainly don't need bleeding-edge technology; they don't need constantly updated drivers and kernels to support those tasks.

                Home desktop users also tend to be less tech-savvy and won't go through the trouble of compiling the latest DRM branch or installing a new kernel.

                It's true that companies will do whatever it takes to make money; however, they'll also choose the easiest way to do so, and unfortunately that's currently on Windows. I'm hoping Gallium3D will improve gaming on Linux dramatically, but I wouldn't bet on that.

                [edit] On another note: who would be held responsible if my open source gfx driver wiped out the BIOS of the card? I'd love to use a stable, fast driver even if it was closed source. At least I know I'd get support from that company if I ever needed it (they have their name to defend, unlike OSS devs). Sure, fglrx sucks for the normal user, and as such I currently use the open source drivers, which are going strong.

                Please don't get me wrong, I'm not saying the OSS devs don't support the users; I'm saying that they don't HAVE to.
                Last edited by mdias; 09-21-2009, 11:33 AM.


                • #38
                  Originally posted by mdias View Post
                  Home desktop users also tend to be less tech-savvy and won't go through the trouble of compiling the latest DRM branch or installing a new kernel.
                  They are not meant to; distributions will do that for them.

                  Bleeding edge drivers that haven't been fully written yet are a separate category, and not something for the less tech-savvy home-user desktops.


                  • #39
                    Originally posted by pingufunkybeat View Post
                    They are not meant to; distributions will do that for them.

                    Bleeding edge drivers that haven't been fully written yet are a separate category, and not something for the less tech-savvy home-user desktops.
                    It still looks like they'll have to install a new kernel every time an updated driver is out for their hardware, and as has been said before, that kernel could have regressions in other areas.


                    • #40
                      Originally posted by mdias View Post
                      On another note: who would be held responsible if my open source gfx driver wiped out the BIOS of the card?
                      You're right: you. Then again, AFAIK the card's BIOS isn't exactly trivial to wipe by accident...


                      • #41
                        Originally posted by mdias View Post
                        So because a user would like to develop an app/game without giving up the source, he shouldn't target Linux?
                        I'm talking specifically about drivers. The kernel is open source and driver developers should target upstream inclusion if they want their driver to work well on Linux.

                        Originally posted by mdias View Post
                        Home desktop users also tend to be less tech-savvy and won't go through the trouble of compiling the latest DRM branch or installing a new kernel.
                        They are not supposed to. They use whatever the OEM puts on the system. In most cases this is Windows. For those users wanting to try Linux, the distro packages everything for you. Until Linux gains more desktop market share, there will tend to be a delay between desktop hardware availability and driver availability. If you want to use the bleeding edge now, you need to use development trees.


                        • #42
                          Originally posted by lbcoder View Post
                          Uh, YES. In fact I *DO* see AMD able, willing, and ACTIVELY DOING so... Did you read the front page news today? http://www.phoronix.com/scan.php?pag...item&px=NzU0Nw
                          With all due respect to what AMD is doing, they're building an open source driver that might be a little over half as fast as the blob when it's fully operational. That's kinda like telling you how to make a Formula One car go 100mph instead of 200mph, which is just about any way you do it, as long as you use the engine and don't get out and push.

                          Certainly, it's very nice that they do give away that much and that they reveal the hardware interface at all, but I don't think that AMD is giving away any tricks that'll make them come out 2 FPS behind instead of 2 FPS ahead in the next GPU shootout. Those kinds of optimizations are in the blob and staying in the blob for the foreseeable future.

                          I think Linux would do better pushing for universal standards, like how all USB sticks conform to the USB mass storage device spec - no need for separate drivers. Same with webcams and the USB video spec. Create a few more like those and there is little reason to have a million drivers in the first place.
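
                          As a rough illustration of that point (not from the thread): a class-compliant USB device advertises a standard class code in its descriptors, and that single number is what lets one generic class driver bind to every such device. Here is a small libusb-1.0 sketch that just prints those class codes (the file name usbclasses.c is hypothetical):

                          /*
                           * Illustration only: list the interface class of each attached USB
                           * device using libusb-1.0. Class-compliant devices report a standard
                           * class code (0x08 = mass storage, 0x0e = video), which is why one
                           * generic class driver can cover all of them without per-device drivers.
                           */
                          #include <stdio.h>
                          #include <stdint.h>
                          #include <libusb.h>

                          int main(void)
                          {
                              libusb_context *ctx = NULL;
                              libusb_device **devs;

                              if (libusb_init(&ctx) != 0)
                                  return 1;

                              ssize_t count = libusb_get_device_list(ctx, &devs);
                              for (ssize_t i = 0; i < count; i++) {
                                  struct libusb_config_descriptor *cfg;

                                  if (libusb_get_config_descriptor(devs[i], 0, &cfg) != 0)
                                      continue;

                                  for (int j = 0; j < cfg->bNumInterfaces; j++) {
                                      if (cfg->interface[j].num_altsetting < 1)
                                          continue;
                                      uint8_t cls = cfg->interface[j].altsetting[0].bInterfaceClass;
                                      printf("device %zd, interface %d: class 0x%02x%s\n",
                                             i, j, (unsigned)cls,
                                             cls == 0x08 ? " (mass storage)" :
                                             cls == 0x0e ? " (video)" : "");
                                  }
                                  libusb_free_config_descriptor(cfg);
                              }

                              libusb_free_device_list(devs, 1);
                              libusb_exit(ctx);
                              return 0;
                          }

                          Built with something like: gcc -o usbclasses usbclasses.c $(pkg-config --cflags --libs libusb-1.0). Any device reporting one of those class codes gets picked up by the existing generic class driver; no vendor driver is needed.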


                          • #43
                            Originally posted by Kjella View Post
                            With all due respect to what AMD is doing, they're building an open source driver that might be a little over half as fast as the blob when it's fully operational. That's kinda like telling you how to make a Formula One car go 100mph instead of 200mph, which is just about any way you do it, as long as you use the engine and don't get out and push.
                            It's not so much an information thing as an effort thing. Squeezing out every last drop of performance is a major engineering effort. There's enough information out there to write a driver that comes close to the performance of the closed driver, but there are not enough developers to actually realize that. To use your analogy, the instructions and engine upgrades are there to go 200 mph, but there aren't enough mechanics to install it all. The cost of lots of extra mechanics is not justified if only 2 people are actually going to watch the race.


                            • #44
                              Originally posted by Kjella View Post
                              With all due respect to what AMD is doing, they're building an open source driver that might be a little over half as fast as the blob when it's fully operational. That's kinda like telling you how to make a Formula One car go 100mph instead of 200mph, which is just about any way you do it, as long as you use the engine and don't get out and push.

                              Certainly, it's very nice that they do give away that much and that they reveal the hardware interface at all, but I don't think that AMD is giving away any tricks that'll make them come out 2 FPS behind instead of 2 FPS ahead in the next GPU shootout. Those kinds of optimizations are in the blob and staying in the blob for the foreseeable future.
                              Just to be clear, the plan has always been that we provide documentation, sample/initial code and developer support, then the open source community writes and maintains the drivers. In the case of 6xx/7xx 3D, the community developers were mostly focused on other areas like KMS and GEM/TTM, so we offered to do most of the initial coding, and that is mostly completed. What happens next in terms of performance and features is up to you.

                              We are going to keep helping in what we see as key areas (which may include performance), and we are going to keep adding support for new hardware, but you should *not* be expecting us to write the drivers ourselves or complaining if we don't dump enough proprietary code into the open source stack. We have a proprietary driver for that.

                              I don't think anyone is trying to match the performance and features of the Catalyst driver BTW; the goal is to provide a driver set which meets the needs of most desktop users, which can stay current with (and be used for the development of) the evolving X/DRI framework, and which can be directly supported by Linux distribution developers.
                              Last edited by bridgman; 09-21-2009, 01:34 PM.


                              • #45
                                Originally posted by bridgman View Post
                                Just to be clear, the plan has always been that we provide documentation, sample/initial code and developer support, then the open source community writes and maintains the drivers. Our job was to make sure there is enough information out there for the development community to program the hardware, *not* to write the drivers ourselves. I would rather have our team working on Evergreen or trying to open up UVD.
                                I, at least, appreciate the effort that's led to, IMO, amazingly fast progress. You guys definitely made it onto my personal Wall of Beer (members are eligible for a round if they pop by).
