The Fallacy Behind Open-Source GPU Drivers, Documentation


  • Originally posted by marek View Post
    Get a machine you are comfortable with. Your main machine is okay. Even if it holds important data, don't worry: hard lockups and hardware failures won't damage it. Set up your development environment so that you can write some driver code, compile it, and test it with a 3D app at any time (no matter whether you are on a plane or a train or at school... you can make drivers anywhere).

    I only switch machines when I need to test a GPU which is not in my laptop (or when I want a machine that compiles code faster than my laptop).
    Is there a "Hello world" gallium state tracker available anywhere?

    Comment


    • There is a Python state tracker which, AFAIK, allows you to run Gallium3D API commands from a simple Python program.

      Comment


      • kirillkh and others> The basic, inescapable fact is that you won't be able to achieve anything useful in the 3D driver development world without deep knowledge of either OpenGL or Direct3D. As a driver developer, you fully implement the OpenGL API, so you should know it more or less fully!

        Comment


        • Originally posted by marek View Post
          Get a machine you are comfortable with. Your main machine is okay. Even if it holds important data, don't worry: hard lockups and hardware failures won't damage it. Set up your development environment so that you can write some driver code, compile it, and test it with a 3D app at any time (no matter whether you are on a plane or a train or at school... you can make drivers anywhere).

          I only switch machines when I need to test a GPU which is not in my laptop (or when I want a machine that compiles code faster than my laptop).
          The key here seems to be that new developers should focus solely on improving bits and pieces of Mesa at first. That is easy to get into and won't constantly crash your machine. Moving on to the kernel KMS bits is much more likely to cause crashes, is more difficult to debug, and so on, and should only be done once someone is much more comfortable. It's probably best to start by looking into a couple of other kernel device drivers that work with simpler hardware, just to get familiar with kernel development.

          Comment


          • Originally posted by marek View Post
            kirillkh and others> The basic, inescapable fact is that you won't be able to achieve anything useful in the 3D driver development world without deep knowledge of either OpenGL or Direct3D. As a driver developer, you fully implement the OpenGL API, so you should know it more or less fully!
            Marek, is "Addison-Wesley OpenGL SuperBible 4th Edition" enough?

            Comment


            • Originally posted by Drago View Post
              Marek, is "Addison-Wesley OpenGL SuperBible 4th Edition" enough?
              I haven't read that book (only heard about it), but I think that being able to do something like bump mapping with OpenGL and GLSL on a rotating model is enough.
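For readers wondering what that exercise involves, here is a minimal Python sketch of the per-pixel math behind bump mapping (the same computation a GLSL fragment shader would perform per fragment). All function names and values here are invented for illustration:

```python
# Hypothetical sketch of tangent-space bump (normal) mapping math.
# A GLSL fragment shader would do the same per pixel with vec3 ops.

def decode_normal(rgb):
    """Map a normal-map texel from [0, 1] color space to a [-1, 1] vector."""
    return tuple(2.0 * c - 1.0 for c in rgb)

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def bump_shade(texel_rgb, light_dir, base_color):
    """Diffuse (Lambert) lighting using the perturbed normal from a normal map."""
    n = normalize(decode_normal(texel_rgb))
    l = normalize(light_dir)
    intensity = max(dot(n, l), 0.0)   # clamp, like GLSL's max(dot(N, L), 0.0)
    return tuple(c * intensity for c in base_color)

# A "flat" texel (0.5, 0.5, 1.0) decodes to the unperturbed normal (0, 0, 1),
# so a light shining straight down +Z gives full intensity.
print(bump_shade((0.5, 0.5, 1.0), (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)))
```

Being able to write the GLSL equivalent of this, sample the normal map, and apply it to a rotating model is roughly the level of OpenGL fluency being described.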

              Comment


              • Originally posted by dargllun View Post
                and on @mythtv-users you don't even read the word AMD anymore these days...
                I need to correct myself: today we read this on @mythtv-users, in direct response to a question asking for recommended HTPC components:

                As far as i know there is no amd chipset currently worth recommending you are limited to nvidia if you want a reasonably quiet frontend (non too powerful cpu and therefore no fast fans cooling it).
                This really sucks...

                Comment


                • since amd chipsets need a lot less power than nvidia chipsets: what the f?ck are you talking about?

                  Comment


                  • Originally posted by energyman View Post
                    since amd chipsets need a lot less power than nvidia chipsets: what the f?ck are you talking about?
                    You're missing the context here. There was an argument as to whether the OSS drivers lag behind due to missing released documentation. I said there's enough stuff to do with the given documentation, such as a VDPAU state tracker for AMD GPUs, but still that's not happening. Basically, that backs up the original article. The end result for AMD users is that those GPUs have effectively become irrelevant for media usage.

                    Energy consumption wasn't the point here.

                    Comment


                    • Originally posted by energyman View Post
                      since amd chipsets need a lot less power than nvidia chipsets: what the f?ck are you talking about?
                      A lot less power eh?

                      http://www.silentpcreview.com/article902-page6.html

                      Comment


                      • It's a very good start in that direction. Be forewarned: while the book's a GREAT way of doing things, unless you've got OpenGL 3.0-and-beyond drivers available, you'll want to start off with the Fourth Edition, which is still in print.

                        Comment


                        • Originally posted by marek View Post
                          I haven't read that book (only heard about it), but I think that being able to do something like bump mapping with OpenGL and GLSL on a rotating model is enough.
                          It's a good book, and the authors do their best to make it an approachable subject. (It doesn't hurt that I know two of the people who wrote that book, either... :-D)

                          Comment


                          • Originally posted by dargllun View Post
                            You're missing the context here. There was an argument as to whether the OSS drivers lag behind due to missing released documentation. I said there's enough stuff to do with the given documentation, such as a VDPAU state tracker for AMD GPUs, but still that's not happening. Basically, that backs up the original article. The end result for AMD users is that those GPUs have effectively become irrelevant for media usage.

                            Energy consumption wasn't the point here.
                            No, it's not done yet. But the fact that it hasn't happened yet is due more to a lack of manpower than to the "fallacy" that Michael broached in the original article. At the time we initially asked for it all and said what we said, chips were vastly simpler than they are today; I should know, I was one of the devs clamoring for better information back then. Utah-GLX was the thing for OpenGL on Linux and DRI was just getting started. In those days, you had OpenGL 1.3 capabilities and MAYBE a bit of shader support as vendor-supported extensions. Things were vastly simpler then.

                            Nowadays?

                            You've got a system that's actually as complex as, or more complex than, the whole rest of the computer that's driving it. It's very much removed from the days when I was doing it as a full-time evening project for everyone, along with the likes of Gareth Hughes and John Carmack.

                            Comment


                            • Also, FWIW, if you learn GL or DX, the 3D hardware looks a LOT like the APIs, as you would expect. Lots of things map almost 1:1.

                              My advice for learning GPU programming: follow the mailing lists and ask questions. Look at patches and get a feel for the code. Developers are happy to help out if you ask questions about specific things, but they are probably going to be turned off by requests to lead you by the hand through the basics of how computer graphics work. Just about everyone I know who works on GPU drivers got into it as a hobby.

                              It's not a silo, either. Not only is it helpful to know 3D APIs like GL or DX, it also helps to know how computer hardware works in general. Study a simple NIC or sound driver. Most hardware is basically a DMA engine of some sort: it's either reading or writing data to or from RAM, and in between the hardware does something to it. Once you get that down, it doesn't much matter whether it's a wave file or a vertex buffer; it's just an input to some chip that's going to output something else.
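That DMA-engine analogy can be sketched in a few lines of Python. Everything here (the register names, the byte transform) is invented for illustration and not modeled on any real chip:

```python
# Toy model of the point above: most hardware is "a DMA engine of some sort".
# A driver mostly pokes registers; the "chip" then reads bytes from memory,
# does something to them, and writes bytes back.

class ToyDmaEngine:
    """A fake chip that reads a buffer from 'RAM', transforms it, writes it back."""

    def __init__(self, ram):
        self.ram = ram            # system memory the engine can master
        self.regs = {"SRC": 0, "DST": 0, "LEN": 0, "GO": 0}

    def write_reg(self, name, value):
        """Drivers talk to hardware mostly through register writes like this."""
        self.regs[name] = value
        if name == "GO" and value:
            self._kick()

    def _kick(self):
        src, dst, n = self.regs["SRC"], self.regs["DST"], self.regs["LEN"]
        # The "something" the chip does in between: here, just invert each byte.
        # Whether src holds wave samples or a vertex buffer, the engine
        # doesn't care; it's bytes in, bytes out.
        for i in range(n):
            self.ram[dst + i] = self.ram[src + i] ^ 0xFF

ram = bytearray(16)
ram[0:4] = b"\x00\x01\x02\x03"    # pretend this is a vertex buffer
engine = ToyDmaEngine(ram)
engine.write_reg("SRC", 0)
engine.write_reg("DST", 8)
engine.write_reg("LEN", 4)
engine.write_reg("GO", 1)         # "kick" the engine
print(ram[8:12].hex())            # prints "fffefdfc" - the transformed copy
```

The same shape (set source, destination, length, then kick a doorbell register) shows up in NIC, sound, and GPU drivers alike, which is why studying a simple driver first transfers so well.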

                              Comment
