The Status Of Gallium3D Drivers, State Trackers

  • The Status Of Gallium3D Drivers, State Trackers

    Phoronix: The Status Of Gallium3D Drivers, State Trackers

    With the official documentation for the Gallium3D driver architecture being a bit dated, Corbin Simpson (a student X.Org developer that has largely been working on the Gallium3D driver for ATI R300 class hardware) set out to improve the situation. On the X.Org Wiki is now a Gallium3D status page that shows the current status of Gallium3D state trackers and pipes...

    http://www.phoronix.com/vr.php?view=NzQzOA

  • #2
    Wasn't VMWare going to release a Gallium state tracker for ATI GPUs this summer?

    • #3
      MESA support for Cell:

      http://mesa3d.org/cell.html

      • #4
        A few things.

        - There is a winsys for ATI/AMD GPUs, it's called radeon and it lives in src/gallium/winsys/radeon. glisse wrote it, I refined it.

        - This matrix is still not filled out. r300g is probably the worst-off driver right now; softpipe is the best. I just didn't know the status of the other drivers. Jakob has already filled in i915, and I'm sure somebody can fill in the nouveau drivers.

        - A few of us have already started talking about how to deal with status updates and code changes, so the matrix will probably shift a lot in the next few weeks. This shouldn't be construed as big amounts of new development, just trying to make sense of what's already been written.

        ~ C.

        • #5
          Is there a howto/tutorial/initial guide for writing drivers and state-trackers for Gallium3D? Something that can guide new developers not familiar with the architecture.

          I assume softpipe is a good start, but other than looking at the source code, are there any graphs/diagrams explaining the relationship between components? I've only seen the high-level diagrams from the Akademy 2008 presentation on Gallium3D:
          http://akademy2008.kde.org/conferenc...kademy2008.pdf
          which is a good start, but not enough I think.

          Such a tutorial should include instructions on how to set up Gallium3D (for instance, how do you use softpipe?), how to prepare the build environment, and the diagrams I mentioned above (with relevant text, of course).
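          For what it's worth, here's a minimal sketch of selecting softpipe at runtime, assuming a Mesa build with the Gallium software drivers installed (the environment variable names below are how current Mesa selects drivers; older builds may differ, so treat this as an assumption rather than gospel):

```shell
# Select software rendering and pick the softpipe Gallium driver;
# any GL application launched from this shell will then use it.
export LIBGL_ALWAYS_SOFTWARE=1
export GALLIUM_DRIVER=softpipe
# then run any GL app to check, e.g.:
#   glxinfo | grep renderer    # should mention softpipe
```

          The nice part is that nothing needs to be rebuilt; the driver choice is purely a runtime switch.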

          what do you think? Too early for this? (APIs et al. in flux, etc)
          Last edited by ioannis; 08-08-2009, 06:00 AM.

          • #6
            I'd be very interested to hear what slide 23 is actually about. IMO you could read it as saying that you could have Direct3D over Gallium3D on Linux (just as easily as OpenGL over Gallium3D on Windows); did Tungsten Graphics actually mean that in 2008?
            Nevermind, commented on elsewhere. Apparently the slide is misleading.
            Last edited by nanonyme; 08-08-2009, 07:22 AM.

            • #7
              Would this not theoretically mean we can get rid of WineD3D? Does this have the potential to yield higher performance than Microsoft's implementation?

              • #8
                Originally posted by wswartzendruber View Post
                Would this not theoretically mean we can get rid of WineD3D? Does this have the potential to yield higher performance than Microsoft's implementation?
                Native driver support will always be faster than any compatibility layer, so it would have the potential to yield higher performance.
                There are just two problems here:
                We have OpenGL 1.x, 2.x and 3.x, all of which need a huge amount of time to implement, plus OpenCL and video decoding (I probably missed some other state trackers). Adding a D3D state tracker would make things even more difficult, and we just don't have enough developers (and the few developers we have don't have enough time) to do it.
                Also keep in mind that any binary driver is unlikely to ever implement native D3D support for Linux; still, a compatibility layer running on a binary driver's OpenGL implementation will always be faster than any OSS D3D support (fine tuning and stuff).
                The second point is that Wine probably wouldn't use the native support anyway, for a number of reasons. For one, it'd be hard to implement D3D support in a way that covers everything from D3D7 to D3D10. Also, we'd still need to keep the old code around for compatibility purposes - e.g. for non-Gallium3D OSS drivers and the binary blobs.

                The whole "implement D3D using Gallium3D" thing would basically just lead to more work for little (if any) performance gain; time would be better spent improving the current drivers and the compatibility layer.

                • #9
                  Originally posted by wswartzendruber View Post
                  Would this not theoretically mean we can get rid of WineD3D? Does this have the potential to yield higher performance than Microsoft's implementation?
                  Theoretically at best the same. The slide deck claims that the Windows driver stack is pretty much the same, architecture-wise, as what you'd get if you had Gallium3D and then implemented the D3D API on top of it (probably using Wine, since there are problems with D3D requiring parts of WinAPI, and graphics developers pretty much don't want a WinAPI implementation in the graphics drivers).
                  We can face the reality that we'll forever want something like Wine on Linux, but exactly where this layer is optimal is still a bit of an open question. It would need to be evaluated whether integrating more deeply into Gallium3D would bring benefits (especially an HLSL->TGSI mapping would be interesting, since then you wouldn't have to do HLSL->GLSL->TGSI; I've been told GLSL and HLSL don't map optimally anyway, but TGSI is designed to be more generic, so it could actually work better there).
                  Also from Wine wiki: "WineD3D eats a number of shader constants for emulating d3d/opengl differences. This causes issues for games that use the maximum number of shader constants (especially SM3.0 games). This causes issues on Geforce6/7 and Radeon X1*00 cards which offer 256 vertex constants of which Wine eats easily 20 and the games expect they can use all 256.."
                  Last edited by nanonyme; 08-08-2009, 09:33 AM.

                  • #10
                    Originally posted by NeoBrain View Post
                    Native driver support will always be faster than any compatibility layer, so it would have the potential to yield higher performance.
                    You don't believe getting rid of an emulation level would yield higher performance? (as in, you wouldn't have to do D3D->OpenGL conversion anymore)
                    Originally posted by NeoBrain View Post
                    There are just two problems here:
                    We have OpenGL 1.x, 2.x and 3.x, all of which need a huge amount of time to implement, plus OpenCL and video decoding (I probably missed some other state trackers). Adding a D3D state tracker would make things even more difficult, and we just don't have enough developers (and the few developers we have don't have enough time) to do it.
                    Are you aware that state trackers are separate from each other? It's mostly a matter of whether Wine developers want to keep using OpenGL for D3D or not.
                    Originally posted by NeoBrain View Post
                    Also keep in mind that any binary driver is unlikely to ever implement native D3D support for Linux; still, a compatibility layer running on a binary driver's OpenGL implementation will always be faster than any OSS D3D support (fine tuning and stuff).
                    The second point is that Wine probably wouldn't use the native support anyway, for a number of reasons. For one, it'd be hard to implement D3D support in a way that covers everything from D3D7 to D3D10. Also, we'd still need to keep the old code around for compatibility purposes - e.g. for non-Gallium3D OSS drivers and the binary blobs.
                    Right, so we can't go for the likely better implementation (not using OpenGL) just because closed drivers aren't going to support it (they only want to use OpenGL)? Since when did Linux userland development depend on the whims of proprietary driver coders? This is as close to native D3D as you can get.
                    Last edited by nanonyme; 08-08-2009, 10:00 AM.

                    • #11
                      I was thinking of VirtualBox.

                      • #12
                        Originally posted by wswartzendruber View Post
                        I was thinking of VirtualBox.
                        That'd be less native, not more.
                        Edit: Technically it might work in time if hooked up with Gallium3D, but I've heard very contradictory opinions on whether these new CPU virtualization technologies actually help with speed at all.
                        Last edited by nanonyme; 08-08-2009, 10:06 AM.

                        • #13
                          All signs point to Tungsten/VMWare having an internal DirectDraw/Direct3D state tracker already, but they haven't confirmed this.

                          Mesa's core has already been adapted into a state tracker for OpenGL 1.0-2.1 support. A few developers (mostly idr and osiris) have been submitting patches for various extensions belonging to OpenGL 3.x. Fortunately, it looks like we're not too far behind the ball this time; the important stuff, like FBO handling changes, will be really easy to adapt to, since we're using real memory managers on the Gallium drivers.

                          I seem to recall zackr mentioning that somebody was working on an OpenCL Gallium state tracker. Also, the preferred video decoding interface would be VDPAU; supporting it or either of the other two advanced video APIs (VA-API, XvBA) will require augmenting g3dvl with a true pipeline. (If you're serious about this, ping me and ymanton on IRC. #nouveau or #dri-devel.)

                          Regarding writing Gallium code... There's really no way around reading the code, but I'll give you a few places to start. Read the headers in src/gallium/include/pipe/p_*.h. The main class of drivers is pipe_context, which contains a few dozen methods for setting state and issuing draw commands. There's also pipe_screen, which contains things like the capabilities of the card you're on, and buffer management.

                          If you're writing a driver, you'll probably fill out your pipe_screen, and then your pipe_context, and most of the code you write will be for managing the state connected to the pipe_context.

                          If you're writing a state tracker, you'll want to get a pipe_screen and pipe_context, and then send drawing commands to it, while interfacing with your frontend.

                          The API is constantly shifting bit by bit to be more sane, but all drivers and state trackers that are in-tree enjoy maintenance and testing on a semi-regular basis, just like the kernel, so it's really not that bad. (I don't know anybody actually using the Cell driver except for the occasional guy on the mailing list, but it still receives updates and API fixes.)

                          Again, if you're serious about writing code, come into #dri-devel; we're more than happy to help you start understanding the code. :3

                          ~ C.

                          • #14
                            Originally posted by MostAwesomeDude View Post
                            All signs point to Tungsten/VMWare having an internal DirectDraw/Direct3D state tracker already, but they haven't confirmed this.
                            Wonder if it's a "DirectDraw/Direct3D-like" state tracker or the real thing. (Just wondering, because with the former you don't need WinAPI at all, but you might still get trivial mappings of functions, a less trivial but still possible mapping of HLSL to TGSI, and it would work with all Gallium3D drivers.)
                            As in, something that's easy to wrap up with VMWare virtualization products later on to get a code path without OpenGL in-between there. Would be a good thing in any case.

                            • #15
                              Originally posted by nanonyme View Post
                              As in, something that's easy to wrap up with VMWare virtualization products later on to get a code path without OpenGL in-between there. Would be a good thing in any case.
                               Would MS be able to forbid implementations other than their own in the future?
