Gallium3D Confusion: State Tracker and DRM

  • Gallium3D Confusion: State Tracker and DRM

    So Wikipedia says:

    Gallium3D provides a unified API exposing standard hardware functions such as shader units found on modern hardware. Thus, 3D APIs such as OpenGL 1.x/2.x, OpenGL 3.x, OpenVG, GPGPU infrastructure or even Direct3D (as found in the Wine compatibility layer) will need only a single back-end, called state tracker, targeting Gallium3D API. By contrast Mesa 3D requires a different backend for each hardware platform...
    This leads me to believe that Gallium3D talks directly to DRI2. But the article says later on that Intel and Cell drivers are still under development.

    So...

    Does Gallium3D talk directly to DRI2, or is there another layer in between for each hardware driver?

  • #2
    My understanding is that Gallium3D talks directly to DRI2, since DRI2 is an X server protocol, not a driver-level protocol.

    I haven't walked through that part of the code, however.
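
    Since DRI2 is just another X extension, a client can probe for it the same way it probes for any other extension. A quick, untested sketch using plain Xlib (assuming the extension registers itself under the name "DRI2"):

    Code:
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        int opcode, event, error;

        /* DRI2 shows up as a regular X extension, so the generic
           extension query is enough to detect it. */
        if (dpy && XQueryExtension(dpy, "DRI2", &opcode, &event, &error))
            printf("DRI2 extension present (opcode %d)\n", opcode);
        else
            printf("no DRI2 extension\n");
        return 0;
    }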
    Last edited by bridgman; 04-10-2009, 02:14 PM.

    • #3
      Originally posted by bridgman View Post
      My understanding is that Gallium3D talks directly to DRI2, since DRI2 is an X server protocol, not a driver-level protocol.

      I haven't walked through that part of the code, however.
      So I would think then that all you need to make Gallium3D work is a DRI2 driver. Or wait, maybe DRI2 is, in fact, Mesa. I was thinking that Gallium3D might be able to use the kernel's DRM drivers directly.

      • #4
        Hold on thar'

        One of the great sources of confusion with X drivers is that people talk about "DRI drivers" when they really mean "the 3D driver, which uses DRI to find out where it should draw on the screen". 99% of the 3D driver has nothing to do with DRI.

        It depends on whether you are talking about the DRI protocol or the DRI architecture, I guess.

        What you need to make Gallium3D work on a new GPU is a pipe driver and a winsys driver. Pipe drivers (e.g. gallium/drivers/r300) are totally GPU-specific, while winsys drivers (e.g. gallium/winsys/drm/radeon) are theoretically only OS- and environment-specific, but actually end up being a bit HW-specific as well, because DRM implementations vary from one GPU to the next. (There's a rough sketch of the layering at the end of this post.)

        Gallium3D drivers should be able to use the existing DRM drivers -- but in practice most of the existing DRM drivers were built for DRI1 and have no memory management, while the current Gallium3D implementations assume the existence of DRI2 and a kernel memory manager. Six months from now I expect that Gallium3D drivers and classic Mesa drivers will be using the same DRM drivers.
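
        To make the pipe/winsys split concrete, here's a very loose sketch of the layering. The struct and function names are invented for illustration; they are not the real Mesa headers.

        Code:
        /* Hypothetical sketch of the Gallium3D layering; names are
           made up for illustration, not the actual Mesa API. */

        /* winsys: OS/window-system glue -- buffer allocation and
           command submission, written per environment */
        struct toy_winsys {
            void *(*buffer_create)(struct toy_winsys *ws, unsigned size);
            void  (*submit)(struct toy_winsys *ws,
                            const void *cmds, unsigned len);
        };

        /* pipe driver: GPU-specific; turns state and shaders into
           hardware commands, but reaches the hardware only through
           the winsys it was created with */
        struct toy_pipe_screen {
            struct toy_winsys *winsys;
            const char *(*get_name)(struct toy_pipe_screen *screen);
        };

        /* A state tracker (OpenGL, OpenVG, ...) sits on top and only
           ever sees the pipe interface. */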
        Last edited by bridgman; 04-10-2009, 02:46 PM.

        • #5
          So, in theory, if Wine were to move its Direct3D support directly onto Gallium, would it be able to use the parts of the DirectX API that report hardware information, for which OpenGL has no equivalent?

          I remember from a while ago that Wine has to fake the hardware information, because OpenGL does not provide an API equivalent to the DirectX one for querying VRAM, chip type, etc.
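
          If I remember right, the closest thing on the GL side is a vendor extension such as GL_ATI_meminfo, which a program has to probe for explicitly. An untested sketch of what that looks like (requires a current GL context):

          Code:
          #include <GL/gl.h>
          #include <string.h>
          #include <stdio.h>

          /* GL_ATI_meminfo token, defined here in case the headers lack it */
          #ifndef GL_TEXTURE_FREE_MEMORY_ATI
          #define GL_TEXTURE_FREE_MEMORY_ATI 0x87FC
          #endif

          void print_free_vram(void)
          {
              const char *exts = (const char *)glGetString(GL_EXTENSIONS);

              if (exts && strstr(exts, "GL_ATI_meminfo")) {
                  GLint mem[4]; /* free-memory stats, reported in KB */
                  glGetIntegerv(GL_TEXTURE_FREE_MEMORY_ATI, mem);
                  printf("free texture memory: %d KB\n", mem[0]);
              } else {
                  printf("no VRAM query available\n"); /* Wine has to guess */
              }
          }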

          • #6
            There are no plans inside Wine to implement DirectX on top of Gallium3D.

            Wine is meant to support platforms that don't use Gallium3D (Apple, *BSD, the DX replacement in VirtualBox, etc.), so the D3D->OGL wrapper needs to be maintained anyway.

            Yeah, making more hardware information available would probably help Wine, but it doesn't take a whole DX implementation to do that.

            • #7
              Originally posted by rohcQaH View Post
              There are no plans inside Wine to implement DirectX on top of Gallium3D.

              Wine is meant to support platforms that don't use Gallium3D (Apple, *BSD, the DX replacement in VirtualBox, etc.), so the D3D->OGL wrapper needs to be maintained anyway.

              Yeah, making more hardware information available would probably help Wine, but it doesn't take a whole DX implementation to do that.
              That is why the question was theoretical. Practically speaking, the Wine project would never move DirectX to Gallium3D, for the reason you stated. However, Wine could hook into the Gallium framework to get information like that if it were possible, just as it could with the respective frameworks on Mac OS X, BSD, and even Windows.

              • #8
                I wonder if Gallium3D can be used with Wayland.

                • #9
                  Originally posted by wswartzendruber View Post
                  I wonder if Gallium3D can be used with Wayland.
                  I don't see why not. G3D is a part of Mesa, not a part of X. Since Wayland allows the use of Mesa, G3D should be usable.
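
                  Wayland is still very new, but assuming Mesa's EGL code grows a Wayland backend, I'd expect bringing up a context to look roughly like this (hypothetical, untested sketch):

                  Code:
                  #include <wayland-client.h>
                  #include <EGL/egl.h>

                  int main(void)
                  {
                      /* connect to the Wayland compositor instead of an X server */
                      struct wl_display *wl = wl_display_connect(NULL);

                      /* hand the native display to EGL; Mesa (and Gallium3D
                         underneath it) takes over from here */
                      EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)wl);
                      EGLint major, minor;

                      if (wl && eglInitialize(dpy, &major, &minor)) {
                          /* create an EGLContext and EGLSurface as usual... */
                      }
                      return 0;
                  }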

                  • #10
                    Nice. Because it seems like desktop Linux could lose a lot of bloat if we ditched X11.

                    • #11
                      Originally posted by rohcQaH View Post
                      There are no plans inside Wine to implement DirectX on top of Gallium3D.

                      Wine is meant to support platforms that don't use Gallium3D (Apple, *BSD, the DX replacement in VirtualBox, etc.), so the D3D->OGL wrapper needs to be maintained anyway.
                      I was under the impression that most desktop *nixes (including the *BSDs and Solaris) would slowly be moving to Gallium3D. I suspect having VirtualBox talk directly to a Gallium3D DX state tracker would be far better than using Wine's DX libraries to convert DX calls to OpenGL calls and then pass them on to Gallium3D as OpenGL.

                      I asked developers about this very thing toward the beginning of this year, and the reason they gave is that DX requires sections of the WinAPI; that's the real reason it would be troublesome to implement on Gallium3D.

                      Another idea I've heard is Wine skipping Gallium3D altogether and implementing DX by talking to the GPU over the GEMified DRM, but that's another story... (It likely wouldn't ever work with closed-source drivers anyway, so given nVidia's stance it's kind of moot.)

                      Getting off topic from Gallium3D, so I'll stop here.

                      • #12
                        Originally posted by wswartzendruber View Post
                        Nice. Because it seems like desktop Linux could lose a lot of bloat if we ditched X11.
                        Bloat doesn't come from X11. Enlightenment and XFCE are proof positive of that.

                        • #13
                          Originally posted by Svartalf View Post
                          Bloat doesn't come from X11. Enlightenment and XFCE are proof positive of that.
                          X11 isn't the bloat; X11 is a very good protocol for the purpose it was designed for. The problem is that the X server is getting bloated because people are trying to mold the X11 protocol into something it isn't. Frameworks like Gallium that move stuff out of the X server are a good thing. Frankly, I would like to see a new revision of the X protocol to suit the new demands on graphics these days. An X12, if you will. I doubt we will see it any time soon, but I think it will eventually happen. Graphics demands are changing rapidly, and something will eventually have to give. The question is: will it be the platform or the devs?

                          • #14
                            Originally posted by King InuYasha View Post
                            The problem is that the X server is getting bloated because people are trying to mold the X11 protocol into something it isn't.

                            ... what? Do you have anything resembling technical grounds for claims like that?

                            The X server does relatively little, and is one of the lightest pieces in the entire desktop stack. It handles input, window area management, pixmap management, and a handful of rendering protocols. That's about it.

                            Even if you cut X out of the picture, you just have to replace it with something that does 80% of the exact same thing, and the other 20% doesn't actually go away; it just gets moved somewhere else.

                            Or you get something like Wayland, which -- while cool, yes -- is inherently less flexible and more error-prone, as it requires far more functionality to be built into the display server itself, since it doesn't rely on separate window manager or compositing manager processes.

                            On top of it all, X is modular. If you do think some particular X extension is bloat, don't worry -- it can be removed. E.g., if you think RENDER is bloat, the good news is that once RENDER is no longer a real gain, it can just be dropped from the server (it's not core protocol, and hence applications/toolkits are already required to check for it before using it) and the toolkits can drop support for it as well. Same as if you designed an X12, except you don't have to arbitrarily break the whole desktop in the process.
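
                            That check is a one-liner; something like this, using libXrender (untested sketch):

                            Code:
                            #include <X11/Xlib.h>
                            #include <X11/extensions/Xrender.h>
                            #include <stdio.h>

                            int main(void)
                            {
                                Display *dpy = XOpenDisplay(NULL);
                                int event_base, error_base;

                                /* RENDER is not core protocol, so clients must
                                   probe for it and fall back to core rendering
                                   requests when it is absent. */
                                if (dpy && XRenderQueryExtension(dpy, &event_base,
                                                                 &error_base))
                                    printf("RENDER available, use it\n");
                                else
                                    printf("no RENDER, fall back to core\n");
                                return 0;
                            }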

                            If you really, really just want a new protocol Just Because(tm), then you probably want to look at the aforementioned Wayland.
