Gallium3D Now In Mainline Mesa Code-Base!


  • Gallium3D Now In Mainline Mesa Code-Base!

    Phoronix: Gallium3D Now In Mainline Mesa Code-Base!

    Gallium3D, the 3D graphics driver architecture that has long been in development by Tungsten Graphics, has finally entered the mainline Mesa code-base! Gallium3D has a lot of capabilities and will be of much benefit to Linux desktop users once all 3D drivers have been ported to this new architecture (for more information read our articles or the Tungsten Wiki). We shared yesterday that Gallium3D was in the process of being merged, and that merge has now been completed in Mesa's master branch...

    http://www.phoronix.com/vr.php?view=NzA1Ng

  • #2
    This is actually a really important milestone for open source 3D drivers; while Gallium3D was hanging out in a branch it wasn't really a practical solution for mainstream drivers. Now that it's in master I think you'll see everyone pile on and start working towards shipping Gallium3D-based drivers. Congrats to all involved.

    • #3
      Originally posted by bridgman View Post
      This is actually a really important milestone for open source 3D drivers; while Gallium3D was hanging out in a branch it wasn't really a practical solution for mainstream drivers. Now that it's in master I think you'll see everyone pile on and start working towards shipping Gallium3D-based drivers. Congrats to all involved.
      Indeed. It translates into something that could get within spitting distance of the peak performance of the proprietary drivers. It makes the story your employer's got more compelling, don't you think?

      • #4
        big props to the developers

        • #5
          Originally posted by Svartalf View Post
          Indeed. It translates into something that could get within spitting distance of the peak performance of the proprietary drivers. It makes the story your employer's got more compelling, don't you think?
          Yep; without Gallium3D our comment that we thought open source 3D could fairly easily get to 60-70% of fglrx performance would be pretty lame. The last 30% will be hard though and I don't think anyone will bother; about half of the difference comes from a very sophisticated shader compiler, and the other half comes from constant bottom-to-top tuning and optimizing of the stack, from the GL API down to the bottom of the memory manager and command submission code.

          That said, if you can run a modern GPU at 60-70% of fglrx performance you're probably gonna be CPU-limited on anything but a CAD workstation app anyways
          Last edited by bridgman; 02-11-2009, 11:39 AM.

          • #6
            Congrats to all involved!

            • #7
              Originally posted by bridgman View Post
              Yep; without Gallium3D our comment that we thought open source 3D could fairly easily get to 60-70% of fglrx performance would be pretty lame. The last 30% will be hard though and I don't think anyone will bother; about half of the difference comes from a very sophisticated shader compiler, and the other half comes from constant bottom-to-top tuning and optimizing of the stack, from the GL API down to the bottom of the memory manager and command submission code.

              That said, if you can run a modern GPU at 60-70% of fglrx performance you're probably gonna be CPU-limited on anything but a CAD workstation app anyways
              That last 30% makes a difference when you're using an integrated card that barely performs (specifically, my Xpress 1100).

              • #8
                How mature is the Cell driver? Are we talking Compiz on the PS3 by Klutzy?

                • #9
                  benches!!111!!1!
                  However, I wonder why tuning the different pipes and Mesa (Intel is going to do this) shouldn't also help improve Radeon's Gallium3D performance.
                  Last edited by Regenwald; 02-11-2009, 01:54 PM.

                  • #10
                    The "pipe" driver is the primary hardware-dependent code, so tuning a cell or intel pipe driver wouldn't affect radeon. On the other hand, under Gallium3D the amount of hardware-dependent code is smaller, and really is a good match with the stuff that is different from one GPU to the next.

                    The "winsys" driver used to be a mix of hardware-dependent (command submission) and hardware independent (window interface) functions, but AFAIK that has now been broken up to isolate the hw-dependent code in a separate module from the rest.
                    Last edited by bridgman; 02-11-2009, 02:32 PM.

                    • #11
                      Originally posted by bridgman View Post
                      The "pipe" driver is the primary hardware-dependent code, so tuning a cell or intel pipe driver wouldn't affect radeon. On the other hand, under Gallium3D the amount of hardware-dependent code is smaller, and really is a good match with the stuff that is different from one GPU to the next.

                      The "winsys" driver used to be a mix of hardware-dependent (command submission) and hardware independent (window interface) functions, but AFAIK that has now been broken up to isolate the hw-dependent code in a separate module from the rest.
                      Well, if someone tunes, say, the OpenGL implementation in Mesa or the state trackers (not sure I have the terms right; I mean the OpenCL, OpenVG and 2D modules in Gallium3D that sit on top of the pipe driver), then you gain there too, as in the sketch below. So you could "move" all your tuning effort from the fglrx code base to Gallium and the state trackers.
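                      A hypothetical illustration of that point: the state tracker only ever talks to the generic pipe interface, so an optimization done there (say, skipping redundant state changes) is inherited by every pipe driver, radeon included. The names are made up for the example:

                      /* Generic interface every GPU's pipe driver implements. */
                      struct example_pipe {
                          void (*set_blend_state)(struct example_pipe *pipe, const void *blend);
                      };

                      static const void *last_blend;   /* tracker-side cache, hardware-independent */

                      /* Tuning done once here, in the shared state tracker, skips a driver
                       * call for redundant state on r300, i915, nv50 and Cell alike. */
                      void tracker_set_blend(struct example_pipe *pipe, const void *blend)
                      {
                          if (blend == last_blend)
                              return;
                          last_blend = blend;
                          pipe->set_blend_state(pipe, blend);
                      }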

                      • #12
                        Yes and no. That would work if we had a large Linux-specific performance team, or if we used Gallium3D and Mesa for our Windows and MacOS drivers, but we don't do either of those today.

                        What we do instead is share big chunks of code across multiple OSes, so that tuning work done for one OS benefits users of the other OSes as well. If we had a completely separate code base for Linux drivers then the arguments about "dumping fglrx and concentrating on open source drivers" would make a lot more sense.
                        Last edited by bridgman; 02-11-2009, 02:46 PM.

                        • #13
                          Hm, and what about fglrx moving to Gallium3D? Are there already plans? Do you have a strategy for how/when to do this? It's going to be difficult to share code, isn't it? Perhaps then the time for a fully open-source driver will come...

                          However, some day we may have to thank Intel for the VIA/Nouveau/ATI drivers being that fast...

                          • #14
                            People wanting performance are more interested in an fglrx with fewer bugs. If I say "fglrx doesn't support this and that", they say "use the open source drivers". But then I say "they suck ass; they're slow" and we're back to square one. If all they can do is utilize only 70% of my $300 card, then no thanks.

                            • #15
                              We moved our 3D stacks onto something similar to Gallium3D a few years ago; the big 3D performance jump in September '07 came from the new OpenGL stack hitting the fglrx driver. I don't think there would be any real benefit to changing again.
