Mesa Receives Some OpenGL 3 Love


  • #16
    If anyone here is using Arch Linux, I have a great PKGBUILD script for mesa-git which compiles the r300g driver and installs it. Switching from the classic driver to Gallium (and vice versa) whenever you wish is very convenient: a simple terminal command does the trick.
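The PKGBUILD itself isn't shown, but a switch like this usually comes down to repointing which `r300_dri.so` gets loaded. A minimal sketch of that idea in Python; the side-by-side naming scheme (`r300_dri.so.gallium` / `r300_dri.so.classic`) and the directory layout are my assumptions for illustration, not details from the post:

```python
import os
import tempfile

def switch_driver(dri_dir: str, target: str) -> str:
    """Repoint the active r300_dri.so symlink at the chosen build.

    target is "gallium" or "classic"; the two builds are assumed
    (hypothetically) to be installed side by side as
    r300_dri.so.gallium / r300_dri.so.classic.
    """
    link = os.path.join(dri_dir, "r300_dri.so")
    source = link + "." + target
    if not os.path.exists(source):
        raise FileNotFoundError(source)
    if os.path.islink(link) or os.path.exists(link):
        os.remove(link)
    os.symlink(source, link)
    return os.readlink(link)

# Demo against a throwaway directory standing in for the real DRI dir.
with tempfile.TemporaryDirectory() as dri:
    for build in ("gallium", "classic"):
        open(os.path.join(dri, "r300_dri.so." + build), "w").close()
    print(switch_driver(dri, "gallium"))  # path ending in .gallium
    print(switch_driver(dri, "classic"))  # path ending in .classic
```

On a real system the same swap would of course need root and the actual driver directory, but the mechanism is just this: one symlink, no rebuild.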



    • #17
      Originally posted by FireBurn View Post
      But is replacing it a long term goal?
      Not really. It may be useful to fork mesa, rip out everything not needed for G3D and use it, but that's not a replacement per se.



      • #18
        Originally posted by FireBurn View Post
        But is replacing it a long term goal?

        KMS replaced UMS once it became stable in the DDX driver (yes, that was not as much code), and I'm guessing the UMS code will be ripped out of the kernel as soon as Linus lets the developers; well, for Intel anyway
        I am not aware of any plans to replace Mesa either in the short term or the long term. The Mesa source code tree includes a number of HW driver subtrees; the intent is to replace many of those existing HW drivers with Gallium3D-based HW drivers over time *and* to use those Gallium3D-based drivers for other things than just Mesa.

        Originally posted by FireBurn View Post
        I'm kind of disappointed that the Nouveau folk changed their minds about using Gallium for the fixed pipeline cards. It would have been nice if eventually all cards were supported natively under Gallium
        The problem is that Gallium3D is really designed for GPUs with hardware shaders; running it on a fixed pipeline GPU would be an interesting science project (and might even be useful) but right now I think the dev priority is getting the things that are *supposed* to work running well.

        Originally posted by FireBurn View Post
        I'd really like to test Gallium and learn how to add to it, both on the desktop and my PS3 (cell driver). Do you know a good place to start?
        I would definitely start with GL (Mesa) on whatever supported GPU you have. Right now I think the NVidia (esp NV50) and ATI chips (esp 3xx-5xx) have the best support. I don't know if the cell driver can be used directly on a PS3 today; maybe start with the desktop.

        It might be a bit early to test the xorg state tracker (a DDX that uses Gallium3D for acceleration rather than GPU-specific code).

        Originally posted by FireBurn View Post
        Also what's the Python statetracker? Also what's the difference between llvmpipe and gallivm? (The v isn't a typo)
        The Python state tracker seems to be a "Gallium3D binding" for Python, i.e. it allows you to write Python programs which make Gallium3D calls. Seems like a great way to learn how Gallium3D works...
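I haven't used the real binding, so every name below is invented; this self-contained toy only illustrates the call pattern such a binding would expose: a Python script creating a context and issuing Gallium-style state and draw calls, which here just get recorded as a command stream.

```python
class MockPipeContext:
    """Toy stand-in for a Gallium pipe context.

    The real Python state tracker's class and method names differ;
    this mock only records calls so the flow is visible.
    """
    def __init__(self):
        self.commands = []  # recorded command stream

    def set_blend(self, enable):
        self.commands.append(("set_blend", enable))

    def set_vertex_buffer(self, vertices):
        self.commands.append(("set_vertex_buffer", len(vertices)))

    def draw_arrays(self, primitive, start, count):
        self.commands.append(("draw_arrays", primitive, start, count))

# A "script" in the style such a binding would allow:
ctx = MockPipeContext()
ctx.set_blend(False)
ctx.set_vertex_buffer([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)])
ctx.draw_arrays("TRIANGLES", 0, 3)
print(ctx.commands)
```

The point is the shape of the API: no GL context, no window system, just pipe-level state changes followed by a draw, which is exactly why it would be a good learning tool.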

        AFAIK llvmpipe is a software renderer which uses LLVM to translate shader programs into optimized CPU code, while gallivm does the same thing but generates GPU shader code and is part of a hardware-accelerated driver. Stephane Marchesin's slides from FOSDEM 2009 mention gallivm:

        http://people.freedesktop.org/~march...09-g3dllvm.pdf
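As an analogy only (this has nothing to do with the actual gallivm code): the trick llvmpipe relies on is compiling a shader once into executable code instead of re-interpreting it for every pixel. In Python terms, with bytecode standing in for LLVM-generated machine code:

```python
def jit_shader(expression: str):
    """Compile a fragment-shader-like expression into a callable once,
    rather than re-parsing it per pixel -- the llvmpipe idea, with
    Python's compile() standing in for LLVM code generation."""
    code = compile(expression, "<shader>", "eval")
    def shader(x, y):
        return eval(code, {"x": x, "y": y})
    return shader

# A toy "shader" that brightens with distance from the origin.
shade = jit_shader("min(1.0, x * x + y * y)")
print([shade(x / 2, 0.0) for x in range(3)])  # [0.0, 0.25, 1.0]
```

The difference the post describes is then just where the compiled output runs: llvmpipe executes it on the CPU, while gallivm (per those slides) targets GPU shader code inside a hardware driver.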



        • #19
          Not using Mesa as a state tracker does seem to be a long term goal:

          Add a pure Gallium state tracker for OpenGL 3.0. Right now, Gallium uses Mesa as its state tracker. However, since the Mesa source code also implements software rendering, as well as old-style DRI drivers, this results in a lot of cruft and in particular holds back the addition of new OpenGL features (as those features must be supported in the whole mesa first). The performance of Gallium also suffers, since the state tracker does a lot of things that are later on duplicated in Gallium. This project involves duplicating the current Mesa state tracker and removing all the legacy bits. Then the student will add the necessary state tracker functionality for supporting OpenGL 3.0.
          http://wiki.x.org/wiki/SummerOfCodeIdeas



          • #20
            I should mention that the emulated GPU on VMWare clients probably has the *best* Gallium3D support right now, but I imagine you want to run on a real GPU, not the figment of some developers' imagination.



            • #21
              Originally posted by whizse View Post
              Not using Mesa as a state tracker does seem to be a long term goal:

              http://wiki.x.org/wiki/SummerOfCodeIdeas
              I read that differently - no mention of getting rid of Mesa, just getting rid of Mesa's old low-level code (software renderer and pre-Gallium3D driver interfaces).

              This would keep the "GL to hardware layer" portions of Mesa (ie the state tracker) and use Gallium3D as the only hardware layer, so software rendering would use Gallium3D and softpipe/llvmpipe instead of the "classic" software renderer.

              In other words, it would get rid of everything in Mesa that *wasn't* part of the state tracker.
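Under that reading, driver selection reduces to probing Gallium3D pipe drivers and falling back to a software pipe when no hardware driver binds. A toy sketch of that fallback chain (the driver names and probe order here are illustrative, not Mesa's actual logic):

```python
def create_pipe_screen(available, probe_order=("r300g", "nv50", "svga")):
    """Pick the first hardware pipe driver that binds to the GPU,
    falling back to the software rasterizer.

    This mirrors the shape the post describes: Gallium3D is the only
    hardware layer, and software rendering goes through it too.
    """
    for name in probe_order:
        if name in available:
            return name
    return "softpipe"  # software path, also a Gallium3D pipe driver

print(create_pipe_screen({"r300g"}))  # r300g
print(create_pipe_screen(set()))      # softpipe
```

The design win is that the "classic" swrast renderer disappears as a special case: the no-hardware path is just one more pipe driver.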



              • #22
                Aha, thanks for the explanation!



                • #23
                  Originally posted by bridgman View Post
                  I should mention that the emulated GPU on VMWare clients probably has the *best* Gallium3D support right now, but I imagine you want to run on a real GPU, not the figment of some developers' imagination.
                  Is there anybody out there who already uses the Gallium stack on VMWare? And are there any (working) examples of the Python bindings? I'd really like to play with Gallium as well.



                  • #24
                    One thing that's not obvious is just how big and complicated the hardware *independent* part of an OpenGL driver is. Mesa has almost a million lines of hardware-independent code, and proprietary OpenGL drivers are *much* larger.

                    The hardware-dependent part is tiny by comparison - maybe ~20,000 lines for an older GPU and ~50,000 lines for a newer GPU ("classic" mesa HW drivers in both cases), with generally smaller numbers for a Gallium3D driver.

                    Even a 20,000 LOC driver is a non-trivial development effort, of course - maybe 6-7 developer-years for "finished" code, or 2-3 developer-years to get to the level of current "production-ish" open source drivers.

                    Extrapolate that to a code base the size of Mesa, and you see why Mesa keeps evolving rather than being "replaced". That said, Mesa has been evolving for 17 years now and I imagine that most of the code has been completely replaced at least once during that time.



                    • #25
                      The Ohloh site includes rough estimates of how long each project would take to rewrite from scratch, based primarily on code size. The estimate for Mesa is 291 developer-years:

                      http://www.ohloh.net/p/mesa

                      Not something you knock together over a weekend.
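Ohloh's figure looks like a basic COCOMO estimate, which you can roughly reproduce from the line counts in this thread. The constants 2.4 and 1.05 are the standard organic-mode COCOMO values; treating Mesa as about a million lines is the number quoted above, and that COCOMO is what Ohloh actually uses is my assumption:

```python
def cocomo_person_years(loc: int) -> float:
    """Basic COCOMO, organic mode: effort in person-months is
    2.4 * KLOC ** 1.05; divide by 12 to get person-years."""
    kloc = loc / 1000.0
    return 2.4 * kloc ** 1.05 / 12.0

print(round(cocomo_person_years(20_000), 1))     # a small classic HW driver
print(round(cocomo_person_years(1_000_000), 1))  # roughly all of Mesa
```

The million-line case comes out in the ballpark of Ohloh's 291 developer-years, while the 20 KLOC case lands somewhat below the 6-7 developer-year figure above; COCOMO assumes experienced, full-time staff, so that gap is not surprising.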

