ATI R300-R500 Gallium3D Driver Is "Mostly" Done

  • ATI R300-R500 Gallium3D Driver Is "Mostly" Done

    Phoronix: ATI R300-R500 Gallium3D Driver Is "Mostly" Done

    It has been a while since we last talked specifically about ATI's Gallium3D driver, but there is now some good news for the driver that supports the ATI R300 through R500 (Radeon X1000) series hardware. Corbin Simpson, the developer who has largely been porting the 3D work from the classic Mesa driver to Gallium3D, updated the Radeon Feature Matrix page on the X.Org Wiki last night. According to this update, the status of Gallium3D on ATI R300 through R500 hardware has changed from "TODO" and "WIP" to "DONE" for the Softpipe pass-through and "MOSTLY" for the core driver...

    http://www.phoronix.com/vr.php?view=NzYwMA

  • #2
    The ATI Gallium3D driver that supports the R600 and R700 (and R800) hardware is still being worked on as well.
    Could you share the source of that information? AFAIK AMD wants to /finish/ r300g first, to learn what should be avoided when writing r600g. I was even talking with Bridgman about this yesterday.
    So, where did you get that info?

    • #3
      Hello,

      thanks for the updates on the ATI driver status. Can someone explain what features Gallium is going to offer over the Mesa driver?

      I read the matrix, but I do not understand what parts of the Mesa driver are going to be replaced by the Gallium driver. For example, is it going to replace only the Mesa 3D features, or other parts of the matrix as well, like XAA/EXA acceleration and Xv/XvMC video support? I don't really understand how Gallium fits into the existing graphics stack.

      Also, I want to ask why there is no Gallium on R100 chips.

      Finally, I tried to check the feature matrix of the Intel driver at http://www.x.org/wiki/IntelGraphicsDriver but it seems that no one is updating it.

      • #4
        Originally posted by Zajec View Post
        Could you share the source of that information? AFAIK AMD wants to /finish/ r300g first, to learn what should be avoided when writing r600g. I was even talking with Bridgman about this yesterday.
        So, where did you get that info?
        It's still being worked on, as in it's not done yet either.
        Michael Larabel
        http://www.michaellarabel.com/

        • #5
          Michael, I think Zajec is asking where the r600 gallium work is happening (if it is happening). http://cgit.freedesktop.org/mesa/mes...allium/drivers

          • #6
            Originally posted by DanL View Post
            Michael, I think Zajec is asking where the r600 gallium work is happening (if it is happening). http://cgit.freedesktop.org/mesa/mes...allium/drivers
            I think he just made that up.

            If someone was working on it, Corbin would have changed the state from TODO to WIP.

            • #7
              I think he meant still being worked on as in they haven't forgotten about it and it'll get done eventually.

              • #8
                So what does this mean? Does this get us OpenGL >= 2.0 (and most importantly GLSL) now? Or is there still work that needs to be done for this, apart from debugging of course?
                I had the understanding that as soon as the Gallium3D driver was done, one should be able to just use the OpenGL 2.1 etc. state trackers on it. Is this really something we can expect to be working in the foreseeable (say three months) future? If so, then development is much faster than I anticipated.
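Not something from the thread, but for anyone wondering what their current driver actually exposes: the version strings reported by glxinfo (from mesa-utils) answer exactly this question. A minimal sketch, assuming glxinfo's usual "OpenGL version string:" output format:

```python
import re
import subprocess

def parse_gl_version(version_string):
    """Extract (major, minor) from a GL version string, e.g. "2.1 Mesa 7.6"."""
    match = re.match(r"(\d+)\.(\d+)", version_string.strip())
    if match is None:
        raise ValueError("unrecognized GL version string: %r" % version_string)
    return int(match.group(1)), int(match.group(2))

def supports_gl2(version_string):
    """True if the reported version is at least OpenGL 2.0."""
    return parse_gl_version(version_string) >= (2, 0)

def query_gl_version_string():
    """Read the "OpenGL version string" line from glxinfo.

    Requires a running X session and glxinfo installed (mesa-utils).
    """
    output = subprocess.check_output(["glxinfo"], text=True)
    for line in output.splitlines():
        if line.startswith("OpenGL version string:"):
            return line.split(":", 1)[1].strip()
    raise RuntimeError("no OpenGL version string found in glxinfo output")
```

With the classic r300 Mesa driver of this era you would likely see a "1.5 Mesa ..." string, so supports_gl2 would return False until GL 2.x support actually lands in the driver.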

                • #9
                  Originally posted by iznogood View Post
                  I read the matrix but i do not understand what parts of the mesa driver are going to be replaced by the gallium driver. For example is it going to replace only the mesa 3d features or other parts of the matrix as well like XXA - EXA acceleration and xv - xvmc video support ? I don't really understand how gallium is working on the existing graphics stack.
                  The Gallium3D driver does two things :

                  - replaces the existing Mesa HW driver API with a new API designed around programmable shaders rather than a fixed-function 3D pipeline
                  - provides a driver API which is sufficiently generic that it can be used for more than just 3D, e.g. EXA, Xv, video decode, etc.

                  The most important thing to understand is that Gallium3D is not a replacement for Mesa itself, only for the HW driver layer inside Mesa (ie the src/mesa/drivers sub-tree). The upper level Mesa code (which is what actually implements the GL API) calls Gallium3D drivers rather than classic Mesa HW drivers.

                  There is a project in the Mesa tree called the xorg state tracker. This is an X driver which calls into KMS for modesetting and uses Gallium3D for EXA and Xv acceleration. The xorg state tracker (aka st/xorg) is the "missing link" which plumbs the non-3D parts of the graphics stack into Gallium3D.

                  http://cgit.freedesktop.org/mesa/mes..._trackers/xorg

                  You would need to configure X to use the "xorg state tracker" driver rather than your existing X driver (radeon/radeonhd).
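As a rough illustration (the driver module name below is hypothetical; the actual name depends on how the st/xorg state tracker is built and packaged), the switch would happen in the Device section of xorg.conf, something like:

```
Section "Device"
    Identifier "Card0"
    # Hypothetical module name for the Gallium3D xorg state tracker;
    # replace with whatever driver name your Mesa build installs,
    # instead of the usual "radeon" or "radeonhd".
    Driver     "xorg-gallium"
EndSection
```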

                  Originally posted by iznogood View Post
                  Also i want to ask why there is no gallium on r100 chips?
                  The Gallium3D API is designed for shader-based GPUs starting with approximately DX9 functionality. The R300 was the first ATI GPU with DX9-level shaders. The R100 basically had no programmable shader hardware. The R200 had a fixed function pipeline *and* DX8-level shader hardware. Right now the plan is to support r300 and up with Gallium3D, then go back and look into whether a subset of Gallium3D could be useful on older parts.

                  Originally posted by Zhick View Post
                  So what does this mean? Does this get us OpenGL >= 2.0 (and most importantly GLSL) now?
                  GLSL is somewhat orthogonal to Gallium3D. The shader compiler is common between classic and Gallium3D drivers; nhaehnle has been looking into adding the necessary support for GLSL to the shader compiler but I think there's still a fair amount of work to be done. In theory the rest of the GL 2.x functionality should come along more or less for free with a Gallium3D driver but I expect every new GL feature will require some fixing.

                  The most important milestone, however, is getting 300g to the point where 3xx-5xx users can switch over to using the Gallium3D code as their primary 3D driver - IMO that's the point where you'll see all the developers pile onto Gallium3D and work there rather than on the classic mesa driver.

                  That is also the point where it seems to make sense to start merging the classic r600 driver and Gallium3D 300g driver to make a 600g driver. In the meantime, Richard is working on adding support for flow control instructions to the 6xx/7xx shader compiler as a first step to supporting GLSL on the 6xx and higher parts.
                  Last edited by bridgman; 10-10-2009, 03:39 PM.

                  • #10
                    Originally posted by bridgman View Post
                    In theory the rest of the GL 2.x functionality should come along more or less for free but I expect every new feature will require some testing and fixing.
                    However, there is a little problem. As far as I know, r3xx-r5xx does not fully support non-power-of-two textures (e.g. the repeat wrap mode and mipmapping), and full NPOT support is a prerequisite for GL 2.0. If r300g gets GL 2.x, this will imply falling back to the software rasterizer in some situations, as was done in the proprietary drivers, am I right? Or will developers stick to GL 1.5 + GLSL? I guess violating the GL spec in order to get GL 2.x is out of the question, but that would be a pleasant solution in practice, because people could use a newer GL API than what they have now.
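For context, the classic application-side workaround on hardware without full NPOT support (not something proposed in this thread) is to pad the image up to the next power-of-two texture size and scale the texture coordinates so only the used sub-region is sampled. A small sketch:

```python
def next_pow2(n):
    """Smallest power of two that is >= n (for n >= 1)."""
    if n < 1:
        raise ValueError("n must be >= 1")
    p = 1
    while p < n:
        p <<= 1
    return p

def padded_texture_size(width, height):
    """Return the power-of-two texture size that can hold a width x height
    image, plus the (s, t) texcoord scale factors for the used sub-region."""
    pw, ph = next_pow2(width), next_pow2(height)
    return (pw, ph), (width / pw, height / ph)
```

For example, a 640x480 video frame would live in the corner of a 1024x512 texture, sampled with texcoords scaled by (0.625, 0.9375). This sidesteps NPOT entirely, at the cost of wasted texture memory, and it does not help with the repeat wrap mode.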

                    • #11
                      Originally posted by Eosie View Post
                      However, there is a little problem. As far as I know, r3xx-r5xx does not fully support non-power-of-two textures (e.g. the repeat wrap mode and mipmapping) and it is the prerequisite for GL2.0. If r300g gets GL2.x, this will imply falling back to software rasterizer in some situations, as was done in the proprietary drivers, am I right? Or will developers stick to GL1.5 + GLSL? I guess violating the GL spec in order to get GL2.x is out of the question, but that would be a pleasant solution in practice because people could use a newer GL API than what they have now.
                      Isn't that the whole point of the "Softpipe pass-through" stuff that Michael said was marked done? Performance of apps that require those features would be very bad, but developers would just have to realize they can't be used on those cards, as I'm sure they already do. And the drivers would fully support GL2 - there's no requirement to do it at any certain speed.

                      • #12
                        Good news for ATI owners then.
                        I have no clue what Gallium3D is (better check Wikipedia), but I like improvements.

                        • #13
                          Bwahaha, it's not the first time I've regretted a little having only these modern R600-based chips, when I see all the goodness happening on R100-500 first. I guess I'll have to be patient a few more months.

                          • #14
                            Originally posted by smitty3268 View Post
                            Isn't that the whole point of the "Softpipe pass-through" stuff that Michael said was marked done? Performance of apps that require that would be very bad but developers would just have to realize that can't be used on those cards, as I'm sure they already do. And the drivers would fully support GL2 - there's no requirement for doing it at any certain speed.
                            History tells us not to believe Michael on these things. I highly doubt it's that easy to switch between the hardware and software rasterizers in real time, since the GPU and CPU would have to share the same data, which cannot be done without endlessly moving data around. I'd like someone informed to clear that up.

                            • #15
                              Originally posted by Eosie View Post
                              History tells us not to believe Michael on these things.
                              A good way to deal with Phoronix news is to click on every link that doesn't lead back to Phoronix and read them. Most of the time, the news items contain no more than a short summary of the linked article, and sometimes Michael rushes the news and gets it wrong.

                              Now, I'm not expecting him to be fault-free or an expert in everything he writes news about. In fact, I enjoy the aggregated news, if only to click on the links. I just wish there was a way to prevent the ensuing forum drama every time something's off.
