A New Radeon Shader Compiler For Mesa


  • A New Radeon Shader Compiler For Mesa

    Phoronix: A New Radeon Shader Compiler For Mesa

    While Gallium3D is gaining a lot of momentum and has picked up a number of new state trackers (OpenVG, OpenGL ES, and OpenCL, with OpenGL 3.1 coming soon) and features (e.g. network debugging support) in recent months, there is still a lot of work left before this architecture will enter the limelight...

    http://www.phoronix.com/vr.php?view=NzQxMA

  • #2
    Just wondering approximately what the performance penalty will be from having to go through Gallium3D?

    For Linux, I'd say easier development comes before speed.

    Let's say a year from now: how fast/complicated will it be to add support for Gallium3D to a new GPU?

    Comment


    • #3
      I would like to know where a shading compiler stands within the phases of development of a game. Does it have something to do with gaming at all or is it more generic? Can a shading compiler be used to optimize 3D rendering for a particular game or is it intended to optimize 3D rendering by the video driver only?

      Comment


      • #4
        Originally posted by Louise View Post
        Just wondering approximately what the performance penalty will be from having to go through Gallium3D?

        For Linux, I'd say easier development comes before speed.

        Let's say a year from now: how fast/complicated will it be to add support for Gallium3D to a new GPU?
        Gallium3D should be faster than Mesa, so it's a step forward no matter how you look at it.

        No idea about the difficulty of adding new Gallium drivers. As a potential metric, try comparing the source size of the mesa-rewrite and R300 Gallium drivers.

        I would like to know where a shading compiler stands within the phases of development of a game. Does it have something to do with gaming at all or is it more generic? Can a shading compiler be used to optimize 3D rendering for a particular game or is it intended to optimize 3D rendering by the video driver only?
        If I understand this correctly, the shader compiler is internal to the driver. Mesa and Gallium both compile shaders into some form of intermediate language, which is then compiled by the driver backend into native GPU binaries.

        Each Mesa driver seems to use a different intermediate language. Gallium uses a single language for all drivers. This announcement means that R300 Mesa and R300 Gallium will use the same intermediate language to simplify the lives of developers.
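        To make the intermediate-language idea concrete, here is a toy sketch in Python (the mini-language, opcode names, and backends are entirely invented, nothing like real Mesa code): a shared front-end lowers a tiny shader-like language into a generic IR, and two hypothetical per-GPU backends translate that same IR into their own "native" instructions.

```python
# Toy sketch of a driver-internal shader compiler: one front-end
# produces a generic IR, and multiple backends consume it.

def compile_to_ir(source):
    """Front-end: turn 'dst = a OP b' lines into generic IR tuples."""
    ops = {"+": "ADD", "*": "MUL"}
    ir = []
    for line in source.strip().splitlines():
        dst, expr = [part.strip() for part in line.split("=")]
        a, op, b = expr.split()
        ir.append((ops[op], dst, a, b))
    return ir

def backend_r300(ir):
    """Hypothetical R300-style backend: IR -> native mnemonics."""
    return [f"R300_{op} {dst}, {a}, {b}" for op, dst, a, b in ir]

def backend_other(ir):
    """A second backend reusing the same IR, as Gallium drivers would."""
    return [f"VEC_{op} {dst} {a} {b}" for op, dst, a, b in ir]

shader = """
t0 = color * light
out = t0 + ambient
"""
ir = compile_to_ir(shader)
print(backend_r300(ir))   # R300-flavoured output
print(backend_other(ir))  # same IR, different target
```

        The point of the announcement, in this picture, is that the R300 Mesa and R300 Gallium paths would share the front-end and IR instead of each maintaining their own.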

        I rather doubt the OSS drivers will ship any game-specific optimizations (unless some driver developer happens to be a WoW junkie, that is).

        Please correct me if I am wrong!

        Comment


        • #5
          Originally posted by BlackStar View Post
          Gallium3D should be faster than Mesa, so it's a step forward no matter how you look at it.
          Does this mean that Gallium can completely deprecate Mesa?

          Comment


          • #6
            Thanks for the info, BlackStar.

            Comment


            • #7
              Originally posted by VinzC View Post
              I would like to know where a shading compiler stands within the phases of development of a game. Does it have something to do with gaming at all or is it more generic? Can a shading compiler be used to optimize 3D rendering for a particular game or is it intended to optimize 3D rendering by the video driver only?
              Each game that uses shaders ships their source code, most commonly in textual form. That means the shaders need to be compiled by the driver and optimized specifically for your graphics hardware every time the game starts. This is quite common in the PC game industry. The aforementioned shader compiler does exactly that.

              Comment


              • #8
                Originally posted by Louise View Post
                Does this mean that Gallium can completely deprecate Mesa?
                Gallium *is* Mesa (or part of Mesa, to be exact). As far as I know, the old Mesa OpenGL stack will be replaced by Gallium once the Gallium drivers are ready. The old stack will probably stay around for legacy purposes, but new drivers will likely target Gallium from the get-go.

                Originally posted by Eosie View Post
                Each game that uses shaders ships their source code, most commonly in textual form. That means the shaders need to be compiled by the driver and optimized specifically for your graphics hardware every time the game starts. This is quite common in the PC game industry. The aforementioned shader compiler does exactly that.
                That's not exactly true. Most games ship precompiled shaders, simply because shader compilation takes a *lot* of time.

                On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
                Last edited by BlackStar; 07-25-2009, 12:05 PM.

                Comment


                • #9
                  Yep. In the case of Mesa, Gallium3D acts as a new internal API for hardware drivers. The "classic" HW driver API was fine when it was designed, but became complicated over the years as GPUs evolved and the API was extended to support both old fixed-function chips and newer shader-based chips. By replacing the older HW driver API with Gallium3D, newer drivers can be written against a simpler and cleaner API.

                  For "classic" Mesa drivers, the API used a common IR for shader programs which looked very similar to the instructions used in the ARB_vertex_program / ARB_fragment_program extensions. If you scroll down to line 142 in the following link you can see the list of instructions passed to a Mesa driver under the pre-Gallium HW driver model, showing which instructions are needed for the older ARB and NV extensions, and which are needed for GLSL. In case you're wondering, yes, the instructions needed for GLSL are harder to support than the ones which have already been implemented.

                  http://cgit.freedesktop.org/mesa/mes..._instruction.h

                  Under a Gallium driver, shader programs are passed to the HW driver as TGSI instructions rather than the previous IR.

                  EDIT - just read Nicolai's post - looks like the plan for now is to convert TGSI instructions into the prog_instruction set in the link above, allowing the existing shader compiler code to start supporting TGSI immediately. Nice.
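                  That TGSI-to-prog_instruction conversion could be sketched roughly like this (Python, with made-up opcode names and record layouts; the real token streams and structs are far richer than a simple lookup table):

```python
# Toy sketch: translate a Gallium TGSI-style instruction stream into
# classic prog_instruction-style records, so an existing backend
# compiler can consume shaders arriving from either path.

TGSI_TO_PROG = {
    "TGSI_OPCODE_MOV": "OPCODE_MOV",
    "TGSI_OPCODE_MUL": "OPCODE_MUL",
    "TGSI_OPCODE_MAD": "OPCODE_MAD",
}

def translate(tgsi_insns):
    """Map each (opcode, dst, *srcs) tuple to a prog_instruction-like dict."""
    prog = []
    for opcode, dst, *srcs in tgsi_insns:
        if opcode not in TGSI_TO_PROG:
            raise NotImplementedError(opcode)  # opcode not yet supported
        prog.append({"Opcode": TGSI_TO_PROG[opcode],
                     "DstReg": dst, "SrcReg": list(srcs)})
    return prog

shader = [("TGSI_OPCODE_MUL", "TEMP[0]", "IN[0]", "CONST[0]"),
          ("TGSI_OPCODE_MOV", "OUT[0]", "TEMP[0]")]
print(translate(shader))
```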

                  So... bottom line is that the current 3D stack is Mesa (about a million lines of code) running over per-GPU HW drivers via the "classic" HW driver API, which in turn run over libdrm and drm. The "new" 3D stack is Mesa running over per-GPU HW drivers via the Gallium3D HW driver API, again running over libdrm and drm.

                  The neat thing about Gallium3D is that the API spec is less 3D-specific so it's easier to use the same drivers for other cool things like video and general purpose compute operations.
                  Last edited by bridgman; 07-25-2009, 12:29 PM.

                  Comment


                  • #10
                    @BlackStar and bridgman: Sounds almost too good to be true.

                    Comment


                    • #11
                      Originally posted by BlackStar View Post
                      That's not exactly true. Most games ship precompiled shaders, simply because shader compilation takes a *lot* of time.

                      On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
                      I know that; I should have been clearer. (I'm not sure about "most games": I personally checked BioShock and its shaders ship in HLSL, easily readable.) However, the DX driver must still do the chip-specific optimizations at runtime, because the DX binary shader is in an intermediate representation. On the other hand, OpenGL ES has binary shaders through GL_OES_get_program_binary, and the IR may be chip-specific.

                      Comment


                      • #12
                        I think it depends on the platform. Games (all apps, in fact) shipped for PCs tend to ship shaders in source form, albeit usually "stripped" of comments, since the app needs to run on a wide variety of graphics hardware, each with different shader hardware assembly instructions. Users may also upgrade their graphics hardware after installing the app and expect their programs to take full advantage of the new hardware.

                        Applications which use source-level shaders usually make the API calls to compile the shaders during application startup, so once the app is running there is no overhead from the shader compilation step.

                        Embedded apps, which are expected to run only on a single specific hardware configuration, often use precompiled shaders, which allows a smaller driver stack. There are also API options for OpenGL ES allowing an app to do a "one-time" shader compile at installation or the first time an application is run, and then save the HW-specific binary for future invocations of the app.
                        Last edited by bridgman; 07-25-2009, 02:34 PM.

                        Comment


                        • #13
                          Originally posted by BlackStar View Post
                          On the other hand, OpenGL does not support precompiled shaders, forcing OpenGL programs to ship with shaders in source form. Most OpenGL developers have been asking for precompiled shaders for *years* (think 2003), but it seems that IHVs haven't been able to decide on a common format.
                          In my opinion, not having a common binary format is an advantage. It means that
                          a) IHVs can change the opcodes of their hardware to squeeze the most out of it, and
                          b) as driver developers, we can use whatever representation we want internally.

                          The only advantage of precompiled shaders is loading time. This can easily be achieved via a caching mechanism, so that's what ISVs should be asking for, if anything.
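                          The caching idea can be sketched like this in Python (the class, key layout, and "binary" strings are invented for illustration): key the compiled binary on a hash of the shader source plus driver and GPU identifiers, so the expensive compile runs once and later loads are near-instant.

```python
# Toy sketch of a driver-side shader cache keyed on source + driver + GPU.
import hashlib

class ShaderCache:
    def __init__(self):
        self._store = {}   # cache key -> cached "binary"
        self.compiles = 0  # count real compiles, for demonstration

    def _key(self, source, driver_version, gpu_id):
        # Including driver version and GPU id invalidates the cache
        # automatically on upgrades or hardware changes.
        blob = f"{driver_version}\0{gpu_id}\0{source}".encode()
        return hashlib.sha1(blob).hexdigest()

    def get(self, source, driver_version, gpu_id):
        key = self._key(source, driver_version, gpu_id)
        if key not in self._store:
            self.compiles += 1                    # slow path: real compile
            self._store[key] = f"binary({key[:8]})"
        return self._store[key]

cache = ShaderCache()
src = "void main() { gl_FragColor = vec4(1.0); }"
first = cache.get(src, "mesa-7.6", "R500")
second = cache.get(src, "mesa-7.6", "R500")   # served from cache
assert first == second and cache.compiles == 1
cache.get(src, "mesa-7.7", "R500")            # driver upgrade: recompile
assert cache.compiles == 2
```

                          Note that this keeps the freedom Nicolai describes: the cached blob can be in whatever chip-specific representation the driver wants, since it never leaves the machine.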

                          Comment


                          • #14
                            A question that probably has nothing to do with the shading language compiler, but what are the reasons graphics is slower on R400/R500 cards with the open source radeon driver than with fglrx? I suppose the proprietary driver still hasn't delivered all of its secrets, but do we have an idea what these are?

                            As an example, I got ~1000 FPS with the radeon driver and ~5000 FPS with fglrx on a Dell Inspiron 9400. (Don't get me wrong, 1000 FPS is very good indeed.) Does the shading language support bring a substantial performance boost compared to previous driver implementations?

                            Comment


                            • #15
                              Originally posted by VinzC View Post
                              A question that probably has nothing to do with the shading language compiler, but what are the reasons graphics is slower on R400/R500 cards with the open source radeon driver than with fglrx? I suppose the proprietary driver still hasn't delivered all of its secrets, but do we have an idea what these are?

                              As an example, I got ~1000 FPS with the radeon driver and ~5000 FPS with fglrx on a Dell Inspiron 9400. (Don't get me wrong, 1000 FPS is very good indeed.) Does the shading language support bring a substantial performance boost compared to previous driver implementations?
                              There have been a lot of improvements in Mesa git master already compared to the last Mesa release. However, there are lots of optimizations that just haven't been implemented yet in the open driver. The information is there, but no one's had the time to do it yet.

                              Some examples:
                              - occlusion query (OQ) support (in progress)
                              - VBO support (in progress)
                              - shader compiler improvements (in progress)
                              - texture tiling
                              - HyperZ support

                              Comment
