Gallium3D Gets New Geometry Shader Support

  • #31
    Yeah, Gallium does look like the bee's knees. Can't wait to see it start to take over.

    • #32
      Originally posted by BlackStar View Post
      You still need to install ICD drivers from the IHV's homepage (Windows Update won't install OpenGL ICDs).
      This is incorrect. I got the full OpenGL Catalyst driver from Windows Update on a mobile Radeon.


      Originally posted by drag View Post
      Hopefully the Linux graphics situation will improve with Gallium.
      No, it won't, and the original topic is a perfect example. It was decided by certain developers that all Gallium drivers must support geometry shaders. Of the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software. I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.

      /rant

      • #33
        Originally posted by Eosie View Post
        This is incorrect. I got the full OpenGL Catalyst driver from Windows Update on a mobile Radeon.
        This has never happened on any of my Nvidia, ATI and Intel systems. Additionally, I haven't been able to find any credible source that supports it. On the other hand, problems caused by missing ICDs are very common:

        - a recent example on opengl.org

        - a discussion on XBMC

        etc etc

        Originally posted by XBMC discussion
        Yeah, Microsoft does not distribute, and has never distributed, drivers with OpenGL ICDs.

        Originally posted by Eosie View Post
        No, it won't, and the original topic is a perfect example. It was decided by certain developers that all Gallium drivers must support geometry shaders. Of the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software. I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.

        /rant
        And why do you think this is a problem? Instead of no geometry shaders, older cards will get geometry shaders emulated in software. This is an improvement - it will allow older cards to run software that they otherwise couldn't.

        • #34
          Originally posted by BlackStar View Post
          older cards will get geometry shaders emulated in software
          Are you sure? I haven't said anything like that. There is a difference between "must" and "will". Thank god I didn't say drivers "must" support OpenGL 3.2. (and I do contribute code to Gallium, which is why I am concerned about it)

          Originally posted by BlackStar View Post
          And why do you think this is a problem?
          Already answered:
          I am curious who will be implementing it, given the incomplete state of most of the drivers and a lack of manpower.
          ~ Marek

          • #35
            Originally posted by Eosie View Post
            Are you sure? I haven't said anything like that. There is a difference between "must" and "will". Thank god I didn't say drivers "must" support OpenGL 3.2. (and I do contribute code to Gallium, which is why I am concerned about it)
            You most certainly did say "must" in your post:
            It was decided by certain developers that all Gallium drivers must support geometry shaders. Of the in-tree hardware drivers, 8 cannot support GS in hardware, so it must be done in software
            And I most certainly didn't say anything about OpenGL 3.2 in my reply.

            I simply cannot see how a software fallback for geometry shaders could be a bad thing. As far as I can tell, this code can be shared between all drivers and the effort, non-trivial as it might be, will certainly help the OpenGL stack move forward as a whole (more so than, say, implementing geometry shaders for R600+).

            Do you have a link for the developer discussion on this topic?
            Last edited by BlackStar; 12-29-2009, 07:21 AM. Reason: More context in the quote

            • #36
              Originally posted by BlackStar View Post
              Do you have a link for the developer discussion on this topic?
              http://old.nabble.com/geometry-shadi...p26920366.html

              ~ Marek

              • #37
                Originally posted by BlackStar View Post
                You most certainly did say "must" in your post:
                I simply cannot see how a software fallback for geometry shaders could be a bad thing. As far as I can tell, this code can be shared between all drivers and the effort, non-trivial as it might be, will certainly help the OpenGL stack move forward as a whole (more so than, say, implementing geometry shaders for R600+).
                If it gets implemented, it won't be a bad thing. I think Eosie was just concerned that nobody would care enough, leaving you with a broken driver.

                At any rate, can't every card which supports OpenCL also support any new kind of shader that Microsoft can come up with? I'm not completely sure, but isn't a modern graphics card just a ridiculously parallel pipelined processor without dedicated parts, making OpenGL and OpenCL just abstraction layers?

                For cards that don't support OpenCL, I don't think implementing geometry shaders will be all that useful. They won't be fast enough to run them at an acceptable framerate anyway. The same goes for any shader on cards that don't support GLSL: it would just kill performance. That may be why nobody cares about implementing it.

                • #38
                  Thanks!

                  Originally posted by Remco View Post
                  If it gets implemented, it won't be a bad thing. I think Eosie was just concerned that nobody would care enough, leaving you with a broken driver.
                  Well, I would hope that a driver that doesn't support geometry shaders at all wouldn't advertise EXT_geometry_shader4 or ARB_geometry_shader4, meaning that nothing gets broken (correctly written programs must check for driver support before trying to use an extension).
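
                  To make that concrete, here is a minimal sketch in C of such a check against the classic space-separated extension string. The has_extension helper is illustrative rather than from any real codebase, and core-profile GL 3.x contexts would enumerate extensions with glGetStringi instead:

                  ```c
                  #include <string.h>
                  #include <GL/gl.h>

                  /* Return 1 if `name` appears as a whole token in the
                   * space-separated GL_EXTENSIONS string, 0 otherwise. */
                  static int has_extension(const char *name)
                  {
                      const char *exts = (const char *) glGetString(GL_EXTENSIONS);
                      size_t len = strlen(name);

                      while (exts) {
                          const char *hit = strstr(exts, name);
                          if (!hit)
                              return 0;
                          /* Whole-token match only, so that "GL_EXT_foo"
                           * never matches inside "GL_EXT_foo2". */
                          if ((hit == exts || hit[-1] == ' ') &&
                              (hit[len] == ' ' || hit[len] == '\0'))
                              return 1;
                          exts = hit + len;
                      }
                      return 0;
                  }

                  /* After creating the GL context: */
                  void setup_render_paths(void)
                  {
                      if (has_extension("GL_ARB_geometry_shader4") ||
                          has_extension("GL_EXT_geometry_shader4")) {
                          /* safe to compile GL_GEOMETRY_SHADER_ARB stages */
                      } else {
                          /* stick to a render path without geometry shaders */
                      }
                  }
                  ```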

                  At any rate, can't every card which supports OpenCL also support any new kind of shader that Microsoft can come up with? I'm not completely sure, but isn't a modern graphics card just a ridiculously parallel pipelined processor without dedicated parts, making OpenGL and OpenCL just abstraction layers?
                  Not really. DX11 hardware requires a blend of new programmable and fixed-function functionality for its tessellation shaders that (as far as I can tell) cannot be emulated on older hardware. Additionally, there are new features that are simply impossible on DX10 and older cards: double-precision math, 16K textures, new compression formats (BC6H/BC7) and a few more.

                  For cards that don't support OpenCL, I don't think implementing geometry shaders will be all that useful. They won't be fast enough to run them at an acceptable framerate anyway. The same goes for any shader on cards that don't support GLSL: it would just kill performance. That may be why nobody cares about implementing it.
                  The good thing about software emulation is that (a) you have a reference implementation to compare results with and (b) it allows people without DX10+ hardware to test and contribute code for newer features (e.g. help with the GL3.2 tracker, even without the hardware to back it).

                  Note that many IGPs don't run vertex shaders in hardware, but they still manage to maintain acceptable performance for simple tasks. It's not too much of a stretch that geometry shaders might also perform adequately, given that the relevant hardware on the GPUs isn't terribly fast either.

                  In any case, something is better than nothing. For one, I'd rather have Unigine Tropics run at 1 fps than not run at all. One step at a time!

                  • #39
                    Well, even the latest graphics hardware has dedicated fixed-function parts; some of them are:
                    - rasterizer (comes before the pixel shader)
                    - blender and output merger (comes after the pixel shader)
                    - tessellator (between the hull and domain shaders)
                    - texture units

                    The first three are not accessible from OpenCL. Also, in my experience, hardware interfaces appear to be designed tightly around the major 3D and compute APIs: you can't schedule the shader cores directly, nor implement any other kind of shader the hardware wasn't designed for.

                    ~ Marek
                    Last edited by marek; 12-29-2009, 10:37 AM.

                    • #40
                      1. The whole reason they were talking about adding geometry shader support to all drivers was that the software support was already done. If the hardware doesn't support it, or no one has written the hardware support into a driver yet, it can automatically fall back to using a vertex shader + the shared routines in the draw module (see the sketch after this list).

                      2. AFAIK, the original decision to add support to all the drivers was reversed, because some of the other developers didn't want to advertise support for a feature that would be so slow on their cards since it would have to use a software fallback.
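
                      To illustrate point 1, here is a rough sketch of what that fallback wiring might look like inside a Gallium driver. The foo_* driver is hypothetical, and the draw-module calls are quoted from memory, so treat the exact signatures as assumptions rather than the definitive interface:

                      ```c
                      /* Sketch only: a driver with no GS hardware defers
                       * geometry shaders to Mesa's auxiliary draw module. */
                      #include "pipe/p_context.h"
                      #include "pipe/p_state.h"
                      #include "draw/draw_context.h"

                      struct foo_context {
                          struct pipe_context base;
                          struct draw_context *draw;   /* CPU vertex/geometry path */
                      };

                      static void *
                      foo_create_gs_state(struct pipe_context *pipe,
                                          const struct pipe_shader_state *templ)
                      {
                          struct foo_context *foo = (struct foo_context *) pipe;
                          /* Build a draw-module shader instead of
                           * programming the chip. */
                          return draw_create_geometry_shader(foo->draw, templ);
                      }

                      static void
                      foo_bind_gs_state(struct pipe_context *pipe, void *gs)
                      {
                          struct foo_context *foo = (struct foo_context *) pipe;
                          /* The draw module runs the GS on the CPU and hands
                           * the resulting primitives back to the rest of the
                           * pipeline. */
                          draw_bind_geometry_shader(foo->draw, gs);
                      }
                      ```

                      As far as I can tell, softpipe's state hooks follow roughly this shape, which is what makes the fallback shareable: any driver lacking the hardware can route the same entry points through the same auxiliary module.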
