OpenCL Support Atop Gallium3D Is Here, Sort Of


  • #31
    Originally posted by bridgman View Post
    I guess it's possible that r300 may end up as "1.5 plus GLSL" but there may be other driver enhancements required to support GLSL that I haven't realized yet.
    OpenGL 1.5 + GLSL + FBOs is more than enough for the majority of open-source games and 3D programs in general. Should R300 reach that state, it could tide us over until R300g is ready.

    Of course, it's always possible that these features are better suited to a Gallium implementation and adding them to R300 would be too much work. In that case, it would probably be enough to add VBOs to R300 and leave everything else to the Gallium driver.
    Last edited by BlackStar; 02 September 2009, 03:39 PM.


    • #32
      FBOs are already supported in R300. Piglit still reports some failures; I haven't had time to look into them yet.

      The problem, though, is that the GLSL ARB extensions use different function entry points than GLSL in OpenGL 2.0. A lot of applications I've looked at don't bother supporting the ARB extension versions - so if they don't see OpenGL 2.0, they just fall back to the ARB_vertex_program/ARB_fragment_program extensions or even the fixed-function pipeline, even if the GLSL extensions are present.

      That's why I wrote somewhere else that GLSL without OpenGL 2.0 doesn't make much sense, and that in turn is why I want to focus on r300g there.
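
      For reference, these are the two entry point sets in question - declarations only, to illustrate the difference (note the distinct names and the GLhandleARB vs. GLuint object types; assuming the usual GL headers):
      Code:
      /* ARB_shader_objects / ARB_shading_language_100 entry points: */
      GLhandleARB glCreateShaderObjectARB(GLenum shaderType);
      void glShaderSourceARB(GLhandleARB shaderObj, GLsizei count,
                             const GLcharARB **string, const GLint *length);
      void glCompileShaderARB(GLhandleARB shaderObj);

      /* OpenGL 2.0 core entry points: */
      GLuint glCreateShader(GLenum type);
      void glShaderSource(GLuint shader, GLsizei count,
                          const GLchar **string, const GLint *length);
      void glCompileShader(GLuint shader);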


      • #33
        Originally posted by nhaehnle View Post
        FBOs are already supported in R300. Piglit still reports some failures; I haven't had time to look into them yet.

        The problem, though, is that the GLSL ARB extensions use different function entry points than GLSL in OpenGL 2.0. A lot of applications I've looked at don't bother supporting the ARB extension versions - so if they don't see OpenGL 2.0, they just fall back to the ARB_vertex_program/ARB_fragment_program extensions or even the fixed-function pipeline, even if the GLSL extensions are present.

        That's why I wrote somewhere else that GLSL without OpenGL 2.0 doesn't make much sense, and that in turn is why I want to focus on r300g there.
        Thanks, nice to know about the FBOs.

        OpenGL 3.0+ actually exposes 3.0 functionality as extensions to 2.1 without the ARB suffix (for example, the ARB_framebuffer_object entry points - note, not the older EXT_framebuffer_object ones). You could do the same for GLSL and expose its 'core' 2.0 entry points without actually supporting the whole 2.0 spec *or* the ARB entry points. Obviously, you cannot claim 2.0 compliance in this case (GL_VERSION should remain 1.5), but this is quite common in OpenGL drivers (e.g. ATI exposing 3.2 features without supporting the whole 3.2 spec yet).

        There are some obvious compatibility issues with applications that check the ARB_shading_language_100 extension string but fail to check whether the entry points are actually available (the specs mandate that you check both independently), but this is arguably an application issue rather than a driver one.

        Obviously, it might still not make sense to support GLSL in Mesa this way, but it *is* possible without bending the specs.
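
        For what it's worth, the double check applications are supposed to do looks roughly like this (an untested sketch; a real check should also match complete extension tokens rather than using a naive substring search):
        Code:
        #include <string.h>
        #include <GL/glx.h>

        int have_glsl(void)
        {
            /* Check 1: the extension string. */
            const char *ext = (const char *)glGetString(GL_EXTENSIONS);
            if (!ext || !strstr(ext, "GL_ARB_shading_language_100"))
                return 0;

            /* Check 2: the entry point itself. glXGetProcAddress may
               return a non-NULL stub even for unsupported functions,
               which is exactly why check 1 is also required. */
            return glXGetProcAddress((const GLubyte *)"glCreateShaderObjectARB") != NULL;
        }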

        Edit: a couple of questions:

        Is it possible to support glGetIntegerv(GL_MAJOR_VERSION) and glGetIntegerv(GL_MINOR_VERSION) for OpenGL version queries? This is an OpenGL 3.0 feature, but it is trivial to support and sorely missing in older OpenGL versions (there simply isn't any way to parse the string from glGetString(GL_VERSION) with 100% confidence).

        Moreover, OpenGL 3.0 modifies glGetString to return the raw OpenGL version (e.g. '1.5') and nothing else. Mesa currently returns something moronic like '1.4 (Mesa 1.5)', which is AFAIK mandated by GLX (so it's not Mesa's fault, but that doesn't make it any less moronic). Any chance of updating glGetString in Mesa to follow the new specs?
        Last edited by BlackStar; 02 September 2009, 06:30 PM.


        • #34
          Originally posted by BlackStar View Post
          Is it possible to support glGetIntegerv(GL_MAJOR_VERSION) and glGetIntegerv(GL_MINOR_VERSION) for OpenGL version queries? This is an OpenGL 3.0 feature, but it is trivial to support and sorely missing in older OpenGL versions (there simply isn't any way to parse the string from glGetString(GL_VERSION) with 100% confidence).
          Well, of course it is possible to add that support, but then you're going to put application developers in a catch-22 situation. After all, they cannot use glGetIntegerv(GL_MAJOR_VERSION) before testing the OpenGL version to make sure it is supported - and once you've tested the OpenGL version, what's the point in testing it again?

          Moreover, OpenGL 3.0 modifies glGetString to return the raw OpenGL version (e.g. '1.5') and nothing else. Mesa currently returns something moronic like '1.4 (Mesa 1.5)', which is AFAIK mandated by GLX (so it's not Mesa's fault, but that doesn't make it any less moronic). Any chance of updating glGetString in Mesa to follow the new specs?
          I suppose this is going to happen eventually. In the meantime, atoi should do the Right Thing, so I really don't think it's such a big deal. And again, application developers have to implement those checks anyway in case they run into older versions, so we don't really gain anything.


          • #35
            I still don't really understand Gallium3D and state-trackers.

            If, let's say, there is a Gallium3D driver for every card out there and someone makes an OpenCL state tracker, do all those cards suddenly support OpenCL? Or does each driver need code for each state tracker in order to use it?

            And is Gallium3D an API for applications, so that a program only has to be written against a state tracker in order to work on every graphics card that has a Gallium3D driver?

            *head hurts*


            • #36
              Originally posted by nhaehnle View Post
              Well, of course it is possible to add that support, but then you're going to put application developers in a catch-22 situation. After all, they cannot use glGetIntegerv(GL_MAJOR_VERSION) before testing the OpenGL version to make sure it is supported - and once you've tested the OpenGL version, what's the point in testing it again?
              It's not that bad: calling glGetIntegerv(GL_MAJOR_VERSION) will generate a GL_INVALID_ENUM error if it's not supported, and you can automatically fall back to glGetString(GL_VERSION) (it will not blow up or anything). However, catch-22 describes the current situation pretty well: it's impossible to query the version correctly without knowing the version beforehand (nice work there, Khronos!).

              The point is that it's simpler to call glGetIntegerv and only fall back to glGetString if that fails, since the output of glGetString is not standardized across vendors.
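
              In code, the whole dance looks roughly like this (a sketch, assuming a current context; the GL_MAJOR_VERSION/GL_MINOR_VERSION enums are defined manually in case the headers predate 3.0):
              Code:
              #include <stdio.h>
              #include <GL/gl.h>

              #ifndef GL_MAJOR_VERSION
              #define GL_MAJOR_VERSION 0x821B
              #define GL_MINOR_VERSION 0x821C
              #endif

              void get_gl_version(GLint *major, GLint *minor)
              {
                  /* Clear any stale error flags so the check below is meaningful. */
                  while (glGetError() != GL_NO_ERROR)
                      ;

                  glGetIntegerv(GL_MAJOR_VERSION, major);
                  glGetIntegerv(GL_MINOR_VERSION, minor);

                  if (glGetError() == GL_INVALID_ENUM) {
                      /* Pre-3.0 driver: fall back to the version string, with
                         all the vendor-specific caveats discussed above. */
                      const char *s = (const char *)glGetString(GL_VERSION);
                      if (!s || sscanf(s, "%i.%i", major, minor) != 2)
                          *major = *minor = 0;
                  }
              }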

              I suppose this is going to happen eventually. In the meantime, atoi should do the Right Thing, so I really don't think it's such a big deal.
              That's the problem: atoi won't do the right thing with current Mesa! It will return the GLX version instead of the OpenGL version. Even Carmack fell for this back in the original Doom 3 release (IIRC), while popular libraries like GLee *still* don't get this right.

              Right now, it's simply impossible to retrieve the OpenGL version with 100% confidence on non-3.0 drivers, since the GL_VERSION string changes between vendors (e.g. I can guarantee that my library returns the correct version on ATI, Nvidia, Intel and Mesa/DRI drivers, but that leaves a huge amount of hardware untested - S3, SiS, Matrox, PowerVR/Imageon, etc.).

              Now that the ARB has revised the specs, Mesa can remove a great deal of ambiguity for existing applications with a trivial modification. Obviously, this won't fix everything, but simply being able to rely on the implementation to do the right thing is no small deal - and Mesa can guarantee that for the whole open source graphics stack.
              Last edited by BlackStar; 03 September 2009, 11:36 AM.


              • #37
                Originally posted by V!NCENT View Post
                If, let's say, there is a Gallium3D driver for every card out there and someone makes an OpenCL state-tracker, then do all cards suddenly support OpenCL? Or does each driver needs code for each state-tracker in order to use them?
                In principle, yes. In practice I expect some tweaking might be required for each driver, in case (for example) the new state tracker uses some Gallium3D API combinations that had not been exercised before.

                A state tracker is something that sits on top of the Gallium3D API and exposes a general-purpose API to applications. The first state tracker is Mesa itself, which exposes an OpenGL API and which can run over either the classic Mesa HW driver API or over the Gallium3D API.

                Originally posted by V!NCENT View Post
                And is Gallium3D an API for applications so that an application only has to be written for a state-tracker so that it works on every graphics card that has a Gallium3D driver?
                Applications would not normally run over Gallium3D directly. Think of Gallium3D as a low-level API which encapsulates hardware details to simplify the writing of higher-level drivers, i.e. an open and modern "hardware abstraction layer" API for GPUs. An application written for a specific state tracker API should work on every card that has a fully implemented Gallium3D driver.
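
                If it helps, here is a deliberately simplified (and hypothetical - these are not the real Gallium headers) picture of the layering:
                Code:
                /* One vtable per hardware driver; every state tracker is
                   written against this interface, not against the hardware. */
                struct gallium_context {
                    void (*bind_shader)(struct gallium_context *ctx, const void *tokens);
                    void (*set_constant_buffer)(struct gallium_context *ctx,
                                                const float *data, int count);
                    void (*draw)(struct gallium_context *ctx, int start, int count);
                };

                /* A GL state tracker translates glDrawArrays() into ctx->draw();
                   an OpenCL state tracker would translate clEnqueueNDRangeKernel()
                   into the equivalent calls. The hardware driver underneath is the
                   same in both cases, which is why a new state tracker should
                   (mostly) just work on every Gallium driver. */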
                Last edited by bridgman; 03 September 2009, 11:53 AM.

                • #38
                  Originally posted by BlackStar View Post
                  It's not that bad: calling glGetIntegerv(GL_MAJOR_VERSION) will generate a GL_INVALID_ENUM error if it's not supported, and you can automatically fall back to glGetString(GL_VERSION) (it will not blow up or anything).
                  Okay, that's true.

                  That's the problem: atoi won't do the right thing with current Mesa! It will return the GLX version instead of the OpenGL version. Even Carmack fell for this back in the original Doom 3 release (IIRC), while popular libraries like GLee *still* don't get this right.
                  Now I'm really curious. How does atoi not do the right thing? Maybe we have different interpretations of what the natural thing to do is. Here's what I just hacked up as a simple test program:
                  Code:
                  #include <stdio.h>
                  #include <stdlib.h>
                  #include <string.h>
                  
                  static const char * version = "1.5 Mesa 7.6-devel";
                  
                  int main()
                  {
                    int major = atoi(version);            /* reads the leading "1" */
                    int minor = 0;
                    const char * next = strchr(version, '.');
                    if (next) {
                      minor = atoi(next + 1);             /* reads the "5" after the dot */
                    }
                    printf("%i.%i\n", major, minor);
                    return 0;
                  }
                  That definitely parses the string correctly.

                  Comment


                  • #39
                    Originally posted by nhaehnle View Post
                    Now I'm really curious. How does atoi not do the right thing? Maybe we have different interpretations of what the natural thing to do is. Here's what I just hacked up as a simple test program:
                    [...]
                    That definitely parses the string correctly.
                    This code will work correctly iff the driver follows the revised OpenGL 3.0 specs for glGetString and returns the version directly, i.e. "2.1".

                    Right now, Mesa returns a string in the form "1.4 (Mesa 2.1)". Your code will parse this as (major, minor) = (1, 4), when the actual OpenGL version is 2.1 (1.4 is the GLX server version, IIRC).

                    GLee falls into the same trap:
                    Code:
                    const char * version_string = (const char *)glGetString(GL_VERSION);
                    int major = version_string[0] - '0';  /* assumes the GL version comes first */
                    int minor = version_string[2] - '0';  /* and that both parts are single digits */
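
                    A parser that copes with both the plain form and the current Mesa/GLX form would have to look something like this (a sketch, based on the "1.4 (Mesa 2.1)" form quoted above):
                    Code:
                    #include <stdio.h>
                    #include <string.h>

                    static void parse_gl_version(const char *s, int *major, int *minor)
                    {
                        /* If a parenthesized part is present, the real GL version
                           hides inside it; otherwise the string starts with it. */
                        const char *paren = strchr(s, '(');
                        if (paren)
                            s = paren + 1;
                        while (*s && (*s < '0' || *s > '9'))
                            s++;                        /* skip "Mesa " and friends */
                        if (sscanf(s, "%d.%d", major, minor) != 2)
                            *major = *minor = 0;        /* give up: unknown format */
                    }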


                    • #40
                      Originally posted by bridgman View Post
                      In practice I expect there might be some tweaking required for each driver, in case (for example) the new state tracker used some Gallium3D API combinations which had not been exercised before.
                      That may sometimes be the case (for minimalistic drivers), plus there's always a little glue code needed (in what's called 'the winsys' for historical reasons). However, adding support for a Gallium state tracker generally requires only about as much glue code as adding support for an OpenGL extension does in a traditional Mesa driver.

                      The first state tracker is Mesa itself, which exposes an OpenGL API and which can run over either the classic Mesa HW driver API or over the Gallium3D API.
                      Strictly speaking, the OpenGL state tracker provides the traditional Mesa driver interface and translates that to the Gallium driver interface. For OpenGL 3.x it might make more sense to translate from OpenGL to Gallium directly, though.

                      Applications would not normally run over Gallium3D directly. Think of Gallium3D as a low level API which encapsulates HW details to simplify the writing of higher level drivers, ie an open and modern "hardware abstraction layer" API for GPUs. An application written for a specific state-tracker API should work on every card that has a fully implemented Gallium3D driver.
                      Right, as far as apps are concerned, Gallium is an implementation detail they shouldn't need to care about.
