OpenCL Support Atop Gallium3D Is Here, Sort Of

  • BlackStar
    replied
    Originally posted by nhaehnle View Post
    Well, of course it is possible to add that support, but then you're going to put application developers in a catch-22 situation. After all, they cannot use glGetIntegerv(GL_MAJOR_VERSION) before testing the OpenGL version to make sure it is supported - and once you've tested the OpenGL version, what's the point in testing it again?
    It's not that bad: glGetIntegerv(GL_MAJOR_VERSION) raises a GL_INVALID_ENUM error if it's not supported, so you can automatically fall back to glGetString(GL_VERSION) - nothing blows up. However, catch-22 describes the current situation pretty well: it's impossible to query the version correctly without knowing the version beforehand (nice work there, Khronos!)

    The point is that it's simpler to call glGetIntegerv and only fall back to glGetString if the former is not supported, since the output of glGetString is not standardized across vendors.

    I suppose this is going to happen eventually. In the meantime, atoi should do the Right Thing anyway, so I really don't think it's such a big deal.
    That's the problem: atoi won't do the right thing with current Mesa! It will return the GLX version instead of the OpenGL version. Even Carmack fell for this back in the original Doom 3 release (IIRC), and popular libraries like GLee *still* don't get this right.

    Right now, it's simply impossible to retrieve the OpenGL version with 100% confidence on pre-3.0 drivers, since the GL_VERSION string varies between vendors. I can guarantee that my library returns the correct version on ATI, Nvidia, Intel and Mesa/DRI drivers, but that still leaves a huge amount of hardware untested (S3, SiS, Matrox, PowerVR/Imageon, etc.).

    Now that the ARB has revised the specs, Mesa can remove a great deal of ambiguity for existing applications with a trivial modification. Obviously, this won't fix everything, but simply being able to rely on the implementation to do the right thing is no small deal - and Mesa can guarantee that for the whole open source graphics stack.
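    To make the ambiguity concrete, here is a minimal C sketch (parse_gl_version is a hypothetical helper, not part of Mesa or the GL API) showing how a naive numeric parse misreads the GLX-prefixed string:

    ```c
    #include <stdio.h>

    /* Hypothetical helper (not a Mesa API): parse a GL_VERSION-style string.
     * A naive atoi()/sscanf() on the "1.4 (Mesa 1.5)" string that indirect
     * GLX clients see yields 1.4 - the GLX version - rather than the actual
     * OpenGL version, which is exactly the Doom 3 / GLee class of bug. */
    static void parse_gl_version(const char *version, int *major, int *minor)
    {
        *major = *minor = 0;
        sscanf(version, "%d.%d", major, minor);
    }
    ```

    A robust query would first try glGetIntegerv(GL_MAJOR_VERSION) and glGetIntegerv(GL_MINOR_VERSION), check glGetError() for GL_INVALID_ENUM, and only then fall back to string parsing like the above.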
    Last edited by BlackStar; 03 September 2009, 11:36 AM.



  • V!NCENT
    replied
    I still don't really understand Gallium3D and state-trackers.

    If, let's say, there were a Gallium3D driver for every card out there and someone wrote an OpenCL state tracker, would all cards suddenly support OpenCL? Or does each driver need code for each state tracker in order to use it?

    And is Gallium3D an API for applications, so that an application only has to be written against a state tracker to work on every graphics card that has a Gallium3D driver?

    *head hurts*



  • nhaehnle
    replied
    Originally posted by BlackStar View Post
    Is it possible to support glGetIntegerv(GL_MAJOR_VERSION) and glGetIntegerv(GL_MINOR_VERSION) for OpenGL version queries? This is an OpenGL 3.0 feature, but it is trivial to support and sorely missing in older OpenGL versions (there simply isn't any way to parse the string from glGetString(GL_VERSION) with 100% confidence).
    Well, of course it is possible to add that support, but then you're going to put application developers in a catch-22 situation. After all, they cannot use glGetIntegerv(GL_MAJOR_VERSION) before testing the OpenGL version to make sure it is supported - and once you've tested the OpenGL version, what's the point in testing it again?

    Moreover, OpenGL 3.0 requires glGetString to return the raw OpenGL version (e.g. '1.5') and nothing else. Mesa currently returns something moronic like '1.4 (Mesa 1.5)', which is AFAIK mandated by GLX (so it's not Mesa's fault, but that doesn't make it any less moronic). Any chance of updating glGetString in Mesa to follow the new specs?
    I suppose this is going to happen eventually. In the meantime, atoi should do the Right Thing anyway, so I really don't think it's such a big deal. And application developers have to implement those checks anyway in case they run into older versions, so we don't really gain anything.



  • BlackStar
    replied
    Originally posted by nhaehnle View Post
    FBOs are already supported in R300. Piglit still reports some failures, though; I haven't had the time to look into them.

    The problem, though, is that the GLSL ARB extensions use different function entry points than GLSL in OpenGL 2.0. And a lot of applications I've looked at do not bother supporting the ARB extension versions - if they don't see GLSL, they just fall back to the ARB_vertex_program/ARB_fragment_program extensions or even the fixed-function pipeline, even if the GLSL extensions are present.

    That's why I wrote somewhere else that GLSL without OpenGL 2.0 doesn't make too much sense, and that in turn is why I want to focus on r300g there.
    Thanks, nice to know about the FBOs.

    OpenGL 3.0+ actually exposes 3.0 functionality as extensions to 2.1 without the ARB suffix (for example the ARB_framebuffer_object entry points - note, not the older EXT_framebuffer_object). You could do the same for GLSL and expose its 'core' 2.0 entry points without supporting the whole 2.0 spec *or* the ARB entry points. Obviously, you cannot claim 2.0 compliance in this case (GL_VERSION should remain 1.5), but this is quite common in OpenGL drivers (e.g. ATI exposing 3.2 features without yet supporting the whole 3.2 spec).

    There are some obvious compatibility issues with applications that check for the ARB_shading_language_100 extension string but fail to check whether the entry points are actually available (the specs mandate that you check for both independently), but this is arguably an application issue rather than a driver one.
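    As an aside, even the extension-string half of that check is easy to get wrong: a plain strstr() will match one extension name inside another. A minimal C sketch (has_extension is a hypothetical helper, not a driver API):

    ```c
    #include <string.h>

    /* Hypothetical helper: look for an extension name in the space-separated
     * string returned by glGetString(GL_EXTENSIONS). A bare strstr() is not
     * enough: "GL_EXT_texture" would also match inside "GL_EXT_texture3D",
     * so each hit must be bounded by spaces or string ends. */
    static int has_extension(const char *extensions, const char *name)
    {
        size_t len = strlen(name);
        const char *p = extensions;

        while ((p = strstr(p, name)) != NULL) {
            int starts = (p == extensions) || (p[-1] == ' ');
            int ends = (p[len] == '\0') || (p[len] == ' ');
            if (starts && ends)
                return 1;
            p += len;
        }
        return 0;
    }
    ```

    The entry-point half would then be verified separately (e.g. via glXGetProcAddress), precisely because the specs treat the two checks as independent.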

    Obviously, it might still not make sense to support GLSL in Mesa, but it *is* possible without bending the specs.

    Edit: a couple of questions:

    Is it possible to support glGetIntegerv(GL_MAJOR_VERSION) and glGetIntegerv(GL_MINOR_VERSION) for OpenGL version queries? This is an OpenGL 3.0 feature, but it is trivial to support and sorely missing in older OpenGL versions (there simply isn't any way to parse the string from glGetString(GL_VERSION) with 100% confidence).

    Moreover, OpenGL 3.0 requires glGetString to return the raw OpenGL version (e.g. '1.5') and nothing else. Mesa currently returns something moronic like '1.4 (Mesa 1.5)', which is AFAIK mandated by GLX (so it's not Mesa's fault, but that doesn't make it any less moronic). Any chance of updating glGetString in Mesa to follow the new specs?
    Last edited by BlackStar; 02 September 2009, 06:30 PM.



  • nhaehnle
    replied
    FBOs are already supported in R300. Piglit still reports some failures, though; I haven't had the time to look into them.

    The problem, though, is that the GLSL ARB extensions use different function entry points than GLSL in OpenGL 2.0. And a lot of applications I've looked at do not bother supporting the ARB extension versions - if they don't see GLSL, they just fall back to the ARB_vertex_program/ARB_fragment_program extensions or even the fixed-function pipeline, even if the GLSL extensions are present.

    That's why I wrote somewhere else that GLSL without OpenGL 2.0 doesn't make too much sense, and that in turn is why I want to focus on r300g there.



  • BlackStar
    replied
    Originally posted by bridgman View Post
    I guess it's possible that r300 may end up as "1.5 plus GLSL" but there may be other driver enhancements required to support GLSL that I haven't realized yet.
    OpenGL 1.5 + GLSL + FBOs is more than enough for the majority of open source games and 3d programs in general. Should R300 reach that state, this can tide us over until R300g is ready.

    Of course, it's always possible that these features are better suited to a Gallium implementation and adding them to R300 would be too much work. In that case, it would probably be enough to add VBOs to R300 and leave everything else to the Gallium driver.
    Last edited by BlackStar; 02 September 2009, 03:39 PM.



  • bridgman
    replied
    Yep. Once r300g gets a bit further it will (hopefully) become a lot easier to decide where each new feature should be implemented, and at that point it seems likely that anything more than low-hanging fruit will go into r300g. Right now there's a strong temptation to add functionality to r300 "because it is there".

    If we can continue to share the shader compiler between r300 and r300g (thank you!!), I guess it's possible that r300 may end up as "1.5 plus GLSL" but there may be other driver enhancements required to support GLSL that I haven't realized yet. I have no idea how GLSL arrays are implemented, for example.
    Last edited by bridgman; 02 September 2009, 03:10 PM.



  • nhaehnle
    replied
    I have to say that the plans outlined by bridgman are all very reasonable, and I agree they're the Right Thing going forward.

    We've had a short discussion about r300 vs. r300g on IRC without any real conclusion. Obviously, the long term focus will be on r300g, there's no doubt about it; the question is whether classic Mesa will get OpenGL 2.0 support.

    Personally, I increasingly lean towards not adding OpenGL 2.0 to the classic driver. Better to stabilize it at the OpenGL 1.5 level for the conservative folk, and then focus on getting Gallium up to OpenGL 2.1.



  • bridgman
    replied
    I expect that the first step for Evergreen 3D will be adding support to the classic Mesa driver - that will get functionality into users hands much more quickly.

    Once support has been added to the classic Mesa driver, anyone in the community can write "r800g".



  • Xavier
    replied
    Originally posted by bridgman View Post
    Depends what you mean by "core work". I agree that our time is generally best spent adding support for new GPUs, since that is the one area where access to inside information can speed up understanding and troubleshooting even if the inside information can't be released.

    That said, there are also a few places where pitching in a bit of effort might be able to "push something over the hump" and make it possible for community developers to make progress on other parts of the stack. That's how I see the 3xx-5xx Gallium3D driver, and why I think it's worthy of some time.
    Exactly. r300g is both infrastructure work (improving Gallium3D itself) and porting work, so it's best done by the community - but it can only be done on this kind of hardware: many years old (r300 is 7 years old) and well understood by people outside AMD.

    But an r800g driver could only be written by AMD (otherwise it'd be ready in 2016), which is why I hope you'll manage to get it out on time.

