OpenGL ES 2.0 Support For Compiz, KWin, Cairo

  • #11
    Originally posted by bridgman
    My guess is that default builds only expose GL, not GL ES.
    Yes, ./configure in Mesa defaults to GLES disabled.

    We need a FAQ for EGL/GLES on the Open Source Graphics Stack!

    Here are my questions:

    Q: Can Mesa build both desktop OpenGL and GL ES at the same time?
    A: Yes, but it's not the default (you have to pass a ./configure option).

    Q: Does the Xorg server support EGL?
    A: I don't know.

    Q: As far as Mesa's implementation of GLES, does GLES only support EGL for the platform graphics interface (and not GLX)?
    A: I don't know.

    Q: Does Xorg support EGL being used simultaneously with GLX in different applications on the same X server?
    A: I don't know.

    Q: Does Xorg support GLES being used simultaneously with GL in different applications on the same X server?
    A: Yes, I think so. Firefox's WebGL uses OpenGL ES 2.0 while a compositing manager uses OpenGL 2.0, and they work together. QED. That could also answer the question about EGL and GLX being used together, if EGL is the only supported interface for GLES and is in fact what WebGL uses.

    Q: Same "Does Xorg..." questions but for Wayland.
    A: I don't know.

    Q: Can you run hardware-accelerated GLX + OpenGL applications in a child X server on Wayland?
    A: I don't know.

    Q: How do I know if an application uses GLES or GL?
    A: Assuming it doesn't dynamically load the libraries at runtime, you can check with ldd. If it links against libGLESv2.so (or v1), it uses GLES. If it links against libGL.so, it uses desktop OpenGL. You can also tell at compile-time by which headers it includes from /usr/include.
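
    For the record, here's roughly what those two cases look like at the source level (just an illustration; the header locations are the conventional Khronos layout):

        /* Desktop OpenGL client: includes the GL/GLX headers and links
         * against libGL.so (-lGL). */
        #include <GL/gl.h>
        #include <GL/glx.h>

        /* OpenGL ES 2.0 client (a separate source file, not the same one):
         * includes the GLES2/EGL headers and links against libGLESv2.so
         * and libEGL.so (-lGLESv2 -lEGL). */
        #include <GLES2/gl2.h>
        #include <EGL/egl.h>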


    I think we need a long-term support path for hardware-accelerated desktop OpenGL + GLX on both X.Org and Wayland, mainly because a lot of applications out there use it: some are closed source and unmaintained, some maintainers will refuse to move to EGL + GLES, and others are open source but the developers simply lack the time to port them.

    That said, it's entirely possible that EGL + GLES will turn out to be the better solution for free software going forward, both because the EGL API is better designed (GLX is very, very old) and because GLES 2.0 appears not to contain any patented features. Where we can use EGL + GLES, we should do so -- but not at the expense of removing support for GLX or desktop GL.

    Right now I believe that we can use EGL+GLES and GLX+GL in tandem on the same X server, so we're looking good. Hopefully I'm not wrong. And hopefully that doesn't change in the future.
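
    For anyone who wants to see what the EGL + GLES side of that looks like, here's a minimal sketch of context creation over X (untested, error handling omitted; the GLX + desktop GL clients run unchanged next to it):

        #include <X11/Xlib.h>
        #include <EGL/egl.h>
        #include <GLES2/gl2.h>

        int main(void)
        {
            /* EGL sits on top of the native window system; here that's X. */
            Display *x_dpy = XOpenDisplay(NULL);
            EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType) x_dpy);
            eglInitialize(dpy, NULL, NULL);

            /* Ask for a config that can back a GLES 2.0 window surface. */
            static const EGLint cfg_attribs[] = {
                EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
                EGL_NONE
            };
            EGLConfig cfg;
            EGLint num_cfg;
            eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &num_cfg);

            /* GLES 2.0 context; no GLX anywhere in this path. */
            static const EGLint ctx_attribs[] = {
                EGL_CONTEXT_CLIENT_VERSION, 2,
                EGL_NONE
            };
            EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);

            /* ...create an X window, wrap it with eglCreateWindowSurface(),
             * eglMakeCurrent(), then issue GLES 2.0 calls... */
            (void) ctx;
            return 0;
        }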



    • #12
      Originally posted by V!NCENT
      I'm guessing that OpenGL ES doesn't use expensive floating point? If so then this should definitely be the de facto FLOSS OS standard IMHO.
      I was curious about this myself, so I looked it up.

      The core OpenGL ES 2.0 spec does not require it.

      It does define an extension for it: OES_texture_float, which is similar to the one required in GL 3 (apparently it's less flexible, but I don't know the particulars). It's supposed to be covered by the same patents, though.

      Although optional, it seems the extension is expected to be present, so implementations that leave it out might cause issues with applications that assume it's there without checking.

      From Khronos regarding the OES extensions:
      OpenGL ES extensions that have been approved by the Khronos OpenGL ES Working Group are summarized in this section. These extensions are not required to be supported by a conformant OpenGL ES implementation, but are expected to be widely available; they define functionality that is likely to move into the required feature set in a future version of the Specification.
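
      So an application that wants float textures probably shouldn't just assume the extension is there. A rough sketch of the runtime check (needs a current GLES 2.0 context; the substring match is a simplification, a strict check would compare whole tokens):

          #include <string.h>
          #include <GLES2/gl2.h>

          /* Returns non-zero if GL_OES_texture_float is advertised. */
          static int has_float_textures(void)
          {
              const char *exts = (const char *) glGetString(GL_EXTENSIONS);
              return exts != NULL && strstr(exts, "GL_OES_texture_float") != NULL;
          }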



      • #13
        Originally posted by allquixotic
        Q: Does the Xorg server support EGL?
        A: I don't know.
        The EGL code (which is in the Mesa tree) can be built to run on X or directly over DRM. A few months ago we did some quick testing and could only get the EGL-over-X path running, but now both paths apparently work.
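
        If it helps, the difference between the two paths is basically which native display you hand to EGL. A sketch under current Mesa (assuming the DRM path goes through GBM, which may not match how it looked when we tested):

            #include <fcntl.h>
            #include <gbm.h>
            #include <EGL/egl.h>
            #include <X11/Xlib.h>

            /* EGL over X: wrap the X connection. */
            EGLDisplay x_egl_display(void)
            {
                Display *x_dpy = XOpenDisplay(NULL);
                return eglGetDisplay((EGLNativeDisplayType) x_dpy);
            }

            /* EGL directly over DRM: open the card node and wrap a GBM device. */
            EGLDisplay drm_egl_display(void)
            {
                int fd = open("/dev/dri/card0", O_RDWR);
                struct gbm_device *gbm = gbm_create_device(fd);
                return eglGetDisplay((EGLNativeDisplayType) gbm);
            }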

        The rest of the "I don't know"s still stand.



        • #14
          Thanks, smitty ^^,

          What about NURBS models? I thought those were part of the GL spec, but not implemented in consumer drivers. If we can't have some features, then there must be some compensation, right? Or can NURBS not be accelerated by the 3D engine on GPUs?

