GPU Software Fallbacks: What's Better?

  • #31
    Originally posted by zanny View Post
    GL is better in most cases in theory because it supports extensions. If you don't get a, say, OpenGL 4.2 context for some feature you want, you can still poll if the extensions you use are available on their own.

    In DirectX you either have complete version support or not. It makes the development more straightforward, but you can more finely optimize an OpenGL game across all hardware if you conditionally use optimizations that may or may not be available (like indirect draws).

    In practice, it takes a lot of extra work to do that, it means vendors are lazy about implementing whole versions because they can cherry-pick the extensions to implement, and the disparate implementations across vendors break all over the place. To DX's advantage, you can do broader engine branching - tessellation? DX11. Geometry? DX10. Otherwise 9. With GL it could be any version from 2.1 to 4.4 with any number of features missing.
    It's not just laziness. The driver situation with OpenGL is so dire that you cannot even rely on an extension unless you test that specific driver revision first. Just look at the size of the driver blacklist in Firefox: https://wiki.mozilla.org/Blocklistin...aphics_Drivers - testing at that granularity is pretty much impossible if you are a game developer.

    The situation with OpenGL ES is even worse, to the point where even trivial operations (clear screen, or lock/unlock) give different results on different GPUs.

    There is a reason why game developers are asking for D3D feature levels and a driver verification process in OpenGL. The ball is in Khronos' court (and they have been consistently dropping it for a decade now.)
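
    To make the approach in the quote concrete, polling for a single extension looks roughly like this; a minimal sketch, assuming a GL 3.0+ context is already current and that function pointers are loaded by something like glad (context creation omitted):

        #include <glad/glad.h>   // assumed GL loader; any GL 3.0+ loader works
        #include <cstring>

        // Walk the driver's extension list the GL 3.0+ way, one string at a time.
        bool has_extension(const char *name)
        {
            GLint count = 0;
            glGetIntegerv(GL_NUM_EXTENSIONS, &count);
            for (GLint i = 0; i < count; ++i) {
                const char *ext = reinterpret_cast<const char *>(
                    glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i)));
                if (ext && std::strcmp(ext, name) == 0)
                    return true;
            }
            return false;
        }

        // Example: indirect multi-draw is core in GL 4.3 but may also be
        // exposed as a standalone extension on a lower-version context.
        bool can_use_multi_draw_indirect()
        {
            GLint major = 0, minor = 0;
            glGetIntegerv(GL_MAJOR_VERSION, &major);
            glGetIntegerv(GL_MINOR_VERSION, &minor);
            return major > 4 || (major == 4 && minor >= 3)
                || has_extension("GL_ARB_multi_draw_indirect");
        }
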
    Last edited by BlackStar; 24 June 2014, 03:16 AM.



    • #32
      Originally posted by Ericg View Post
      Read the above. With OpenGL you can claim "Support" if you manage to back the feature up with software rendering. Feature levels mandate that if you say you "support it" then you HAVE TO be able to do it ALL in-hardware. "Can't do it in hardware? Too bad for you, stop lying and screw off."
      Feature levels do not mandate anything; if anything does, it's Microsoft's policies, and I kind of doubt that they actually do. At least in the D3D8 days, ATI used to lie about having HW T&L on the Radeon 7200 without any problem.

      If Microsoft cared about OpenGL, it would not certify OpenGL drivers with software fallbacks either. On the contrary, feature levels encourage lying, as there is no other way for the driver to communicate partial support for a feature level.

      For example, on OpenGL modern GPUs are not actually required to lie about 4.4 compatibility; they could advertise 4.2 plus 99% of the extensions from 4.3 and 4.4, as there is no desktop hardware yet that implements Ericsson Texture Compression (ETC), and hence GL_ARB_ES3_compatibility, in hardware.

      The advantage of D3D is that HW manufacturers will develop the hardware specifically to meet a D3D feature level.

      The real problem is that D3D feature levels do not match OpenGL versions, so vendors choose to target the more popular D3D (or, on mobile, GLES) and lie about desktop OpenGL compatibility.
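
      As a sketch of what that honest advertising would let an application do (reusing the has_extension() helper from the sketch in #31; note that a "yes" here still says nothing about whether ETC decoding happens in hardware):

          // Accept either a 4.3+ context (where GL_ARB_ES3_compatibility is
          // core) or the standalone extension on a lower version. Caveat:
          // this only says the formats are accepted, not decoded in hardware.
          bool can_use_etc2_textures()
          {
              GLint major = 0, minor = 0;
              glGetIntegerv(GL_MAJOR_VERSION, &major);
              glGetIntegerv(GL_MINOR_VERSION, &minor);
              return major > 4 || (major == 4 && minor >= 3)
                  || has_extension("GL_ARB_ES3_compatibility");
          }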



      • #33
        Originally posted by Ansla View Post
        Feature levels do not mandate anything; if anything does, it's Microsoft's policies, and I kind of doubt that they actually do. At least in the D3D8 days, ATI used to lie about having HW T&L on the Radeon 7200 without any problem.
        You do realize that Microsoft completely overhauled its video driver architecture and certification process with Vista, right? They got tired of graphics drivers screwing things up and causing BSoDs, and since drivers have to be signed, yes, they do have to meet Microsoft's requirements.

        Originally posted by Ansla View Post
        If Microsoft cared about OpenGL, it would not certify OpenGL drivers with software fallbacks either. On the contrary, feature levels encourage lying, as there is no other way for the driver to communicate partial support for a feature level.
        So feature levels, which are an exclusive DirectX feature, a statement of which DirectX level the hardware actually supports, and which care only about DirectX, are somehow encouraging vendors to lie about software fallbacks on OpenGL? Do you realize how utterly disingenuous that statement is?

        Originally posted by Ansla View Post
        For example, on OpenGL modern GPUs are not actually required to lie about 4.4 compatibility; they could advertise 4.2 plus 99% of the extensions from 4.3 and 4.4, as there is no desktop hardware yet that implements Ericsson Texture Compression (ETC), and hence GL_ARB_ES3_compatibility, in hardware.
        Feature levels enforce that the vendor isn't lying about hardware support and ensure a known good baseline: if a vendor does not support a version of a standard 100% in hardware, it should not be claiming support. Period.

        Originally posted by Ansla View Post
        The advantage of D3D is that HW manufacturers will develop the hardware specifically to meet a D3D feature level.
        Which is a Really Good Thing(TM)

        Originally posted by Ansla View Post
        The real problem is that D3D feature levels do not match OpenGL versions, so vendors choose to target the more popular D3D (or, on mobile, GLES) and lie about desktop OpenGL compatibility.
        That is completely disingenuous. Feature levels do not speak for OpenGL, only DirectX, so they cannot cause lies about OpenGL. If anything, DirectX feature levels indirectly help OpenGL, because they ensure that hardware-demanding features in the intersection of OpenGL and DirectX are done in hardware; for everything else they have a null effect on OpenGL.



        • #34
          Originally posted by Luke_Wolf View Post
          You do realize that Microsoft completely overhauled its video driver architecture and certification process with Vista, right? They got tired of graphics drivers screwing things up and causing BSoDs, and since drivers have to be signed, yes, they do have to meet Microsoft's requirements.
          Software fallbacks don't cause BSoDs, so this is unrelated. The certification process tests for stability, not performance, and AFAIK it is done on binary blobs received from the vendors, so I don't see how MS could even detect a SW fallback as opposed to a slow HW implementation.


          Originally posted by Luke_Wolf View Post
          So feature levels, which are an exclusive DirectX feature, a statement of which DirectX level the hardware actually supports, and which care only about DirectX, are somehow encouraging vendors to lie about software fallbacks on OpenGL? Do you realize how utterly disingenuous that statement is?
          My point was: if OpenGL had feature levels and those didn't match the D3D ones 100%, it would encourage vendors to lie whenever they only got 99% of the GL one implemented in HW.

          Originally posted by Luke_Wolf View Post
          Feature levels enforce that the vendor isn't lying about hardware support and ensure a known good baseline: if a vendor does not support a version of a standard 100% in hardware, it should not be claiming support. Period.
          Sure, but can you tell me how feature levels would differ from some really big extensions like GL_ARB_4_1_features, GL_ARB_4_2_features, etc.? Again, unless there is 100% overlap between the OpenGL ones and the D3D ones, how does this help anyone?

          Originally posted by Luke_Wolf View Post
          Which is a Really Good Thing(TM)
          Good for D3D and bad for OpenGL.

          Originally posted by Luke_Wolf View Post
          That is completely disingenuous. Feature levels do not speak for OpenGL, only DirectX, so they cannot cause lies about OpenGL. If anything, DirectX feature levels indirectly help OpenGL, because they ensure that hardware-demanding features in the intersection of OpenGL and DirectX are done in hardware; for everything else they have a null effect on OpenGL.
          In an ideal world, yes, D3D feature levels wouldn't affect OpenGL in any way. But in our imperfect world, where 90% of desktop card users use D3D and not OpenGL, vendors design cards with the D3D feature levels in mind, and if OpenGL requires anything extra, nobody will care until the OpenGL driver team has to write the driver for it.

          There are two ways out of this: either vendors start taking OpenGL seriously and implement in hardware all the features required by the current and upcoming OpenGL versions, or app developers start using OpenGL despite its current shortcomings and drive it to a market share that forces vendors to take it seriously from the chip design stage.



          • #35
            The problem is that GL versions have historically tended to lag the DX versions. HW vendors design for DX, for what they think will end up in the next OpenGL version, and for any vendor-specific features they want. The trouble comes when an OpenGL version gets ratified: some of the extensions pulled into that version are based on particular vendor extensions that don't always fit the hardware semantics of another vendor's implementation. It doesn't really matter too much, though; these days there aren't really any software fallbacks on modern PC GPUs.



            • #36
              Originally posted by Ansla View Post
              My point was: if OpenGL had feature levels and those didn't match the D3D ones 100%, it would encourage vendors to lie whenever they only got 99% of the GL one implemented in HW.
              If the standards body is doing its job and enforcing a strict standard, no, it wouldn't, because the standard would dictate that only if you support 100% of it (not 99%, not 99.9999999999999%, but 100%) are you allowed to declare a level of GL.

              Originally posted by Ansla View Post
              Sure, but can you tell me how feature levels would differ from some really big extensions like GL_ARB_4_1_features, GL_ARB_4_2_features, etc.? Again, unless there is 100% overlap between the OpenGL ones and the D3D ones, how does this help anyone?
              They're nothing like extensions; in fact, they're the very antithesis of extensions. They're standards declarations stating that hardware X supports all of the hardware features required by graphics API Y, version Z. They help enormously by giving the developer a known good set of APIs, as opposed to futzing around trying to figure out what the hardware supports; this, along with driver certification, would resolve a vast swath of the issues developers complain about with OpenGL.
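
              For comparison, this is roughly what that negotiation looks like in D3D11; a minimal sketch with error handling trimmed: you hand the runtime a list of levels and it returns the highest one the hardware supports completely:

                  #include <d3d11.h>
                  #pragma comment(lib, "d3d11.lib")   // MSVC link hint

                  // Ask for the best of several feature levels; the runtime
                  // writes the highest fully supported one into *obtained.
                  bool create_device(ID3D11Device **device,
                                     ID3D11DeviceContext **context,
                                     D3D_FEATURE_LEVEL *obtained)
                  {
                      const D3D_FEATURE_LEVEL wanted[] = {
                          D3D_FEATURE_LEVEL_11_0,  // tessellation, compute
                          D3D_FEATURE_LEVEL_10_0,  // geometry shaders
                          D3D_FEATURE_LEVEL_9_1,   // baseline
                      };
                      const UINT count =
                          static_cast<UINT>(sizeof(wanted) / sizeof(wanted[0]));
                      HRESULT hr = D3D11CreateDevice(
                          nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          wanted, count, D3D11_SDK_VERSION,
                          device, obtained, context);
                      // A returned level is a guarantee: everything in it is
                      // in hardware, so the engine branches once, up front.
                      return SUCCEEDED(hr);
                  }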

              Originally posted by Ansla View Post
              Good for D3D and bad for OpenGL.
              No. Good for D3D, indirectly good for OpenGL.

              Originally posted by Ansla View Post
              In an ideal world, yes, D3D feature levels wouldn't affect OpenGL in any way. But in our imperfect world, where 90% of desktop card users use D3D and not OpenGL, vendors design cards with the D3D feature levels in mind, and if OpenGL requires anything extra, nobody will care until the OpenGL driver team has to write the driver for it.
              Exactly: nobody cares about the OpenGL side, and as a result, despite your disingenuous claims, there is no impetus to lie about the version supported, as they're not trying to push OpenGL.

              Originally posted by Ansla View Post
              There are two ways out of this: either vendors start taking OpenGL seriously and implement in hardware all the features required by the current and upcoming OpenGL versions, or app developers start using OpenGL despite its current shortcomings and drive it to a market share that forces vendors to take it seriously from the chip design stage.
              Or maybe we stop being disingenuous and recognize that a good half of the problem is Khronos: unlike at Microsoft, the standards have an almost completely closed development cycle. Microsoft takes DirectX seriously and talks to both hardware vendors and game developers about how it should move DirectX forward; meanwhile, nobody outside of Khronos has any idea what OpenGL 5 is going to look like. Khronos further lacks a driver verification process, an official test suite, and strict hardware standardization, as well as the guts to actually go through with the much-needed Longs Peak-style revamp, which it already backed out of once with OpenGL 3. My hopes at this point lie in Mantle becoming cross-platform, cross-vendor, and a proper open standard, as opposed to OpenGL, since I've long since learned to put as much trust in Khronos as I do in the W3C (which is to say none at all).



              • #37
                As a developer I have to say that software fallbacks SUCK.

                Sure, they're great if all you want is a p!ssing contest between APIs and being able to claim support for FEATURE-X in MY API that YOUR API doesn't support. But what serious practical use is that?

                Meanwhile as a developer you write your code, you test your code on as much hardware as possible, you ship it.

                Then you find that someone somewhere is getting 1 FPS on a driver that does software fallbacks. And the OpenGL philosophy is that everything should work, and you have no way of detecting when a software fallback is happening, so you get to play the guessing game of what-on-earth-am-I-doing-that-caused-that?

                And that's a world of suck for the developer, it's a world of suck for the user, and it doesn't matter how smug and superior anyone feels about supporting that feature via a software fallback, because your program is not fit for purpose.

                Have we all forgotten the early GL 2.0 hardware that claimed to support non-power-of-two textures but only via a software fallback? And had no way of detecting this? And bumped you down to 1 FPS if you tried to use such a texture? Please tell me how on earth this can possibly be in anyone's interest because I'm dying to find out.

                Bottom line: if you can't support a given GL_VERSION in your driver via hardware, then don't support it - just advertise the best GL_VERSION that you do support, and don't pretend otherwise.
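
                Absent any query, about the best you could do back then was a timing heuristic; a rough sketch of the idea (the draw callback and the 10x threshold are illustrative choices, not anything the spec gives you):

                    #include <chrono>
                    // Assumes GL declarations are available, as in the
                    // earlier sketches (e.g. via glad).

                    // Time n frames through a caller-supplied draw function.
                    // glFinish() forces the GPU to complete the work before
                    // the clock is read, so a software fallback shows up.
                    template <typename DrawFn>
                    double seconds_for_frames(int n, DrawFn draw)
                    {
                        auto t0 = std::chrono::steady_clock::now();
                        for (int i = 0; i < n; ++i)
                            draw();
                        glFinish();
                        auto t1 = std::chrono::steady_clock::now();
                        return std::chrono::duration<double>(t1 - t0).count();
                    }

                    // Compare an NPOT texture against a power-of-two control;
                    // a large ratio strongly suggests a software fallback.
                    bool npot_probably_software(double pot_s, double npot_s)
                    {
                        return npot_s > pot_s * 10.0;  // heuristic threshold
                    }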



                • #38
                  Originally posted by Jimmy Shelter View Post
                  And the OpenGL philosophy is that everything should work
                  And that right there is a major problem: any standards body that doesn't subscribe to strict standards isn't a standards body at all but a suggestion body, and suggestion bodies only produce useless crap that almost nobody actually follows.
