GPU Software Fallbacks: What's Better?


  • #16
    Originally posted by Luke_Wolf
    Everything OpenGL 2.1 requires except for 3 functions... or in short a ~10 year old GPU. No, instead of being an insane asshole and trying to force the devs to deal with my hopelessly obsolete hardware, and force the hardware to do things it was never meant to do, I would do the sane thing and buy new hardware.
    Yeah, I'm with Luke on this one. If my hardware didn't do at least OpenGL 3... I'd be looking to upgrade. I get that it's not viable for everyone for one reason or another, but still. Hell, this Sandy Bridge laptop I've got can do 3.1 and I'm still looking at Haswell / Broadwell / Kaveri for my next laptop.

    • #17
      Software fallbacks are the way to go!

      Of course software fallbacks are the way to go! I can't believe Gallium3D doesn't have them! After all, you can always choose to turn the software fallback off if you want speed and visual mistakes, but if you don't have the software fallback then you're stuck with the visual mistakes or black screens. So swrast gives users a choice, which is a good thing.

      Plus, it gives Linux bragging rights over M$ and Apple - look, with Linux/swrast, not only do I extend the life of the equipment, I can teach it new tricks it was never designed for. How cool is that! It stops planned hardware deprecation and forced hardware upgrades dead in their tracks. Good for the wallet, and good for the environment!

      This is coming from someone who has played the FlightGear 3D flight simulator on a 1998 laptop with Neomagic graphics. Thanks to the beauty of Mesa and swrast!

      • #18
        Originally posted by stan
        Of course software fallbacks are the way to go! I can't believe Gallium3D doesn't have them! After all, you can always choose to turn the software fallback off if you want speed and visual mistakes, but if you don't have the software fallback then you're stuck with the visual mistakes or black screens. So swrast gives users a choice, which is a good thing.
        If you read the article again, you will see that for many things Gallium3D does have them. And one of the devs states it would be easy to add more for the cases where it currently does not.

        Also, the user's choice is between two broken options, between "plague and cholera". Being forced into such a choice is not a good thing. Better to fix the shit so it just works. A black screen is perhaps the strongest incentive for that.

        • #19
          Originally posted by bridgman
          Yep, ideally the stack would be able to pop up something like "this app is using capabilities not supported by the HW so you need to choose between (slow) SW fallbacks or rendering only what the HW can handle".

          I suspect the reason people still favor software fallbacks is that they are probably a bit less likely to encourage "tweak the app until the rendering looks OK" development which is inevitably followed by "oh crap, the incomplete rendering on HW vendor A is different from the incomplete rendering on HW vendor B". I guess the best would probably be universal agreement that drivers will never use SW fallbacks and apps will always have the ability to fall back to a lower GL level which all HW supports.
          Let me try to make this as clear as possible:

          Give us GPU feature levels.
          Give us GPU feature levels.
          Give us GPU feature levels.
          Give us GPU feature levels.
          Give us GPU feature levels.

          We have been clamoring for this since 2003. Khronos have one final chance to get this right in OpenGL 5.0 - but they will most likely !@#$ this up again, like they did with OpenGL 3.0 and 4.0.

          The writing is on the wall. Between Metal, Mantle and D3D12, OpenGL will have to adapt or die.

          • #20
            Why not add a text message in a corner of the display saying that some required features are only available as a software fallback? That way the user knows why the application is slow, and if he really needs the application he can live with it.

            • #21
              Vote for SW fallback

              I vote for it, with the requirement that a popup notifies you or even lets you downgrade OpenGL. If you look at the BeagleBoard site, there was a project proposal to enable OpenGL via a DSP. For the Raspberry Pi it makes sense.

              • #22
                You won't be able to bring up a dialog while the application is running, and slamming text on top of the rendered image could be a very bad idea if it obscures whatever it is supposed to render. AMD does do this with unsupported graphics cards, though.

                If software fallback were turned into an option, it would need to be handled by a control panel where you set things up before running the application.

                • #23
                  Originally posted by stan
                  Of course software fallbacks are the way to go! I can't believe Gallium3D doesn't have them! After all, you can always choose to turn the software fallback off if you want speed and visual mistakes, but if you don't have the software fallback then you're stuck with the visual mistakes or black screens. So swrast gives users a choice, which is a good thing.
                  Gallium was designed around the idea (from DX) that the GPU will be able to support, in hardware, all of the features required for the exposed API. If the hardware does not provide enough features for that API, you shouldn't expose it. Gallium mainly targets DX10-class hardware, which is fine for PC hardware, but can be tricky for certain embedded parts. That said, most recent GL ES hardware supports enough features that it's generally not a problem.
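
                  (An illustration, not quoted from the thread, of that design principle; the names HwCaps and pickExposedGLVersion are made up for the sketch. The idea is that the stack derives the GL version to expose from what the hardware reports, rather than papering over gaps with software paths.)

                  struct HwCaps {                 // hypothetical capability report from a driver
                      int  glslVersion;           // e.g. 120, 130, 330
                      bool hasIntegerTextures;    // required for GL 3.x
                      bool hasTransformFeedback;  // required for GL 3.x
                  };

                  // Expose a GL version only if every required feature is present in
                  // hardware; otherwise advertise the lower version instead of backing
                  // the gap with a software path.
                  int pickExposedGLVersion(const HwCaps& c) {
                      if (c.glslVersion >= 330 && c.hasIntegerTextures && c.hasTransformFeedback)
                          return 33;   // OpenGL 3.3
                      if (c.glslVersion >= 120)
                          return 21;   // OpenGL 2.1
                      return 0;        // no usable hardware GL
                  }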

                  • #24
                    Originally posted by BlackStar
                    Let me try to make this as clear as possible:

                    Give us GPU feature levels.
                    Give us GPU feature levels.
                    Give us GPU feature levels.
                    Give us GPU feature levels.
                    Give us GPU feature levels.

                    We have been clamoring for this since 2003. Khronos have one final chance to get this right in OpenGL 5.0 - but they will most likely !@#$ this up again, like they did with OpenGL 3.0 and 4.0.

                    The writing is on the wall. Between Metal, Mantle and D3D12, OpenGL will have to adapt or die.
                    From MSDN
                    "A feature level does not imply performance, only functionality. Performance is dependent on hardware implementation."


                    So, that sounds exactly like the same thing as implementing all of OpenGL 3 in software.

                    • #25
                      Originally posted by Zan Lynx
                      From MSDN
                      "A feature level does not imply performance, only functionality. Performance is dependent on hardware implementation."


                      So, that sounds exactly like the same thing as implementing all of OpenGL 3 in software.
                      The problem is, when using OpenGL, you are writing hardware-accelerated graphics code. If you start software-rendering a hardware-access API, you are violating that expectation.

                      I don't break out GLSL because I think it's fun. I break it out because the host CPU is insufficient for what I'm trying to do and I need the architecture of graphics hardware. There is a reason you are using OpenGL in the first place.

                      • #26
                        Originally posted by Zan Lynx
                        From MSDN
                        "A feature level does not imply performance, only functionality. Performance is dependent on hardware implementation."


                        So, that sounds exactly like the same thing as implementing all of OpenGL 3 in software.
                        No. You couldn't be any more wrong even if you tried.

                        A DX level is supported if and only if it can be accelerated in hardware. If a GPU exposes feature level 11_0, then I can use all 11_0 features without going through a software fallback. If the GPU supports most of 11_0, except for a single feature, then that GPU will not advertise feature level 11_0.

                        If an IHV tries to pull the shenanigans discussed here, i.e. advertise feature level 11_0 while only supporting, say, 10_1 in hardware, then Microsoft will simply reject their drivers during the certification process. It's all or nothing.

                        This is a good thing both for developers and for users.
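
                        A minimal sketch (not quoted from the post, assumes the Windows SDK) of what that looks like on the application side: you hand D3D11CreateDevice the feature levels you can work with, and the runtime returns the highest one the hardware backs completely - there is no partial, software-assisted 11_0.

                        #include <d3d11.h>
                        #pragma comment(lib, "d3d11.lib")

                        int main() {
                            const D3D_FEATURE_LEVEL wanted[] = {
                                D3D_FEATURE_LEVEL_11_0,
                                D3D_FEATURE_LEVEL_10_1,
                                D3D_FEATURE_LEVEL_10_0,
                                D3D_FEATURE_LEVEL_9_3,
                            };
                            D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
                            ID3D11Device* device = nullptr;
                            ID3D11DeviceContext* context = nullptr;

                            // The runtime walks 'wanted' in order and creates the device at the
                            // first level the hardware supports in full.
                            HRESULT hr = D3D11CreateDevice(
                                nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
                                &device, &got, &context);

                            // On success, every feature of 'got' is guaranteed to be hardware-backed;
                            // a GPU missing one piece of 11_0 simply reports a lower level instead.
                            if (context) context->Release();
                            if (device) device->Release();
                            return SUCCEEDED(hr) ? 0 : 1;
                        }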

                        • #27
                          How are feature levels any better than GL versions?

                          • #28
                            Originally posted by AJenbo
                            How are feature levels any better than GL versions?
                            Originally posted by BlackStar
                            No. You couldn't be any more wrong even if you tried.

                            A DX level is supported if and only if it can be accelerated in hardware. If a GPU exposes feature level 11_0, then I can use all 11_0 features without going through a software fallback. If the GPU supports most of 11_0, except for a single feature, then that GPU will not advertise feature level 11_0.

                            If an IHV tries to pull the shenanigans discussed here, i.e. advertise feature level 11_0 while only supporting, say, 10_1 in hardware, then Microsoft will simply reject their drivers during the certification process. It's all or nothing.

                            This is a good thing both for developers and for users.
                            Read the above. With OpenGL you can claim "Support" if you manage to back the feature up with software rendering. Feature levels mandate that if you say you "support it" then you HAVE TO be able to do it ALL in-hardware. "Can't do it in hardware? Too bad for you, stop lying and screw off."

                            • #29
                              Originally posted by Ericg
                              Read the above. With OpenGL you can claim "Support" if you manage to back the feature up with software rendering. Feature levels mandate that if you say you "support it" then you HAVE TO be able to do it ALL in-hardware. "Can't do it in hardware? Too bad for you, stop lying and screw off."
                              So not really any better; in fact GL might give a better result in some cases.

                              • #30
                                Originally posted by AJenbo
                                So not really any better; in fact GL might give a better result in some cases.
                                GL is better in most cases in theory because it supports extensions. If you don't get, say, an OpenGL 4.2 context for some feature you want, you can still poll whether the extensions you use are available on their own (see the sketch below).

                                In DirectX you either have complete version support or not. That makes development more straightforward, but you can more finely optimize an OpenGL game across all hardware if you conditionally use optimizations that may or may not be available (like indirect draws).

                                In practice, it takes a lot of extra work to do that, it means vendors are lazy about implementing whole versions because they can cherry-pick the extensions to implement, and the disparate implementations across vendors break all over the place. In DX's favor, you can do broader engine branching: tessellation? DX11. Geometry shaders? DX10. Otherwise 9. With GL it could be any version from 2.1 to 4.4 with any number of features missing.
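
                                A minimal sketch (not from the post; assumes a current GL 3+ context and a loader such as GLEW or glad already initialized) of the per-extension polling described above, using indirect draws as the example feature:

                                #include <GL/glew.h>
                                #include <cstring>

                                // Scan the driver's extension list for a given name.
                                bool hasExtension(const char* name) {
                                    GLint count = 0;
                                    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
                                    for (GLint i = 0; i < count; ++i) {
                                        const char* ext = reinterpret_cast<const char*>(
                                            glGetStringi(GL_EXTENSIONS, i));
                                        if (ext && std::strcmp(ext, name) == 0)
                                            return true;
                                    }
                                    return false;
                                }

                                // Indirect draws are core in GL 4.0; on older contexts, fall back to
                                // checking for the standalone extension.
                                bool canUseIndirectDraw() {
                                    GLint major = 0;
                                    glGetIntegerv(GL_MAJOR_VERSION, &major);
                                    return major >= 4 || hasExtension("GL_ARB_draw_indirect");
                                }

                                // Callers then branch: glDrawArraysIndirect() on the fast path,
                                // plain glDrawArrays() otherwise.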
