
Intel Hits "Almost There" GL 3.0 Support In Mesa


  • Intel Hits "Almost There" GL 3.0 Support In Mesa

    Phoronix: Intel Hits "Almost There" GL 3.0 Support In Mesa

    There's just a week and a half left to the year, but will Intel be successful in their goal of open-source OpenGL 3.0 support in Mesa for their Sandy/Ivy Bridge hardware in 2011? It looks like they will fall just short...

    http://www.phoronix.com/vr.php?view=MTAzMTc

  • #2
    As I understand it, all the real games programmers want OpenGL 4.x. So we'll still have to wait a while longer, but it is really nice to see this progress.



    • #3
      Big congrats and thanks to everyone involved!
      (Although I'll repeat myself after the commit that actually enables it...)

      The really good part seems to be that we are not just close to GL 3.0 but also to 3.3! So I hope the supported version number will be increased in subsequent releases. However, I have to admit that I have no idea whatsoever how difficult the newer GLSL versions will be; 1.30 seemed like a huge pain...



      • #4
        Originally posted by Wilfred View Post
        As I understand it all the real games programmers want opengl 4.x.
        Thanks - this is actually a very interesting question. I am unaware of any applications or games on Linux which require even GL 3.0, let alone GL 4.0. So it is a bit of a chicken-and-egg problem: I don't know whether such applications don't exist because nobody writing games for Linux needs GL 3.0+-specific features, or whether developers aren't writing such games for Linux because the common drivers lack those features.

        Or maybe GL 2.0+ is just enough for pretty much everyone these days.

        I personally think that GL 3.0/4.0-focused games and applications will start to appear on Linux as soon as Mesa itself supports GL 3.0. It will be a really interesting time then.



        • #5
          Originally posted by phoronix View Post
          Phoronix: Intel Hits "Almost There" GL 3.0 Support In Mesa...
          ...at 1 fps. Seriously though, IMHO Mesa's biggest problem isn't compliance with some GL standard; it's the lack of performance.



          • #6
            Originally posted by AnonymousCoward View Post
            ...at 1 fps
            Actually, I'd say that it runs about two orders of magnitude faster than that on some hardware which is going to be released in a couple of months, but I won't give away any additional details.



            • #7
              Originally posted by AnonymousCoward View Post
              ...at 1 fps Seriously though. IMHO Mesa's biggest problem isn't compliance to some GL standard, but lack of performance is.
              Well, I'm of the opinion that computer software design should be as follows: Make it right, then make it fast.

              As long as we have the functional plumbing to get GL 3.0+ to render correctly, we can spend time optimizing it to be faster. What point is there to making a broken implementation run faster if all you're doing is displaying garbage output?



              • #8
                Originally posted by Wilfred View Post
                As I understand it all the real games programmers want opengl 4.x. So we still have to wait some longer, but it is really nice to see this progress.
                Not really. We want OpenGL 3.3. 4.x gives us a small handful of features at the expense of only supporting a much smaller subset of the existing consumer GPUs in use. The two features that anyone even really cares about have been commonly available as 2.1+-compatible extensions in the proprietary GL drivers for a couple years now.

                Direct3D11 is seeing much more uptake and interest because it -- unlike anything OpenGL ever has or likely ever will do -- offers very real API improvements that make a huge difference. One of those features is compute shaders, which in Khronos land are a separate API (OpenCL). The other is kick-ass threading support, which is impossible to ever offer in an equivalent way in OpenGL without completely rewriting the API.

                The threading is the big issue with OpenGL, aside from it just being a generally horrible API. OpenGL requires and depends on magic global variables for state, while D3D uses explicit object handles. With OpenGL, the only way to handle threading at all is to put all of your graphics code into critical sections, use some crazy code to switch the active context objects in certain cases, and then pray the driver handles it correctly, which it isn't required to do. It's literally impossible to do efficient threaded GPU resource creation or drawing-command generation in GL.

                D3D11, on the other hand, allows you to create multiple context objects attached to the same device object, and those context objects are directly part of the API calls for resource management rather than being implicit global state across the entire process like in GL, so each thread can manage independent resources with no application-side locking overhead or complexity. It also allows for the creation of complete command queues (more efficient and flexible than what the deprecated display-lists API in GL does) in those threads, which can then be efficiently submitted by the main thread for actual rendering.

                Without that, threaded "rendering" is basically limited to occlusion/frustum culling of objects and nothing else. With it, all the complex state management and rendering setup for drawn objects can also be distributed to the threads and handled in the same pass as the culling, in addition to the generation or modification of any streaming buffer state those objects need (like the uniform buffer objects you'd want to fill with bone transformation data for animated characters, or particle state updates for physics-driven particle engines that can't run entirely on the GPU).

                GL has long just been a crappy API, but it's gotten to the point now where that API is actually making it impossible to maintain feature parity with D3D. It needs to be rewritten. GL5 really just needs to break backwards compatibility and be a whole new object-based API. That unfortunately is not likely to happen, and we are unlikely to see a non-Microsoft graphics API that can truly compete with D3D until Apple, Google, and/or some other companies with a stake in graphics APIs get fed up with Khronos' uselessness and push a competing open API standard. Or the FOSS folks could push something, which Gallium makes very easy to do: there's not much stopping someone from writing a new API as a state tracker for Gallium other than a lack of will. Making such an API work, and showing that it's efficient and easy to use, would be a good way to get the graphics hardware vendors and "alternative" OS vendors to pay attention and get on board, too, I would think.



                • #9
                  Thanks for this very interesting post.

                  Originally posted by elanthis View Post
                  GL has long just been a crappy API, but it's gotten to the point now where that API is actually making it impossible to maintain feature parity with D3D. It needs to be rewritten. GL5 really just needs to break back-compat and be a whole new object-based API. That unfortunately is not likely to happen, and we are unlikely to see a non-Microsoft graphics API that can truly compete with D3D until Apple, Google, and/or some other companies with a stake in graphics APIs gets fed up with Khronos' uselessness and pushes a competing open API standard. Or the FOSS folks push something, which Gallium is making very easy to do: there's not much stopping someone from writing a new API as a state tracker for Gallium other than a lack of will. Making such a working API and showing that it's efficient and easy to use would be a good way to get the graphics hardware vendors and "alternative" OS vendors to pay attention and get on board, too, I would think.
                  Why not just implement a DirectX 11 state tracker instead of inventing a new API? You will probably hit the patent problem anyway, no matter if you invent a new API or implement an existing one. The advantage of implementing D3D11 would be that it is well documented and developers are already familiar with it. Also it would be much easier to play games on Linux if Wine could use a native D3D implementation instead of having to translate everything to GL.



                  • #10
                    Originally posted by Temar View Post
                    Thanks for this very interesting post.


                    Why not just implement a DirectX 11 state tracker instead of inventing a new API? You will probably hit the patent problem anyway, no matter if you invent a new API or implement an existing one. The advantage of implementing D3D11 would be that it is well documented and developers are already familiar with it. Also it would be much easier to play games on Linux if Wine could use a native D3D implementation instead of having to translate everything to GL.
                    Isn't there already a D3D11 state tracker for Gallium and X? AFAIK nobody uses it.



                    • #11
                      Originally posted by AnonymousCoward View Post
                      ...at 1 fps
                      You're being unfair; we already reached the 3 fps goal, so it's 3x faster!



                      • #12
                        Originally posted by elanthis View Post
                        GL has long just been a crappy API, but it's gotten to the point now where that API is actually making it impossible to maintain feature parity with D3D. It needs to be rewritten. GL5 really just needs to break back-compat and be a whole new object-based API. That unfortunately is not likely to happen, and we are unlikely to see a non-Microsoft graphics API that can truly compete with D3D until Apple, Google, and/or some other companies with a stake in graphics APIs gets fed up with Khronos' uselessness and pushes a competing open API standard. Or the FOSS folks push something, which Gallium is making very easy to do: there's not much stopping someone from writing a new API as a state tracker for Gallium other than a lack of will. Making such a working API and showing that it's efficient and easy to use would be a good way to get the graphics hardware vendors and "alternative" OS vendors to pay attention and get on board, too, I would think.
                        Didn't you say a while ago that you had something in mind for a new API and were thinking of writing it?



                        • #13
                          Originally posted by Temar View Post
                          Thanks for this very interesting post.



                          Why not just implement a DirectX 11 state tracker instead of inventing a new API? You will probably hit the patent problem anyway, no matter if you invent a new API or implement an existing one. The advantage of implementing D3D11 would be that it is well documented and developers are already familiar with it. Also it would be much easier to play games on Linux if Wine could use a native D3D implementation instead of having to translate everything to GL.
                          I think Direct3D has some Windows-isms built into it which make it not a great choice on any other OS. I think the Mesa D3D11 state tracker copied a bit of Wine code to get it working.



                          • #14
                            I think it'll be much quicker to catch up to the more recent OpenGL specs, probably because OpenGL 3 introduced the most features, and some features from higher versions are already implemented. It's not like the Mesa developers limited themselves to implementing only the 3.0 feature set.



                            • #15
                              Originally posted by elanthis View Post
                              bla bla.. Direct3D11 is seeing much more uptake and interest because it -- unlike anything OpenGL ever has or likely ever will do -- offers very real API improvements that make a huge difference. One of those features is compute shaders, which in Khronos land are a separate API (OpenCL). The other is kick-ass threading support, which is impossible to ever offer in an equivalent way in OpenGL without completely rewriting the API.

                              The threading is the big issue with OpenGL, aside from just being a generally horrible API. .. bla bla
                              Some people need to get a life and become emotionally stable.
                              OpenGL isn't a horrible solution (Fortran and COBOL are); some people just have a horrible mood.
                              People bitching about GL are generally those who are in a bad mood and need something to bitch about, and the internet is the ideal place for it.
                              You didn't discover America by saying OpenGL needs upgrading; it has been upgraded, through 3.0, 3.3 and 4.2, and all emotionally stable people like it a lot. The deprecation mechanism is also reasonable, because it doesn't involve reckless decisions. Bitching about threads is stupid, because under the hood it's still serialization, and people with brains just add more code, as one does with all the other non-threaded solutions, of which there are plenty.
                              A DX11 paradise? Hardly. I still have BIG issues with DX11-based games (Crysis 2 in DX11 on a GTX 560 Ti often fails, while in DX9 it worked fine).
                              Emotionally unstable people like to claim that DX11 solves almost all problems and that programmers just need to hit random keys on the keyboard to create cool stuff; meanwhile I hardly notice the difference between DX9/10 and DX11 games, really, not to mention the bugs in drivers, and that's while many emotionally unstable people refer to DX11 as a bug-free solution.
                              Go take a hike.
                              GL 4.2 lets you accomplish every effect DX11 can; it's not just me, Carmack also said the difference isn't big enough for him to move to DX.
                              I'm (still) learning GL and I like it. It's clean, it's fun. Handling textures is a little weird and it's a bit verbose, but Java is weirder and has a lot more deprecated stuff (I've done Java for 10 years, trust me, it has far more obsolete stuff than GL), and Java is certainly more verbose, yet it's still a success in critical markets. So GL is good enough and has enough market share.
                              Your big problem with GL sits about half a meter in front of the monitor and needs to get out and face the sunlight a bit.
                              Bitching about threads in GL is like bitching that the C/C++ backends of GTK aren't thread-safe either. That isn't a problem: I use pthreads with Gtk. So what, did my brain blow up because I had to spend a few more brain cycles? Since when has using threads on your own become so hard that you have to bitch about it on the forums? I don't think you're that feeble-minded; I just think you need to take a hike.

