Intel Hits "Almost There" GL 3.0 Support In Mesa

  • #11
    Originally posted by AnonymousCoward View Post
    ...at 1 fps
    You're being unfair; we already reached the 3 fps goal, so it's 3x faster!
    ## VGA ##
    AMD: X1950XTX, HD3870, HD5870
    Intel: GMA45, HD3000 (Core i5 2500K)


    • #12
      Originally posted by elanthis View Post
      GL has long just been a crappy API, but it's gotten to the point now where that API is actually making it impossible to maintain feature parity with D3D. It needs to be rewritten. GL5 really just needs to break back-compat and be a whole new object-based API.

      That unfortunately is not likely to happen, and we are unlikely to see a non-Microsoft graphics API that can truly compete with D3D until Apple, Google, and/or some other companies with a stake in graphics APIs get fed up with Khronos' uselessness and push a competing open API standard. Or the FOSS folks push something, which Gallium is making very easy to do: there's not much stopping someone from writing a new API as a state tracker for Gallium other than a lack of will.

      Making such a working API and showing that it's efficient and easy to use would be a good way to get the graphics hardware vendors and "alternative" OS vendors to pay attention and get on board, too, I would think.
      Didn't you say you had something in mind regarding a new API, and that you were thinking of doing it some time ago?


      • #13
        Originally posted by Temar View Post
        Thanks for this very interesting post.



        Why not just implement a DirectX 11 state tracker instead of inventing a new API? You will probably hit the patent problem anyway, no matter if you invent a new API or implement an existing one. The advantage of implementing D3D11 would be that it is well documented and developers are already familiar with it. Also it would be much easier to play games on Linux if Wine could use a native D3D implementation instead of having to translate everything to GL.
        I think Direct3D has some Windows-isms built into it which make it not a great choice on any other OS. I think the Mesa D3D11 state tracker copied a bit of Wine code to get it working.


        • #14
          I think it will be much quicker to catch up to the more recent OpenGL specs, probably because OpenGL 3 introduced the most features, and some features from the higher versions are already complete. It's not like the Mesa developers limited themselves to implementing only the 3.0 feature set.


          • #15
            Originally posted by elanthis View Post
            bla bla.. Direct3D11 is seeing much more uptake and interest because it -- unlike anything OpenGL ever has or likely ever will do -- offers very real API improvements that make a huge difference. One of those features is compute shaders, which in Khronos land are a separate API (OpenCL). The other is kick-ass threading support, which is impossible to ever offer in an equivalent way in OpenGL without completely rewriting the API.

            The threading is the big issue with OpenGL, aside from just being a generally horrible API. .. bla bla
            Some people need to get a life and get emotionally stable.
            OpenGL isn't a horrible solution (Fortran and COBOL are); there are, however, people with a horrible mood.
            People bitching about GL are generally those who are in a bad mood and need something to bitch about, and the internet is the ideal place for it.
            You didn't discover America by saying OpenGL needs to be upgraded; it has been upgraded, with 3.0, 3.3, and 4.2, and all emotionally stable people like it a lot. The deprecation mechanism is also reasonable because it doesn't involve reckless decisions. Bitching about threads is stupid, because under the hood it's still serialization, and people with brains just add more code, like one does with all the other non-threaded solutions, of which there are plenty.
            DX11 paradise? Hardly. I still have BIG issues with DX11-based games (Crysis 2 in DX11 mode on a GTX 560 Ti often fails, while in DX9 mode it worked fine).
            Emotionally unstable people like to bitch about how DX11 solves almost all problems and how programmers just need to hit random keys on the keyboard to create cool stuff; meanwhile I hardly notice the difference between DX9/10 and DX11 games, really, not to mention bugs in drivers - and that's while many emotionally unstable people like referring to DX11 as a bug-free solution.
            Go take a hike.
            GL 4.2 lets you accomplish every effect DX11 can. It's not just me; Carmack also said the difference isn't big enough for him to move to DX.
            I'm (still) learning GL and I like it. It's clean, it's fun. Handling textures is a little weird and it's a bit verbose, but Java is weirder and has a lot more deprecated stuff (I've done Java for 10 years; trust me, it has a lot more obsolete stuff than GL), and Java is certainly more verbose, yet it's still a success in critical markets. So GL is good enough and has enough market share.
            Your big problem with GL sits about half a meter in front of the monitor and needs to get out and face the sunlight a bit.
            Bitching about threads in GL is like bitching that the C/C++ backends of GTK aren't thread-safe either. That isn't a problem; I use pthreads with GTK, and so what, did my brain blow up because I had to spend a few more brain cycles? Since when has using threads on your own become so hard that you have to bitch about it in the forums? I don't think you're that feeble-minded; I just think you need to take a hike.
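
            To be concrete, this is all I mean by "adding more code" - a minimal C++ sketch, assuming one shared GL context; makeCurrent()/releaseCurrent() are hypothetical wrappers around the platform's glXMakeCurrent()/wglMakeCurrent():

            #include <mutex>

            std::mutex gl_lock;  // serializes all access to the one shared GL context

            // Hypothetical wrappers around glXMakeCurrent()/wglMakeCurrent().
            void makeCurrent()    { /* bind the shared context to this thread */ }
            void releaseCurrent() { /* unbind it so another thread can take it */ }

            // Every thread that wants to issue GL calls goes through here.
            template <typename Work>
            void withGL(Work work)
            {
                std::lock_guard<std::mutex> guard(gl_lock);  // one thread at a time
                makeCurrent();
                work();  // e.g. withGL([]{ /* glTexImage2D(...), etc. */ });
                releaseCurrent();
            }

            Under the hood it's still serialization, exactly like I said - but it's a dozen lines, not a reason to burn the API down.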


            • #16
              Originally posted by cl333r View Post
              Some people need to get a life and get emotionally stable.
              OpenGL isn't a horrible solution (Fortran and COBOL are); there are, however, people with a horrible mood.
              ...
              What the hell? I want to smoke some of that too!

              You managed to mix emotions with a graphics API and with programming languages that have nothing to do with the problem at hand, all in a single sentence. Congrats.

              Your post shows tons of ignorance regarding the development of real-world applications and games. Please, instead of hiking so much, invest more time in reading about threading, OpenGL, and Direct3D. Please also read up on why "serious" titles use an already-made engine instead of talking to OGL/D3D directly. Just because you can doesn't mean it's the best choice. Just because OGL can render triangles or tessellate meshes doesn't mean it does it the best way.

              As for your D3D11 games looking similar to D3D9 games... well, I can play Pong with the latest gfx cards too...


              • #17
                Originally posted by mdias View Post
                What the hell? I want to smoke some of that too!

                You managed to mix emotions with a graphics API and with programming languages that have nothing to do with the problem at hand, all in a single sentence. Congrats.

                Your post shows tons of ignorance regarding the development of real-world applications and games. Please, instead of hiking so much, invest more time in reading about threading, OpenGL, and Direct3D. Please also read up on why "serious" titles use an already-made engine instead of talking to OGL/D3D directly. Just because you can doesn't mean it's the best choice. Just because OGL can render triangles or tessellate meshes doesn't mean it does it the best way.

                As for your D3D11 games looking similar to D3D9 games... well, I can play Pong with the latest gfx cards too...
                My example about Fortran is obvious and you probably know what I mean; if you still don't get it, ask me.

                "What the hell" is proof of emotionally unstable behavior. As for "serious" people, I already said what Carmack thinks.
                As for the tons of ignorance: show me effects that can't be done with GL 4.2 but can be done with DX11, or will you ignore this central point and continue bitching about threads?
                As for threading, I'll repeat: since when is it so difficult that you have to bitch about it?
                And your Pong example as a response to my Crysis 2 example (with the latest updates to the NVIDIA driver and to the game itself, which still sucks and even freezes despite being a "serious" title) is plain goofy, which proves that your emotions (and maybe ignorance) are speaking for you. Again: show me effects that DX11 can do that GL 4.2 can't.
                Last edited by cl333r; 22 December 2011, 06:24 AM.


                • #18
                  @cl333r, please stop bullshitting and insulting people. I like elanthis' informative posts, and neither he nor I argue that you can't do stuff with OGL that you can with D3D; it is just an outdated API. Yes, you can write your own thread management in C. Yes, you can write your own thread management in C++. But the C++ standard still keeps evolving, and C++11 adds native threading.
                  You sound as ignorant as Q, already.
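
                  And since you mention it, C++11's native threading really is this simple - a trivial sketch, nothing OGL-specific about it:

                  #include <iostream>
                  #include <thread>

                  int main()
                  {
                      // The standard library itself now provides threads;
                      // no hand-rolled pthreads wrapper needed.
                      std::thread worker([] { std::cout << "hello from a worker thread\n"; });
                      worker.join();  // wait for the worker before exiting
                      return 0;
                  }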

                  PS: And Carmack said D3D was worse back in the Quake days but is better than OGL now, and that he would use it if it weren't for OpenGL's legacy in knowledge, development tools, etc.


                  • #19
                    Originally posted by Drago View Post
                    @cl333r, please stop bullshitting and insulting people. I like elanthis' informative posts, and neither he nor I argue that you can't do stuff with OGL that you can with D3D; it is just an outdated API. Yes, you can write your own thread management in C. Yes, you can write your own thread management in C++. But the C++ standard still keeps evolving, and C++11 adds native threading.
                    You sound as ignorant as Q, already.

                    PS: And Carmack said D3D was worse back in the Quake days but is better than OGL now, and that he would use it if it weren't for OpenGL's legacy in knowledge, development tools, etc.
                    PPS: In other words Carmack said that DX isn't good enough to justify the transition.

                    I'm not ignorant; I'm just pointing out that something that isn't critical, and doesn't make your program faster or allow richer effects, is indirectly advertised as if it did.
                    Going to DX11 is stupid because you have to stay in sync with Microsoft; if you don't, then you have to get a lot of people to agree on a single new standard (a fork of DX). Again, a crappy road to follow.

                    The best choice, obviously, is deprecating old stuff in GL and introducing new features, which is exactly what GL is doing: it already deprecated the fixed-function pipeline and introduced a lot of features, and rest assured more will follow.

                    Also, going to DX11 and dropping GL would take a lot of effort and over 10 years to accomplish (transitioning existing software/games/etc.), and it could still fail in the end for various reasons, like not getting enough market/mind share or support from devs/corps/driver devs. In the meantime there might be a shift in hardware development that could require rewriting the API yet again. Trust me: given all the unknown variables, improving GL in a reasonable fashion is the best solution.

                    And I'm not bullying anyone, since I'm not calling anyone crazy names and I'm not using censored words either.
                    Last edited by cl333r; 22 December 2011, 07:06 AM.


                    • #20
                      elanthis knows what he is talking about. His post sums up pretty much everything that is wrong with OpenGL. My experience is very similar.

                      In short:

                      1. OpenGL is a horrible API. Inconsistencies all over the place: glGenTextures vs glCreateShader vs glNewList; glBindTexture vs glUseProgram.
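
                      Side by side, in case anyone thinks I'm exaggerating - a minimal C++ sketch (assumes a current context and a GL 2.0+ loader such as GLEW; shader compile/link steps omitted):

                      #include <GL/glew.h>  // any GL 2.0+ function loader will do

                      void threeConventionsForThreeObjects()
                      {
                          // Textures: plural glGen* plus bind-to-target.
                          GLuint tex;
                          glGenTextures(1, &tex);
                          glBindTexture(GL_TEXTURE_2D, tex);

                          // Shaders/programs: singular glCreate* plus glUseProgram, no target.
                          GLuint prog = glCreateProgram();
                          GLuint vs   = glCreateShader(GL_VERTEX_SHADER);
                          glAttachShader(prog, vs);
                          glUseProgram(prog);

                          // Display lists: a glGen* that returns the name, then New/End.
                          GLuint list = glGenLists(1);
                          glNewList(list, GL_COMPILE);
                          glEndList();
                      }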

                      2. Its specifications are monstrous, inconsistent, and downright buggy. There are features where no two vendors agree on the implementation because the specs are self-contradicting; the thing is nigh impossible to implement! Even trivial 7-year-old stuff like uniform arrays is implemented differently on ATI, Intel, and Nvidia. Try this if you don't believe me: create a uniform array with length=1 and try to fill it on all three vendors. Go ahead!

                      For extra fun, try the same with a length=1 varying array.
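
                      Concretely, even querying the location of that length-1 uniform array isn't portable: some drivers only answer to the "name[0]" spelling, others to "name". A hedged C++ sketch (the names here are made up, the behavior isn't; assumes prog is linked and currently in use):

                      #include <GL/glew.h>

                      // GLSL side:  uniform float weights[1];
                      void fillLengthOneArray(GLuint prog)
                      {
                          GLint loc = glGetUniformLocation(prog, "weights");
                          if (loc == -1)  // some drivers only accept the "[0]" spelling
                              loc = glGetUniformLocation(prog, "weights[0]");

                          GLfloat value = 1.0f;
                          glUniform1fv(loc, 1, &value);  // fill the one-element array
                      }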

                      3. OpenGL is designed by an inept committee with diverging visions for the future. GL3.0 was to be released alongside DX10, and the initial API drops showed a tremendous improvement over GL2.1. Then Khronos disappeared for a year (complete radio silence) and finally showed up the following summer with a different GL3.0 that was identical to GL2.1 plus extensions. Literally, they just folded GL2.1-level extensions into core and called that GL3.0. Oh yeah, and they added a deprecation model that's respected only by Apple (with a 3-year delay).

                      During this year of silence, the opengl.org forums reflected the community's reactions: at first, people were asking for information. Later, they started becoming angry. Finally, they left, and most never returned. Opengl.org is now but a ghost of its former self.

                      4. OpenGL carries a 20-year-old legacy that has reached the point where it is impossible to add new features that follow GPU development. Compute shaders required a whole new API (OpenCL) that's completely independent from OpenGL.

                      5. D3D11 offers features that cannot be implemented in OpenGL 4.x, period. Proper threading, for instance. Refer to elanthis' post for more information.

                      Even something as simple as asynchronous resource creation becomes impossible in OpenGL. Some drivers use global locks (create a resource in thread #2, stall rendering in thread #1). Other drivers don't offer threading at all (create a resource in thread #2, crash). If you are lucky this might work; if you aren't, you might use the resource before it is fully created (undefined results, have fun).
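
                      For reference, here is roughly the dance you need even when it does work: a second, share-listed context on the loader thread plus an ARB_sync fence (GL 3.2) so the render thread doesn't sample the texture too early. A C++ sketch only; context creation and safely handing tex/ready between threads are elided:

                      #include <GL/glew.h>  // assumes GL 3.2 / ARB_sync is available

                      GLuint tex   = 0;
                      GLsync ready = nullptr;  // cross-thread handoff of these is elided

                      // Thread #2, on its own context sharing names with the render context:
                      void loaderThread(int w, int h, const void* pixels)
                      {
                          glGenTextures(1, &tex);
                          glBindTexture(GL_TEXTURE_2D, tex);
                          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                                       GL_RGBA, GL_UNSIGNED_BYTE, pixels);
                          ready = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
                          glFlush();  // make sure the fence is actually submitted
                      }

                      // Thread #1 (render thread), before the first draw that samples tex:
                      void beforeFirstUse()
                      {
                          glWaitSync(ready, 0, GL_TIMEOUT_IGNORED);  // GPU-side wait, no CPU stall
                      }

                      And that's the good case; whether it works at all depends on the driver, which is exactly the point.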

                      D3D11 is better than GL4.x in absolutely all regards, which is why there are no games written with GL3.x or 4.x in mind. They either use GL2.1 (e.g. Rage3d) or have moved to OpenGL ES 2.0, which is a cleaned-up version of GL2.1.
                      Last edited by BlackStar; 22 December 2011, 07:19 AM.
