Intel Hits "Almost There" GL 3.0 Support In Mesa


  • #16
    Originally posted by cl333r View Post
    Some people need to get a life and get emotionally stable.
    OpenGL isn't a horrible solution (fortran and cobol are), however there are people with a horrible mood.
    ...
    What the hell? I want to smoke some of that too!

    You managed to mix emotions with a graphics API and programming languages that have nothing to do with the problem at hand, all in a single sentence. Congrats.

    Your post shows tons of ignorance regarding development of real-world applications and games. Please, instead of hiking so much, invest more time in reading about threading, OpenGL and Direct3D. Please also read up on why "serious" titles use an already-made engine instead of talking to OGL/D3D directly. Just because you can, doesn't mean it's the best choice. Just because OGL can render triangles or tessellate meshes, doesn't mean it does it the best way.

    As for your D3D11 games looking similar to D3D9 games... well, I can play Pong with the latest gfx cards too...



    • #17
      Originally posted by mdias View Post
      What the hell? I want to smoke some of that too!

      You managed to mix emotions with a graphics API and programming languages that have nothing to do with the problem at hand, all in a single sentence. Congrats.

      Your post shows tons of ignorance regarding development of real-world applications and games. Please, instead of hiking so much, invest more time in reading about threading, OpenGL and Direct3D. Please also read up on why "serious" titles use an already-made engine instead of talking to OGL/D3D directly. Just because you can, doesn't mean it's the best choice. Just because OGL can render triangles or tessellate meshes, doesn't mean it does it the best way.

      As for your D3D11 games looking similar to D3D9 games... well, I can play Pong with the latest gfx cards too...
      My example about fortran is obvious and you probably know what I mean; if you still don't get it, ask me.

      "What the hell" is proof of emotionally unstable behavior - as to "serious" people - I already said what Carmack thinks.
      As to tons of ignorance - show me effects that can't be done with GL 4.2 but can be done with DX 11 - or will you ignore this central point and continue bitching about threads?
      As to threading - I'll repeat - since when is it so difficult that you have to bitch about it?
      And your Pong example as a response to my Crysis 2 example (with latest updates to the nvidia driver and to the game itself which still sucks and even freezes despite being a "serious" title) is plain goofy, which proves that your emotions (and maybe ignorance) are talking for you. Again show me effects that DX11 does that GL 4.2 can't do.
      Last edited by cl333r; 12-22-2011, 05:24 AM.



      • #18
        @cl333r, please stop bullshitting and insulting people. I like elanthis' informative posts, and neither he nor I is arguing that you can't do with OGL what you can do with D3D. It is just an outdated API. Yes, you can write your own thread management in C. Yes, you can write your own thread management in C++, but they still keep evolving the C++ standard, with C++11 including native threading.
        You sound as ignorant as Q, already.

        PS: And Carmack said D3D was worse back in the Quake days, but it is now better than OGL, and he would use it if it weren't for the legacy in knowledge, development tools, etc.
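        For reference, a minimal sketch of the native threading mentioned above (illustration only; it assumes a C++11-conforming compiler and standard library, nothing more specific than that):

        Code:
        // Plain standard-library threading, no pthreads/Win32 boilerplate.
        #include <iostream>
        #include <thread>

        int main()
        {
            std::thread worker([] {
                std::cout << "hello from a std::thread\n";
            });
            worker.join();   // wait for the worker before exiting
            return 0;
        }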



        • #19
          Originally posted by Drago View Post
          @cl333r, please stop bullshitting and insulting people. I like elanthis' informative posts, and neither he nor I is arguing that you can't do with OGL what you can do with D3D. It is just an outdated API. Yes, you can write your own thread management in C. Yes, you can write your own thread management in C++, but they still keep evolving the C++ standard, with C++11 including native threading.
          You sound as ignorant as Q, already.

          PS: And Carmack said D3D was worse back in the Quake days, but it is now better than OGL, and he would use it if it weren't for the legacy in knowledge, development tools, etc.
          PPS: In other words Carmack said that DX isn't good enough to justify the transition.

          I'm not ignorant; I'm just pointing out that something that isn't critical and doesn't make your program faster or allow richer effects is being indirectly advertised as if it did.
          Going to DX11 is stupid because you have to stay in sync with Microsoft; if you don't, then you have to get a lot of people to agree on a single new standard (a fork of DX) - again, a crappy road to follow.

          The best choice is obviously deprecating old stuff in GL and introducing new features - which is what GL is doing - it already deprecated the fixed-pipeline functions and introduced a lot of features, and rest assured more will follow.

          Also, going to DX11 and dropping GL would take a lot of effort and over 10 years to accomplish (transitioning existing software/games/etc), and it could fail in the end for different reasons, like not getting enough market/mind share or support from devs/corps/driver devs - in the meantime there might be a shift in hw development which could require rewriting the API again. Trust me, in this case, given all the unknown variables, improving GL in a reasonable fashion is the best solution.

          And I'm not bullying or anything, since I'm not calling anyone crazy names and I'm not using censored words either.
          Last edited by cl333r; 12-22-2011, 06:06 AM.



          • #20
            elanthis knows what he is talking about. His post sums up pretty much everything that is wrong with OpenGL. My experience is very similar.

            In short:

            1. OpenGL is a horrible API. Inconsistencies all over the place: glGenTextures vs glCreateShader vs glNewList; glBindTexture vs glUseProgram (see the sketch at the end of this post).

            2. Its specifications are monstrous, inconsistent and downright buggy. There are features where no two vendors agree on the implementation, because the specs are self-contradictory. It is downright impossible to implement! Even trivial 7-year-old stuff like uniform arrays is implemented differently on Ati, Intel and Nvidia. Try this if you don't believe me: create a uniform array with length=1 and try to fill it on all three vendors. Go ahead!

            For extra fun, try the same with a length=1 varying array.

            3. OpenGL is designed by an inept committee with diverging visions for the future. GL3.0 was to be released along with DX10, and the initial API drops showed tremendous improvements over GL2.1. Then Khronos disappeared for a year and finally showed up the following summer (complete radio silence till then) with a different GL3.0 version that was identical to GL2.1+extensions. Literally, they just folded GL2.1-level extensions into core and called that GL3.0. Oh yeah, and they added a deprecation model that's respected only by Apple (with a 3-year delay).

            During this year of silence, the opengl.org forums reflected the community's reactions: at first, people were asking for information. Later they started becoming angry. Finally, they left - and most never returned. Opengl.org is now but a ghost of its former self.

            4. OpenGL carries a 20-year-old legacy that's reached the point where it is impossible to add new features that follow GPU development. Compute shaders required a whole new API (OpenCL) that's completely independent of OpenGL.

            5. D3D11 offers features that cannot be implemented in OpenGL 4.x, period. Proper threading, for instance. Refer to elanthis' post for more information.

            Even something as simple as asynchronous resource creation becomes impossible in OpenGL. Some drivers use global locks (create resource in thread #2, stall rendering in thread #1). Other drivers don't offer threading at all (create resource in thread #2, crash). If you are lucky this might work - if you aren't, you might use the resource before it is fully created (undefined results, have fun).

            D3D11 is better than GL4.x in absolutely all regards, which is why there are no games written with GL3.x or 4.x in mind. They either use GL2.1 (e.g. Rage3d) or have moved to OpenGL ES 2.0, which is a cleaned-up version of GL2.1.
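            To make point 1 concrete, here is a minimal sketch (illustration only, not taken from any real codebase; it assumes a GL 2.0+ context is current and an extension loader such as GLEW provides the prototypes):

            Code:
            #include <GL/gl.h>

            void object_model_zoo(void)
            {
                // Textures: plural "Gen" names, then bind-to-edit.
                GLuint tex;
                glGenTextures(1, &tex);             // reserves a name
                glBindTexture(GL_TEXTURE_2D, tex);  // the object only really comes into existence once bound

                // Shaders: singular "Create", never bound at all.
                GLuint vs = glCreateShader(GL_VERTEX_SHADER);

                // Programs: "Create" again, but made current with "Use" instead of "Bind".
                GLuint prog = glCreateProgram();
                glAttachShader(prog, vs);
                glLinkProgram(prog);
                glUseProgram(prog);

                // Display lists: a third naming/creation scheme entirely.
                GLuint list = glGenLists(1);
                glNewList(list, GL_COMPILE);
                glEndList();

                // Point 2's uniform-array pain: drivers have historically disagreed on
                // whether "myArray" or "myArray[0]" is the name you must query here
                // (the uniform name is hypothetical, purely for illustration).
                // GLint loc = glGetUniformLocation(prog, "myArray[0]");
            }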
            Last edited by BlackStar; 12-22-2011, 06:19 AM.



            • #21
              "What the hell" comes from me not expecting such a misleading post after the very accurate elanthis' post.
              But ok, I'll try to be a little bit more serious, if that's what makes you happy and proves you my point.

              Please re-read what Carmack said. He did not mean D3D isn't good enough to justify transition, he meant that it would be too much hassle since he already has an established workflow around OGL. Please note that he also said he's loving the work he's doing on the x360.

              Application performance, while not theoretically tied directly to OpenGL or Direct3D, still varies because of the simple fact that Direct3D reflects the actual inner workings of the hardware more accurately, and driver developers pay much more attention to Direct3D for obvious reasons.

              Blaming D3D11 for Crysis 2 crashing/freezing is like blaming an operating system for closing an application that tries to dereference a null pointer.

              I don't know how you got from my Pong analogy to me being emotional, but I admire your creativity.

              Let me dumb it down for you: A game can use D3D11 and look like shit, that doesn't mean D3D11 is to blame.

              OpenGL has been playing catch-up with D3D for some years now. Khronos knows it, developers know it, 99% of the people in the technology world know it. Could it be that we're all wrong?

              Nobody ever said OpenGL couldn't show the same thing as D3D11. But there's more to a graphics API than meets the eye.



              • #22
                Originally posted by Drago View Post
                You sound as ignorant as Q, already.
                I'm the reverence of ignorance?



                • #23
                  Originally posted by mdias View Post
                  "What the hell" comes from me not expecting such a misleading post after the very accurate elanthis' post.
                  But ok, I'll try to be a little bit more serious, if that's what makes you happy and proves you my point.

                  Please re-read what Carmack said. He did not mean D3D isn't good enough to justify transition, he meant that it would be too much hassle since he already has an established workflow around OGL. Please note that he also said he's loving the work he's doing on the x360.

                  Application performance, while not theoretically tied directly to OpenGL or Direct3D, still varies because of the simple fact that Direct3D reflects the actual inner workings of the hardware more accurately, and driver developers pay much more attention to Direct3D for obvious reasons.

                  Blaming D3D11 for Crysis 2 crashing/freezing is like blaming an operating system for closing an application that tries to dereference a null pointer.

                  I don't know how you got from my Pong analogy to me being emotional, but I admire your creativity.

                  Let me dumb it down for you: A game can use D3D11 and look like shit, that doesn't mean D3D11 is to blame.

                  OpenGL has been playing catch-up with D3D for some years now. Khronos knows it, developers know it, 99% of the people in the technology world know it. Could it be that we're all wrong?

                  Nobody ever said OpenGL couldn't show the same thing as D3D11. But there's more to a graphics API than meets the eye.
                  bla bla.. I'm giving you an example of an AAA DX11 game with visual glitches which hangs sometimes - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
                  Also, you didn't give any example of something that can't be accomplished with GL but can be done with DX, great - you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
                  And you still want me to take you seriously after that?
                  Last edited by cl333r; 12-22-2011, 07:28 AM.



                  • #24
                    Originally posted by cl333r View Post
                    bla bla.. I'm giving you an example of an AAA DX11 game with visual glitches which hangs sometimes - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
                    Also, you didn't give any example of something that can't be accomplished with GL but can be done with DX, great - you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
                    And you still want me to take you seriously after that?
                    Oh please, did you even read my post?

                    You can't easily manage more than 1 GL context, you can't easily multithread, you don't have type safety/objects for resources, you have redundant API functions, you have a BLOATED API, your documentation sucks compared to D3D.

                    Seriously, what kind of experience do you have with D3D?
                    Why do you seem so emotionally attached to OGL?
                    Can you list the advantages of OpenGL other than being cross-platform? Because it's supposedly all about openness, and still there are so many things wrong that people complain about, and it just stays the same.
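                    To put the multithreading point in concrete terms, here is a rough, illustrative sketch (not production code; error handling omitted) of D3D11's deferred contexts: a worker thread records commands while the main thread keeps submitting, something core OpenGL simply has no equivalent for.

                    Code:
                    #include <d3d11.h>

                    // Worker thread: record commands without touching the immediate context.
                    void RecordOnWorker(ID3D11Device* device, ID3D11CommandList** outList)
                    {
                        ID3D11DeviceContext* deferred = nullptr;
                        device->CreateDeferredContext(0, &deferred);   // legal from any thread

                        // State changes / draw calls go here exactly as on the immediate
                        // context, e.g. deferred->Draw(...), deferred->ClearState(), ...

                        deferred->FinishCommandList(FALSE, outList);   // bake the recorded work
                        deferred->Release();
                    }

                    // Main thread: playback is a single cheap call on the immediate context.
                    void SubmitOnMain(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
                    {
                        immediate->ExecuteCommandList(list, FALSE);
                        list->Release();
                    }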



                    • #25
                      Originally posted by cl333r View Post
                      bla bla..
                      I thought you were complaining that people were using emotional responses...
                      Originally posted by cl333r View Post
                      I'm giving you an example of an AAA DX11 game with visual glitches which hangs sometimes - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
                      Nothing in your example proved that DX11 was to blame, and nowhere did the post you replied to claim that it can't be a bug in DX11; it just suggested that the problem is much more likely to be elsewhere. Not all DX11 games have the bugs you describe.
                      Originally posted by cl333r View Post
                      Also, you didn't give any example of something that can't be accomplished with GL but can be done with DX, great - you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
                      I'm pretty sure the "you can't do proper threading with OpenGL" example has been mentioned more than once.
                      Originally posted by cl333r View Post
                      And you still want me to take you seriously after that?
                      I don't see how anyone could take you seriously after your last batch of comments even if they tried.



                      • #26
                        Originally posted by eugeni_dodonov View Post
                        I am actually unaware of any applications or games which require even gl 3.0, let alone gl 4.0, on Linux. So it is a bit of a chicken-vs-egg problem - I don't know if such applications do not exist because nobody writing games for Linux needs gl 3.0+-specific extensions, or if such developers are not writing games for Linux because the mainstream drivers lack such extensions.

                        Or maybe GL 2.0+ is just enough for pretty much everyone these days.
                        Every major company in the gaming industry has been writing games for DirectX 10/11 for years already. That corresponds to OpenGL 3.X and 4.X respectively.

                        id Software wrote a game in OpenGL called "RAGE", which requires OpenGL 3.3 as a minimum. id Software really didn't have any hopes of releasing the game for Linux any time soon because of the lack of graphics driver support (and graphics driver performance problems). It was released for Mac OS X, Windows, and the PS3 game console. It's a GREAT game.

                        It's not a chicken-or-egg problem... It's a question of whether or not Linux and its support libs and drivers have what it takes to be a serious gaming OS. If not, then these games (Unreal Tournament 3, RAGE, etc.) just don't get released for Linux even though they were written from the ground up in OpenGL. It's as simple as that. Game companies don't want to release a game on a platform if they think it's going to run like poo, because it can drag the reputation of the game down and hurt sales on other platforms. Or worse, nobody will buy it after they've gone through the effort of porting it to Linux.



                        • #27
                          Originally posted by mdias View Post
                          Oh please, did you even read my post?

                          You can't easily manage more than 1 GL context, you can't easily multithread, you don't have type safety/objects for resources, you have redundant API functions, you have a BLOATED API, your documentation sucks compared to D3D.

                          Seriously, what kind of experience do you have with D3D?
                          Why do you seem so emotionally attached to OGL?
                          Can you list the advantages of OpenGL other than being cross-platform? Because it's supposedly all about openness, and still there are so many things wrong that people complain about, and it just stays the same.
                          "redundant API functions" and "bloated API" is pretty much the same, just as the term "legacy stuff", most of which has been deprecated, and some stuff is still there, which will likely be gradually fixed. No revolutions, only positive evolution like we have witnessed with GL's transition from 2.1 to 4.2.

                          I'm not emotionally attached to GL. I'm saying cloning/using DX11 is likely to fail for the reasons I listed somewhere above, and creating something new is premature; see the explanation below.

                          GL needs to be fixed, but it's nowhere near pressing enough to say screw it, we're gonna create a new standard or use DX11; that's silly.

                          However, I'm in favor of rewriting GL completely by the time the next-gen (not "next-gen" as in marketing, but as in technology, like some real breakthrough) hw shows up, which might happen within 5 to 15 years. So I'm not saying let's keep upgrading it forever; I'm just saying GL is good enough for the time being, and abandoning it over some non-critical issues is reckless.

                          To me, a critical point is when, say, we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which might happen, I think, in 5 to 15 years - that would be a good enough reason to rewrite it, and DX11 might need a rewrite too; that way we don't force the industry through an extra rewrite of the API.
                          Last edited by cl333r; 12-22-2011, 07:57 AM.



                          • #28
                            Originally posted by gigaplex View Post
                            I thought you were complaining that people were using emotional responses...

                            Nothing in your example proved that DX11 was to blame, and nowhere did the post you replied to claim that it can't be a bug in DX11; it just suggested that the problem is much more likely to be elsewhere. Not all DX11 games have the bugs you describe.

                            I'm pretty sure the "you can't do proper threading with OpenGL" example has been mentioned more than once.

                            I don't see how anyone could take you seriously after your last batch of comments even if they tried.
                            Sure, not all DX11 games have bugs, but some do, and DX11 drivers do have bugs, don't you know that? Or did you think DX11 implementations are bug-free?
                            "Proper threading" and "easy threading" are the same issue under different names - you can use threads in GL if you spend extra brain cycles, so it's an issue but not a critical one. I hope you get it now.



                            • #29
                              Originally posted by cl333r View Post
                              Also, you didn't give any example of something that can't be accomplished with GL but can be done with DX, great - you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
                              And you still want me to take you seriously after that?
                              You're totally missing the point...
                              If doing something requires 100 hours in OpenGL and 10 hours in DirectX, game companies aren't going to use OpenGL. It's that simple. Whether or not OpenGL can do things that DirectX can't is not the core problem with OpenGL. The core problem with gaming on Linux is the lack of driver support, and that has nothing to do with OpenGL itself.



                              Originally posted by cl333r View Post
                              To me, a critical point is when, say, we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which might happen, I think, in 5 to 15 years - that would be a good enough reason to rewrite it, and DX11 might need a rewrite too; that way we don't force the industry through an extra rewrite of the API.
                              You mean where you supply a single very high-resolution 3D model and the graphics hardware automatically breaks it into pixels/surfaces based on distance, without requiring the game creator to manually swap out high/med/low versions of the mesh at fixed distances? Yeah, DirectX 11 and OpenGL 4.X already have that, and the hardware that does it (Fermi) has been out for 18 months. See the Unigine Heaven benchmark.
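                              Roughly, the GL 4.X side of that is the tessellation pipeline. A minimal, illustrative setup (assuming a GL 4.0+ context, a loader exposing the entry points, and a program with tessellation control/evaluation shaders already attached and linked) looks like this:

                              Code:
                              #include <GL/gl.h>

                              void draw_tessellated(GLuint program, GLuint vao, GLsizei vertexCount)
                              {
                                  glUseProgram(program);
                                  glBindVertexArray(vao);

                                  // Every 3 vertices form one patch; the control shader picks the
                                  // tessellation levels (e.g. from camera distance), so the LOD
                                  // swapping described above happens on the GPU automatically.
                                  glPatchParameteri(GL_PATCH_VERTICES, 3);
                                  glDrawArrays(GL_PATCHES, 0, vertexCount);
                              }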



                              • #30
                                Originally posted by cl333r View Post
                                Sure, not all DX11 games have bugs, but some do, and DX11 drivers do have bugs, don't you know that? Or did you think DX11 implementations are bug-free?
                                If you read my post again, you'd notice that I didn't say it can't be a bug in the drivers (which belong to the hardware vendors, not to DX11 itself). I'm not sure how we got to blaming application and driver bugs on the API itself - if we're just comparing existing bugs and using them as proof of a flawed API, then the Mesa/Gallium3D code is really giving OpenGL a bad reputation.
                                Originally posted by cl333r View Post
                                "Proper threading" "easy threading" is a reiteration of the same issue under different names - that you can use threads in GL if you use extra brain cycles, so it's an issue but it's not a critical one. I hope you got it now.
                                Yes, they are the same issue under different names. I didn't claim it was multiple separate issues. It was one example, yet you claimed zero examples were given. And no, "using extra brain cycles" doesn't give OpenGL the same threading capacity. You'd be able to squeeze out some threading gains after spending significantly more development time, but OpenGL implementations contain a large number of internal locks and global state which cannot be worked around just by thinking harder. You can get the same pixel-by-pixel results from the two of them (what users care about), but from a developer's perspective productivity matters (especially in commercial environments).
                                Originally posted by Sidicas View Post
                                You mean where you supply a single very high-resolution 3D model and the graphics hardware automatically breaks it into pixels/surfaces based on distance, without requiring the game creator to manually swap out high/med/low versions of the mesh at fixed distances? Yeah, DirectX 11 and OpenGL 4.X already have that, and the hardware that does it (Fermi) has been out for 18 months. See the Unigine Heaven benchmark.
                                No, I think cl333r meant a completely different approach along the lines of voxels. What you described was tessellation, which is an enhancement that still uses triangles under the hood.

