Intel Hits "Almost There" GL 3.0 Support In Mesa

  • #21
    "What the hell" comes from me not expecting such a misleading post after the very accurate elanthis' post.
    But ok, I'll try to be a little more serious, if that's what makes you happy and proves my point to you.

    Please re-read what Carmack said. He did not mean D3D isn't good enough to justify a transition, he meant it would be too much hassle since he already has an established workflow around OGL. Please note that he also said he's loving the work he's doing on the x360.

    Application performance, while not theoretically tied directly to OpenGL or Direct3D, still varies for the simple reason that Direct3D reflects the actual hardware's inner workings more accurately, and driver developers pay much more attention to Direct3D for obvious reasons.

    Blaming D3D11 for Crysis 2 crashing/freezing is like blaming an operating system for closing an application that tries to dereference a null pointer.

    I don't know how you got from my Pong analogy to me being emotional, but I admire your creativity.

    Let me dumb it down for you: A game can use D3D11 and look like shit, that doesn't mean D3D11 is to blame.

    OpenGL has been playing catch-up with D3D for some years now. Khronos knows it, developers know it, 99% of the people in the technology world know it. Could it be that we're all wrong?

    Nobody ever said OpenGL couldn't show the same thing as D3D11. But there's more to a graphics API than meets the eye.



    • #22
      Originally posted by mdias View Post
      "What the hell" comes from me not expecting such a misleading post after the very accurate elanthis' post.
      But ok, I'll try to be a little more serious, if that's what makes you happy and proves my point to you.

      Please re-read what Carmack said. He did not mean D3D isn't good enough to justify a transition, he meant it would be too much hassle since he already has an established workflow around OGL. Please note that he also said he's loving the work he's doing on the x360.

      Application performance, while not theoretically tied directly to OpenGL or Direct3D, still varies for the simple reason that Direct3D reflects the actual hardware's inner workings more accurately, and driver developers pay much more attention to Direct3D for obvious reasons.

      Blaming D3D11 for Crysis 2 crashing/freezing is like blaming an operating system for closing an application that tries to dereference a null pointer.

      I don't know how you got from my Pong analogy to me being emotional, but I admire your creativity.

      Let me dumb it down for you: A game can use D3D11 and look like shit, that doesn't mean D3D11 is to blame.

      OpenGL has been playing catch-up with D3D for some years now. Khronos knows it, developers know it, 99% of the people in the technology world know it. Could it be that we're all wrong?

      Nobody ever said OpenGL couldn't show the same thing as D3D11. But there's more to a graphics API than meets the eye.
      bla bla.. I'm giving you an example of an AAA game with DX11 that has visual glitches and sometimes hangs - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
      Also, you didn't give a single example of something that can be done with DX but not with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
      And you still want me to take you seriously after that?
      Last edited by cl333r; 22 December 2011, 08:28 AM.



      • #23
        Originally posted by cl333r View Post
        bla bla.. I'm giving you an example of an AAA game with DX11 that has visual glitches and sometimes hangs - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
        Also, you didn't give a single example of something that can be done with DX but not with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
        And you still want me to take you seriously after that?
        Oh please, did you even read my post?

        You can't easily manage more than 1 GL context, you can't easily multithread, you don't have type safety/objects for resources, you have redundant API functions, you have a BLOATED API, your documentation sucks compared to D3D.

        Seriously, what kind of experience do you have with D3D?
        Why do you seem so emotionally attached to OGL?
        Can you list the advantages of OpenGL other than being cross-platform? Because it seems to be all about openness, and still there are so many things wrong that people complain about, and it stays the same.



        • #24
          Originally posted by cl333r View Post
          bla bla..
          I thought you were complaining that people were using emotional responses...
          Originally posted by cl333r View Post
          I'm giving you an example of an AAA game with DX11 that has visual glitches and sometimes hangs - and you still try to make the point that it's not DX11 to blame, that it must be something else, right.
          Nothing in your example proved that DX11 was to blame, and nowhere does the post you replied to claim that it can't be a bug in DX11 - it just suggests the bug is much more likely to be elsewhere. Not all DX11 games have the bugs you describe.
          Originally posted by cl333r View Post
          Also, you didn't give a single example of something that can be done with DX but not with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
          I'm pretty sure the "you can't do proper threading with OpenGL" example has been mentioned more than once.
          Originally posted by cl333r View Post
          And you still want me to take you seriously after that?
          I don't see how anyone could take you seriously after your last batch of comments even if they tried.



          • #25
            Originally posted by eugeni_dodonov View Post
            I am actually unaware of any applications or games which require even gl 3.0, let alone gl 4.0, on Linux. So it is a bit of a chicken-vs-egg problem - I don't know if such applications do not exist because nobody writing games for Linux needs gl 3.0+-specific extensions, or such developers are not writing games for Linux because it lacks such extensions in general drivers.

            Or maybe GL 2.0+ is just enough for pretty much everyone these days.
            Every major company in the gaming industry has been writing games for DirectX 10/11 for years already. That corresponds to OpenGL 3.x and 4.x respectively.

            id Software wrote a game in OpenGL called "RAGE", which requires OpenGL 3.3 at minimum. They really didn't have any hopes of releasing it for Linux any time soon because of the lack of graphics driver support (and graphics driver performance problems). It was released for Mac OS X, Windows, and the PS3 game console. It's a GREAT game.

            It's not a chicken-or-egg problem... It's a question of whether or not Linux and its support libs and drivers have what it takes to be a serious gaming OS. If not, these games just don't get released for Linux (Unreal Tournament 3, RAGE, etc.) even though they were written from the ground up in OpenGL. It's as simple as that. Game companies don't want to release a game on a platform if they think it's going to run like poo, because that can drag down the game's reputation and hurt sales on other platforms. Or worse, nobody will buy it after they go through the effort of porting it to Linux.



            • #26
              Originally posted by mdias View Post
              Oh please, did you even read my post?

              You can't easily manage more than 1 GL context, you can't easily multithread, you don't have type safety/objects for resources, you have redundant API functions, you have a BLOATED API, your documentation sucks compared to D3D.

              Seriously, what kind of experience do you have with D3D?
              Why do you seem so emotionally attached to OGL?
              Can you list the advantages of OpenGL other than being cross-platform? Because it seems to be all about openness, and still there are so many things wrong that people complain about, and it stays the same.
              "Redundant API functions" and "bloated API" are pretty much the same thing, just like the term "legacy stuff" - most of which has been deprecated, and the rest will likely be fixed gradually. No revolutions, only positive evolution, like we witnessed with GL's transition from 2.1 to 4.2.

              I'm not emotionally attached to GL. I'm saying cloning/using DX11 is likely to fail for the reasons I listed somewhere above, and creating something new is too early - see the explanation below.

              GL needs to be fixed, but it's nowhere near as pressing as saying "screw it, we're gonna create a new standard or use DX11" - that's silly.

              However, I'm in favor of rewriting GL completely by the time next-gen hw shows up (not "next-gen" as in marketing, but as in technology, some real breakthrough), which might happen within 5 to 15 years. So I'm not saying let's keep upgrading it forever, I'm just saying GL is good enough for the time being, and abandoning it over some non-critical issues is reckless.

              To me, a critical point is when, say, we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which might happen, as I think, in 5 to 15 years - that would be a good enough reason to rewrite it, and DX11 might need a rewrite too. This way we don't force the industry through an extra rewrite of the API.
              Last edited by cl333r; 22 December 2011, 08:57 AM.



              • #27
                Originally posted by gigaplex View Post
                I thought you were complaining that people were using emotional responses...

                Nowhere in your example proved that DX11 was to blame, and nowhere in the post you replied to claims that it can't be a bug in DX11, just suggesting that it's much more likely to be elsewhere. Not all DX11 games have the bugs you describe.

                I'm pretty sure the "you can't do proper threading with OpenGL" example has been mentioned more than once.

                I don't see how anyone could take you seriously after your last batch of comments even if they tried.
                Sure, not all DX11 games have bugs, but some do, and DX11 drivers do have bugs, don't you know that? Or did you think DX11 implementations are bug-free?
                "Proper threading" and "easy threading" are the same issue under different names - you can use threads in GL if you spend extra brain cycles, so it's an issue, but not a critical one. I hope you get it now.



                • #28
                  Originally posted by cl333r View Post
                  Also, you didn't give a single example of something that can be done with DX but not with GL - great, you're actually conceding that there's nothing DX can do that GL can't. The catch-up period with DX11 in terms of hw features is over too.
                  And you still want me to take you seriously after that?
                  You're totally missing the point...
                  If doing something takes 100 hours in OpenGL and 10 hours in DirectX, game companies aren't going to use OpenGL. It's that simple. Whether or not OpenGL can do things that DirectX can't is not the core problem with OpenGL. Lack of driver support is the core problem with gaming on Linux, but that has nothing to do with OpenGL itself.



                  Originally posted by cl333r View Post
                  To me, a critical point is when, say, we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which might happen, as I think, in 5 to 15 years - that would be a good enough reason to rewrite it, and DX11 might need a rewrite too. This way we don't force the industry through an extra rewrite of the API.
                  You mean where you supply a single very high resolution 3D model and the graphics hardware automatically breaks it into pixels/surfaces based on distance, without the game creator having to manually swap high/med/low versions of the mesh at fixed distances? Yeah, DirectX 11 and OpenGL 4.x already have that, and the hardware that does it (Fermi) has been out for 18 months. See the Unigine Heaven benchmark.



                  • #29
                    Originally posted by cl333r View Post
                    Sure, not all DX11 games have bugs, but some do, and DX11 drivers do have bugs, don't you know that? Or did you think DX11 implementations are bug free?
                    If you read my post again, you'd notice that I didn't say it can't be a bug in the drivers (those belong to the hardware vendors, not DX11 itself). I'm not sure how we got to blaming the API itself for application and driver bugs - if we're just counting existing bugs and using them as proof of a flawed API, then the Mesa/Gallium3D code is really giving OpenGL a bad reputation.
                    Originally posted by cl333r View Post
                    "Proper threading" and "easy threading" are the same issue under different names - you can use threads in GL if you spend extra brain cycles, so it's an issue, but not a critical one. I hope you get it now.
                    Yes, they are the same issue under different names. I didn't claim it was multiple separate issues. It was one example, of which you claimed zero were given. And no, "using extra brain cycles" doesn't give OpenGL the same threading capacity. You'd be able to squeeze out some threading gains after exerting significantly more development time, but OpenGL implementations contain a large number of internal locks and global variables which cannot be worked around just by thinking harder. You can get the same pixel-by-pixel results from either API (what users care about), but from a developer perspective productivity matters (especially in commercial environments).
                    Originally posted by Sidicas View Post
                    You mean where you supply a single very high resolution 3D model and the graphics hardware automatically breaks it into pixels/surfaces based on distance, without the game creator having to manually swap high/med/low versions of the mesh at fixed distances? Yeah, DirectX 11 and OpenGL 4.x already have that, and the hardware that does it (Fermi) has been out for 18 months. See the Unigine Heaven benchmark.
                    No, I think cl333r meant a completely different approach along the lines of voxels. What you described was tessellation, which is an enhancement that still uses triangles under the hood.



                    • #30
                      Originally posted by cl333r View Post
                      "Redundant API functions" and "bloated API" are pretty much the same thing, just like the term "legacy stuff" - most of which has been deprecated, and the rest will likely be fixed gradually. No revolutions, only positive evolution, like we witnessed with GL's transition from 2.1 to 4.2.

                      I'm not emotionally attached to GL. I'm saying cloning/using DX11 is likely to fail for the reasons I listed somewhere above, and creating something new is too early - see the explanation below.

                      GL needs to be fixed, but it's nowhere near as pressing as saying "screw it, we're gonna create a new standard or use DX11" - that's silly.

                      However, I'm in favor of rewriting GL completely by the time next-gen hw shows up (not "next-gen" as in marketing, but as in technology, some real breakthrough), which might happen within 5 to 15 years. So I'm not saying let's keep upgrading it forever, I'm just saying GL is good enough for the time being, and abandoning it over some non-critical issues is reckless.

                      To me, a critical point is when, say, we draw stuff not with triangles with textures slapped on them, but with real points/atoms/whatever, which might happen, as I think, in 5 to 15 years - that would be a good enough reason to rewrite it, and DX11 might need a rewrite too. This way we don't force the industry through an extra rewrite of the API.
                      Happy to see a more serious post.

                      I don't think people want to clone D3D11 (please note that D3D is not the same thing as DX). What people are saying is that OpenGL no longer fits, and D3D11 is a nice example of what a good API should look like.

                      As you say, OGL needs to be fixed. And actually, it's so urgent that it's almost too late already. Why? Because of GP-GPUs. OpenGL and most of Direct3D are tied to the current paradigm of polygon rasterization used in most (all?) games/apps out there. But rasterization has its limits, and currently we waste tons of "brain cycles" figuring out tricks to work around these limitations and still improve graphics. Things like simple reflections on materials are pure "magic" tricks that you don't seem to be aware of.

                      We will not be building geometry mimicking the real behaviour of atoms and so on for quite some time (decades, probably...). The next step is realtime raytracing. GP-GPUs are bringing us closer and closer to rendering very photorealistic frames faster and faster. You don't need very complicated tricks to render perfect reflections, depth of field and other realistic effects. OpenGL covers nothing here. OpenCL does. It's only a matter of time before OpenGL as it is dies. Meanwhile it has wasted people's brain cycles that could be better invested in making actual functionality (gameplay, for example) better.

                      Actually, the term GPU will probably die within your 5-15 year timespan as it merges with the common CPU (look at AMD and Intel with their new "GPUs" integrated into the CPUs).

                      From then on, people will no longer build software around a (less and less) limited pipeline like current GPU offerings have. People will create their own custom software pipelines that will give "unlimited" flexibility.

                      Coming back to today:
                      Today we are wasting brain cycles because OpenGL is evolving too slowly.

