We're Now Up To OpenGL 4.1; Brings New Features


  • We're Now Up To OpenGL 4.1; Brings New Features

    Phoronix: We're Now Up To OpenGL 4.1; Brings New Features

    The Khronos Group came out in mid-March to release the OpenGL 4.0 specification along with OpenGL 3.3 (to bring as many OGL4 features back to OGL3 as possible for older hardware that doesn't support OpenGL 4.0), but today from SIGGRAPH in Los Angeles they have rolled out OpenGL 4.1. The Khronos Group has now put out six ratified versions of OpenGL in less than two years and the 4.1 release adds more graphical goodies to this industry standard. OpenGL 4.1 is also joined by version 4.10 of GLSL, the GL Shading Language...


  • #2
    I normally don't mind when a standards body works efficiently and doesn't get tied up for years in politics, but this is just ridiculous. It's clear that the major players are trying to up the ante on getting people to buy newer hardware and encouraging software developers to write to these new APIs.

    With the tepid attention given to FOSS graphics drivers across the board, this is bad news for open source graphics. What will we see next week? GL 5.0?

    I wish the ARB would slow down for a couple of years (maybe 5 - 8 years, because we really have to give all 3 of those open source guys a chance) and let FOSS catch up. But alas, new incompatible ASICs will be developed, and the GL API will continue to get hairier and hairier. Things were fine for a few years when 2.x was the standard, but since 3.0 was released, new APIs have been churned out almost as fast as the 50-man proprietary driver teams at ATI and Nvidia can support them.

    I love innovation as much as the next guy, but this whole thing smells fishy. I doubt the ARB is out to get open source, but they are definitely out to get more sales in current-generation GPUs, and they are definitely hoping that app developers will require these new APIs in order to drive sales. The only thing we can do as application developers is to hold off on requiring GL 3.0 or later in our applications, thereby telling the ARB where to shove it, and supporting people who run FOSS drivers. That goes for the proprietary games as well as open source stuff, not to mention the visualization crowd.

    Maybe someday, or in a parallel universe where FOSS gets more attention, the rate of development on Mesa would be able to keep pace with the ARB. But I don't think that's remotely close to being here or now.



    • #3
      Yes! We should stop production of new GPU for 8 years! So cheapskates can keep using their GPU for another 8 years without having to starve to get the cash to buy new ones.



      • #4
        I don't think I understand how you made the logical jump that recent releases from the Khronos Group are meant to influence the adoption of new hardware.

        Right now OpenGL versions are classed based on the capabilities of hardware. If next week a new class of consumer-grade GPUs were released that contained some new sort of functionality (e.g. GPUs whose hardware is optimized for ray tracing instead of rasterization could be an option one day), then a release of OpenGL 5 would be appropriate.

        At the same time, I'm sure the Khronos Group would work to maintain as much common functionality as possible between OpenGL 3, 4, and 5. The fact that the Khronos Group maintains new functionality in older classes of hardware (within the reasonable limits of the older hardware's capabilities) means to me that the Khronos Group has no interest in pushing the adoption of new generations of hardware.

        As for slowing down the standardization process, if Khronos were to slow it, the only result would be the proliferation and adoption of vendor-specific extensions, much like during the lifespan of OpenGL 2.x. While vendor-specific extensions are one of the strengths of OpenGL, application vendors should not need to implement those extensions to provide reasonably competitive functionality.



        • #5
          Internet Explorer : Web standards :: Mesa : OpenGL



          • #6
            Who cares if "we're up to 4.1" if no one uses the stuff anyway.



            • #7
              Originally posted by RealNC View Post
              Who cares if "we're up to 4.1" if no one uses the stuff anyway.
              You're right. Who cares if HTML5 and <video> were designed when nobody used them anyway? Hell, why the heck did Linus write Linux when nobody was already using it? I'm not sure why they invented spreadsheets when clearly nobody at the time was using a spreadsheet. For that matter, what the hell was Henry Ford thinking when he invented automobiles? There weren't any drivers yet! And I can't believe that Edison wasted everyone's time by discovering electricity many years before we had electric outlets and appliances to use it with.

              Logic seems to be coming in shorter supply on Phoronix forum posts these days.



              • #8
                Originally posted by elanthis View Post
                You're right. Who cares if HTML5 and <video> were designed when nobody used them anyway? Hell, why the heck did Linus write Linux when nobody was already using it? I'm not sure why they invented spreadsheets when clearly nobody at the time was using a spreadsheet. For that matter, what the hell was Henry Ford thinking when he invented automobiles? There weren't any drivers yet! And I can't believe that Edison wasted everyone's time by discovering electricity many years before we had electric outlets and appliances to use it with.

                Logic seems to be coming in shorter supply on Phoronix forum posts these days.
                I understand your point in this rant, but you may want to double check a few of your scenarios.

                Henry Ford didn't invent the automobile. He implemented an assembly line manufacturing process which helped make them more affordable.

                Thomas Edison didn't discover electricity, but he did invent the light bulb.

                In the case of OpenGL 4.x, the releases are there to bring OpenGL's capabilities up to the same level as DirectX on Windows. People are already using the features that OpenGL 4.x introduces, but they haven't been able to do so in OpenGL until now (unless they used non-standard extensions).



                • #9
                  I'm happy to see all of these OpenGL enhancements. Maybe the OpenGL vs. DirectX war will heat up again. I'd love to see OpenGL and similar graphics slowly marginalize DirectX.



                  • #10
                    Originally posted by Milyardo View Post
                    I don't think I understand how you made the logical jump that recent releases from the Khronos Group are meant to influence the adoption of new hardware.
                    (1) Major ASIC designers (ATI, Nvidia, Intel) want to increase sales. (Axiom)
                    (2) When customers purchase new hardware, it increases sales. (Axiom)
                    (3) Major ASIC designers have a significant role in driving the development and ratification of new OpenGL standards. (Source)
                    (4) When a new major version number of OpenGL is released, it is indicative of a hardware compatibility gap: the new major version excludes one or more previous generations of hardware by all the major ASIC designers. That is: if all the extensions are to be implemented in hardware, there exists some ASIC A released prior to ASIC B where OpenGL version N works with ASIC B in fully hardware-accelerated mode, but ASIC A can only provide full hardware-accelerated support for OpenGL version N-1.x (Source)
                    (5) If an application developer uses OpenGL version N in their application, then any hardware which only supports version N-1, or N-2, etc. of OpenGL can not run the application (or can only do so in software mode, which is not suitable for real-time graphics). From (4)
                    (6) Successful, useful, or entertaining applications are demanded by users. (Axiom)
                    (7) If a particular successful, useful, or entertaining application P requires OpenGL version N, then any hardware which only supports version N-1, or N-2, etc. of OpenGL can not run the application (or can only do so in software mode, which is not suitable for real-time graphics). From (5) and (6)
                    (8) P is demanded by users. From (6) and (7)
                    (9) Hardware which supports OpenGL version N is required for P to run as intended. From (5) - (8)
                    (10) Hardware which supports OpenGL version N is demanded by users who wish to run P. From (5) - (9)
                    (11) For each customer who has demand for P, there is a nonzero probability q that said customer will purchase hardware to run P. From probability theory and (10)
                    (12) q increases for each instantiation of (10) with a different P, if the demand applies to said customer. (This is hard to prove except empirically, but the qualification "if the demand applies to said customer" makes it nearly irrefutable via intuition: if someone wants to run numerous apps requiring OpenGL 4.0, they will prioritize their purchase of supported hardware more than they would if they wanted to run fewer or zero apps requiring OpenGL 4.0.)
                    (13) The result in (12) generates sales for major ASIC designers. Conclusion from (1) - (3) and (12)

                    Ratifying new OpenGL versions is the first software step in feeding this cycle, although it is usually also driven by hardware features.

                    Originally posted by Milyardo View Post
                    Right now OpenGL versions are classed based on the capabilities of hardware. If next week a new class of consumer-grade GPUs were released that contained some new sort of functionality (e.g. GPUs whose hardware is optimized for ray tracing instead of rasterization could be an option one day), then a release of OpenGL 5 would be appropriate.
                    I think we're in agreement here.

                    Originally posted by Milyardo View Post
                    At the same time, I'm sure the Khronos Group would work to maintain as much common functionality as possible between OpenGL 3, 4, and 5. The fact that the Khronos Group maintains new functionality in older classes of hardware (within the reasonable limits of the older hardware's capabilities) means to me that the Khronos Group has no interest in pushing the adoption of new generations of hardware.
                    My interpretation of the point releases to the old APIs was that they attempt to introduce software-level API improvements (better design, new extensions that can be implemented more efficiently, etc) without really pushing features. After all, if something requires hardware tessellation and you don't have a hardware tessellator, you can write any API you want but it won't magically make a hardware tessellator appear on your GPU. The point releases are gravy for developers, and maybe help them write software that makes better use of existing hardware, but that doesn't get to the heart of the problem.

                    The issue for FOSS graphics is that the development moves so glacially that even a minor point release with relatively few enhancements will take Mesa the better part of a year (or more) to support, just for the "works without errors" step. Then, if you want it to work fast, wait another year or two -- all without upgrading your hardware, remember, because the current generation of hardware isn't supported until after the next generation is out.

                    Originally posted by Milyardo View Post
                    As for slowing down the standardization process, if Khronos were to slow it, the only result would be the proliferation and adoption of vendor-specific extensions, much like during the lifespan of OpenGL 2.x. While vendor-specific extensions are one of the strengths of OpenGL, application vendors should not need to implement those extensions to provide reasonably competitive functionality.
                    But if the introduction of new hardware features were to slow down, I think the standardization process would have to slow down as well. They could continue to release point releases -- say, 4.2, 4.3, 4.4, etc. -- but it would all be within the same generation of hardware.

                    Of course, I don't expect the hardware features to stop coming until we hit the limits of physics and have to re-think computer hardware entirely. So I am not saying that it is a practically feasible solution to slow down the ratification process or the hardware march, because the vendors depend on that revenue, and HPC / extreme gaming customers -- two vocal groups who push for these features -- care more about features than whether the driver's open source or not.

                    So in "the FOSS world", the only real solution we have is that which I posited in my first post: slow down our use of the newer versions. This is completely within the grasp of the FOSS community, unlike the hardware march. For instance, it would be a large mistake for the guys working on Clutter/Mutter/GNOME 3.0 to start requiring OpenGL 3.x or 4.x to start a GNOME desktop. They intentionally keep the API requirements fairly modest so that the FOSS drivers can run the software. They set a precedent that other app developers should emulate. Which is why I will do the same in my own software.

                    I just hope that, at least for the slowly-growing commercial Linux gaming market, they will also recognize the importance of keeping their version requirements modest. You can still design a very good-looking game with engrossing gameplay without the very latest GL version. And, while running the moderate risk of over-engineering, it is also possible to write a 3d engine that is backend-independent, like the original Unreal engine (proof of concept: there's a DirectX 10 renderer for Unreal and Deus Ex, more than a decade after the core engine was conceived).

