We're Now Up To OpenGL 4.1; Brings New Features

  • We're Now Up To OpenGL 4.1; Brings New Features

    Phoronix: We're Now Up To OpenGL 4.1; Brings New Features

    The Khronos Group came out in mid-March to release the OpenGL 4.0 specification along with OpenGL 3.3 (to bring as many OGL4 features back to OGL3 as possible for older hardware that doesn't support OpenGL 4.0), but today from SIGGRAPH in Los Angeles they have rolled out OpenGL 4.1. The Khronos Group has now put out six ratified versions of OpenGL in less than two years and the 4.1 release adds more graphical goodies to this industry standard. OpenGL 4.1 is also joined by version 4.10 of GLSL, the GL Shading Language...

    http://www.phoronix.com/vr.php?view=ODQ1MA

  • #2
    I normally don't mind when a standards body works efficiently and doesn't get tied up for years in politics, but this is just ridiculous. It's clear that the major players are trying to up the ante on getting people to buy newer hardware and encouraging software developers to write to these new APIs.

    With the tepid attention given to FOSS graphics drivers across the board, this is bad news for open source graphics. What will we see next week? GL 5.0?

    I wish the ARB would slow down for a couple years (maybe 5 - 8 years, because we really have to give all 3 of those open source guys a chance) and let FOSS catch up. But alas, new incompatible ASICs will be developed, and the GL API will continue to get hairier and hairier. Things were fine for a few years when 2.x was the standard, but as soon as 3.0 was released, new APIs have been chunked out almost as fast as the 50-man proprietary driver teams at ATI and Nvidia can support them.

    I love innovation as much as the next guy, but this whole thing smells fishy. I doubt the ARB is out to get open source, but they are definitely out to get more sales in current-generation GPUs, and they are definitely hoping that app developers will require these new APIs in order to drive sales. The only thing we can do as application developers is to hold off on requiring GL 3.0 or later in our applications, thereby telling the ARB where to shove it, and supporting people who run FOSS drivers. That goes for the proprietary games as well as open source stuff, not to mention the visualization crowd.

    Maybe someday, or in a parallel universe where FOSS gets more attention, the rate of development on Mesa would be able to keep up the pace with the ARB. But I don't think that's remotely close to being here or now.



    • #3
      Yes! We should stop production of new GPUs for 8 years! So cheapskates can keep using their GPUs for another 8 years without having to starve to get the cash to buy new ones.



      • #4
        I don't think I understand how you made the logical jump that recent releases from the Khronos Group are done to influence the adoption of new hardware.

        Right now OpenGL versions are classed based on the capabilities of hardware. If next week a new class of consumer-grade GPUs were released that contained some new sort of functionality (e.g. GPUs whose hardware is optimized for ray-tracing instead of rasterization could be an option one day), then a release of OpenGL 5 would be appropriate.

        At the same time, I'm sure the Khronos Group would work to maintain as much common functionality as possible between OpenGL 3, 4, and 5. The fact that the Khronos Group makes new functionality available on older classes of hardware (within the reasonable limits of the older hardware's capabilities) means to me that the Khronos Group has no interest in pushing the adoption of new generations of hardware.

        As for slowing down the standardization process: if Khronos were to slow it down, the only result would be the proliferation and adoption of vendor-specific extensions, much like during the lifespan of OpenGL 2.x. While vendor-specific extensions are one of the strengths of OpenGL, application vendors should not need to implement those extensions to provide reasonably competitive functionality.
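        Since versions are classed by hardware capability, an application typically probes the reported version at runtime and picks a code path accordingly. Here is a minimal sketch of that idea in plain C; the version strings are made-up examples of what `glGetString(GL_VERSION)` might return, and no real GL context is involved:

        ```c
        #include <stdio.h>

        /* Parse a GL_VERSION-style string ("major.minor[.release] vendor info")
         * into numeric major/minor so an app can gate features by hardware class. */
        static void parse_gl_version(const char *s, int *major, int *minor)
        {
            *major = 0;
            *minor = 0;
            sscanf(s, "%d.%d", major, minor);
        }

        int main(void)
        {
            int major, minor;
            parse_gl_version("4.1.0 NVIDIA 256.44", &major, &minor); /* hypothetical driver string */
            if (major > 4 || (major == 4 && minor >= 1))
                printf("GL 4.1 path (tessellation-class hardware)\n");
            else
                printf("fallback path for older hardware\n");
            return 0;
        }
        ```

        A check like this is how an application can require only an old baseline version while opportunistically enabling newer features where the driver reports them.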



        • #5
          Internet Explorer : Web standards :: Mesa : OpenGL



          • #6
            Who cares if "we're up to 4.1" if no one uses the stuff anyway.



            • #7
              Originally posted by RealNC View Post
              Who cares if "we're up to 4.1" if no one uses the stuff anyway.
              You're right. Who cares that HTML5 and <video> were designed when nobody used them anyway? Hell, why the heck did Linus write Linux when nobody was already using it? I'm not sure why they invented spreadsheets when clearly nobody at the time was using a spreadsheet. For that matter, what the hell was Henry Ford thinking when he invented automobiles? There weren't any drivers yet! And I can't believe that Edison wasted everyone's time by discovering electricity many years before we had electric outlets and appliances to use it with.

              Logic seems to be in shorter supply on Phoronix forum posts these days.



              • #8
                Originally posted by elanthis View Post
                You're right. Who cares that HTML5 and <video> were designed when nobody used them anyway? Hell, why the heck did Linus write Linux when nobody was already using it? I'm not sure why they invented spreadsheets when clearly nobody at the time was using a spreadsheet. For that matter, what the hell was Henry Ford thinking when he invented automobiles? There weren't any drivers yet! And I can't believe that Edison wasted everyone's time by discovering electricity many years before we had electric outlets and appliances to use it with.

                Logic seems to be in shorter supply on Phoronix forum posts these days.
                I understand your point in this rant, but you may want to double check a few of your scenarios.

                Henry Ford didn't invent the automobile. He implemented an assembly line manufacturing process which helped make them more affordable.

                Thomas Edison didn't discover electricity, but he did invent the light bulb.

                In the case of OpenGL 4.x, it exists to bring OpenGL's capabilities up to the same level as DirectX on Windows. People are already using the features that OpenGL 4.x introduces, but they haven't been able to do it in OpenGL until now (unless they used non-standard extensions).



                • #9
                  I'm happy to see all of these OpenGL enhancements. Maybe the OpenGL vs. DirectX war will heat up again. I'd love to see OpenGL and similar graphics slowly marginalize DirectX.



                  • #10
                    Originally posted by Milyardo View Post
                    I don't think I understand how you made the logical jump that recent releases from the Khronos Group are done to influence the adoption of new hardware.
                    (1) Major ASIC designers (ATI, Nvidia, Intel) want to increase sales. (Axiom)
                    (2) When customers purchase new hardware, it increases sales. (Axiom)
                    (3) Major ASIC designers have a significant role in driving the development and ratification of new OpenGL standards. (Source)
                    (4) When a new major version number of OpenGL is released, it is indicative of a hardware compatibility gap: the new major version excludes one or more previous generations of hardware by all the major ASIC designers. That is: if all the extensions are to be implemented in hardware, there exists some ASIC A released prior to ASIC B where OpenGL version N works with ASIC B in fully hardware-accelerated mode, but ASIC A can only provide full hardware-accelerated support for OpenGL version N-1.x (Source)
                    (5) If an application developer uses OpenGL version N in their application, then any hardware which only supports version N-1, or N-2, etc. of OpenGL can not run the application (or can only do so in software mode, which is not suitable for real-time graphics). From (4)
                    (6) Successful, useful, or entertaining applications are demanded by users. (Axiom)
                    (7) If a particular successful, useful, or entertaining application P requires OpenGL version N, then any hardware which only supports version N-1, or N-2, etc. of OpenGL can not run the application (or can only do so in software mode, which is not suitable for real-time graphics). From (5) and (6)
                    (8) P is demanded by users. From (6) and (7)
                    (9) Hardware which supports OpenGL version N is required for P to run as intended. From (5) - (8)
                    (10) Hardware which supports OpenGL version N is demanded by users who wish to run P. From (5) - (9)
                    (11) For each customer who has demand for P, there is a nonzero probability P(x) that said customer will purchase hardware to run P. From probability theory and (10)
                    (12) P(x) increases for each instantiation of (10) with different P, if the demand applies to said customer. (This is hard to prove except empirically, but the qualification "if the demand applies to said customer" makes it nearly irrefutable via intuition: if someone wants to run numerous apps requiring OpenGL 4.0, they will prioritize their purchase of supported hardware more than they would if they wanted to run fewer or zero apps requiring OpenGL 4.0.)
                    (13) The result in (12) generates sales for major ASIC designers. Conclusion from (1) - (3) and (12)

                    Ratifying new OpenGL versions is the first software step in feeding this cycle, although it is usually also driven by hardware features.

                    Originally posted by Milyardo View Post
                    Right now OpenGL versions are classed based on the capabilities of hardware. If next week a new class of consumer-grade GPUs were released that contained some new sort of functionality (e.g. GPUs whose hardware is optimized for ray-tracing instead of rasterization could be an option one day), then a release of OpenGL 5 would be appropriate.
                    I think we're in agreement here.

                    Originally posted by Milyardo View Post
                    At the same time, I'm sure the Khronos Group would work to maintain as much common functionality as possible between OpenGL 3, 4, and 5. The fact that the Khronos Group makes new functionality available on older classes of hardware (within the reasonable limits of the older hardware's capabilities) means to me that the Khronos Group has no interest in pushing the adoption of new generations of hardware.
                    My interpretation of the point releases to the old APIs was that they attempt to introduce software-level API improvements (better design, new extensions that can be implemented more efficiently, etc) without really pushing features. After all, if something requires hardware tessellation and you don't have a hardware tessellator, you can write any API you want but it won't magically make a hardware tessellator appear on your GPU. The point releases are gravy for developers, and maybe help them write software that makes better use of existing hardware, but that doesn't get to the heart of the problem.

                    The issue for FOSS graphics is that the development moves so glacially that even a minor point release with relatively few enhancements will take Mesa the better part of a year (or more) to support, just for the "works without errors" step. Then, if you want it to work fast, wait another year or two -- all without upgrading your hardware, remember, because the current generation of hardware isn't supported until after the next generation is out.

                    Originally posted by Milyardo View Post
                    As for slowing down the standardization process: if Khronos were to slow it down, the only result would be the proliferation and adoption of vendor-specific extensions, much like during the lifespan of OpenGL 2.x. While vendor-specific extensions are one of the strengths of OpenGL, application vendors should not need to implement those extensions to provide reasonably competitive functionality.
                    But if new hardware features were to slow down, I think the standardization process would have to slow down as well. They could continue to make point releases -- say, 4.2, 4.3, 4.4, etc. -- but it would all be within the same generation of hardware.

                    Of course, I don't expect the hardware features to stop coming until we hit the limits of physics and have to re-think computer hardware entirely. So I am not saying that it is a practically feasible solution to slow down the ratification process or the hardware march, because the vendors depend on that revenue, and HPC / extreme gaming customers -- two vocal groups who push for these features -- care more about features than whether the driver's open source or not.

                    So in "the FOSS world", the only real solution we have is that which I posited in my first post: slow down our use of the newer versions. This is completely within the grasp of the FOSS community, unlike the hardware march. For instance, it would be a large mistake for the guys working on Clutter/Mutter/GNOME 3.0 to start requiring OpenGL 3.x or 4.x to start a GNOME desktop. They intentionally keep the API requirements fairly modest so that the FOSS drivers can run the software. They set a precedent that other app developers should emulate. Which is why I will do the same in my own software.

                    I just hope that, at least for the slowly-growing commercial Linux gaming market, they will also recognize the importance of keeping their version requirements modest. You can still design a very good-looking game with engrossing gameplay without the very latest GL version. And, while running the moderate risk of over-engineering, it is also possible to write a 3d engine that is backend-independent, like the original Unreal engine (proof of concept: there's a DirectX 10 renderer for Unreal and Deus Ex, more than a decade after the core engine was conceived).



                    • #11
                      I agree that apps have to keep a moderate adoption rate, at least until Mesa does some catching up, but Khronos can't just slow down, because that would effectively kill OpenGL.

                      Understand that most of those 3D devs want an API able to squeeze out the last bits of functionality that the high-end cards can provide, and for that they need an API that follows that rhythm...

                      ...even if they never use those new features, or even if the software won't necessarily run on that hardware in the near future.

                      Everything here is as simple as: DX (put your version here) has functions X, Y, Z and OpenGL doesn't, hence OpenGL is obsolete, not worth it, or not future-proof enough compared to DX, hence OpenGL is dead (well, Linux and Mac will always keep it, but you get the idea).

                      Now I want to clear something up, since someone was mad that something requiring GL version X can't run on GL version X-y. Dude, OpenGL is not DirectX. In DirectX you have to recode for every release (or at best partially recode), because DX is some sort of weird copy-and-paste library that gets rewritten every now and then.

                      Remember, OpenGL is extension-based; that is, GL doesn't really have releases, it has revisions: some amount X of new extensions is packed up as an incremental upgrade named, for convenience, OpenGL X.y.

                      So in most cases, depending on the vendor's libGL implementation, I can write software against GL 2.1 and it will run perfectly fine on GL 4.1 hardware and software, or the other way around, just by being careful to make those extensions optional so I can adjust the visual quality for every generation (example: the Unigine demos).

                      So no, tessellation in OpenGL is not a core function that needs a massive rewrite; it's only an optional extension that can be used if the hardware is present and your libGL library has the right revision containing that specific function. That applies to almost anything in OpenGL, with the exception of pbuffers, which should be avoided as much as possible because those are slooooow.

                      The only thing you may want to check out with every "release" of OpenGL is the GLSL changes, because some of those could make your life easier and your software more efficient, but that's about it.

                      So does it matter if GL 5, 6, 7 goes public next week? No. Should Mesa bother to reach OpenGL X? Not for the current generation of hardware, though it would be nice for those who invested in the latest GPUs, because those new extensions are designed to optimize specific workloads on that new hardware. Is OpenGL 2.1 enough for now? Pretty much, yes. I would love 3.4 because my hardware has those hardware bits present, so I could get a bit more performance or visual quality, but yes, 2.1 is enough for now. Should I as a dev think about using something more than GL 2.1? Yes, but be careful when you use the extensions; with good dev skills and some care your app can scale from DX9/GL2.1-class hardware to DX11/GL4.1-class hardware without any major issue (well, except ATI's crappy GL lib).
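                      Making those extensions optional boils down to a runtime check against the driver's extension list before choosing a code path. A small sketch in plain C; the extension string here is a made-up stand-in for what `glGetString(GL_EXTENSIONS)` would report on GL2-era hardware, so no GL context is needed:

                      ```c
                      #include <stdbool.h>
                      #include <stdio.h>
                      #include <string.h>

                      /* Return true if `name` appears as a whole space-separated token in
                       * `extlist` (avoids matching "GL_ARB_foo" inside "GL_ARB_foobar"). */
                      static bool has_extension(const char *extlist, const char *name)
                      {
                          size_t len = strlen(name);
                          const char *p = extlist;
                          while ((p = strstr(p, name)) != NULL) {
                              bool start_ok = (p == extlist) || (p[-1] == ' ');
                              bool end_ok = (p[len] == ' ') || (p[len] == '\0');
                              if (start_ok && end_ok)
                                  return true;
                              p += len;
                          }
                          return false;
                      }

                      int main(void)
                      {
                          /* Hypothetical extension string from a GL2-era driver. */
                          const char *exts = "GL_ARB_vertex_shader GL_ARB_fragment_shader";
                          if (has_extension(exts, "GL_ARB_tessellation_shader"))
                              printf("tessellation path\n");
                          else
                              printf("non-tessellated fallback path\n");
                          return 0;
                      }
                      ```

                      An app built this way simply skips the tessellation path on older drivers instead of refusing to start.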



                      • #12
                        Originally posted by allquixotic View Post
                        [...]So in "the FOSS world", the only real solution we have is [...]
                        No, there is another solution ---> drop OpenGL!

                        And write a ray-tracing game engine based on OpenCL!

                        And use the CPU/GPU for the exact same OpenCL code!


                        That kills this fucking DirectX/OpenGL shit ;-)

                        Blender is going that way: a nice 64-core ray-tracing engine ;-)



                        • #13
                          Argh, the edit time limit...

                          There's an exception to what I wrote:

                          OpenGL 1.x doesn't have shaders, because that hardware didn't have shaders.
                          OpenGL 2.0 has shader support, but it's very early. OK, you can write a shader for 2.0/2.1/3.x/4.x so you can support DX8-class hardware too, but I'm not sure that's worth it these days.

                          Theoretically (especially since Gallium) you can use GL 2.1 on DX8-or-earlier-class hardware, but the shaders have to fall back to CPU code (Gallium's LLVM path), so don't expect those cards to scream FPS; still, it can give a bit more life to that older hardware (this is already implemented in the r300 Gallium driver for laptop X1xxx cards that don't have shaders at all in hardware).

                          In fact it's theoretically possible to use 4.1 shaders on that hardware too (when the Gallium LLVM path gets 4.1 support, of course), and you can do the same for OpenGL in general (the driver just falls back to LLVM for those extensions not present in the hardware). So yes, OpenGL is that flexible.



                          • #14
                            Originally posted by Qaridarium View Post
                            No, there is another solution ---> drop OpenGL!

                            And write a ray-tracing game engine based on OpenCL!

                            And use the CPU/GPU for the exact same OpenCL code!


                            That kills this fucking DirectX/OpenGL shit ;-)

                            Blender is going that way: a nice 64-core ray-tracing engine ;-)
                            Well, nobody will ever make a game engine in OpenCL to begin with, and OpenCL is not meant for that job either (for now; it's too much work to be realistic). That said, OpenCL can be used for ray-tracing and many other things where it's very efficient, like you said.

                            So you can't get rid of OpenGL that easily. I wouldn't be bothered if DirectX ceased to exist, though.

                            Just as OpenGL is very efficient at certain jobs, OpenCL is very strong in other areas which don't necessarily collide with OpenGL, and vice versa. So it's not like OpenCL was meant as a replacement for OpenGL; it was meant more as a programming language to take advantage of the massive FPU/parallel abilities of current GPUs.



                            • #15
                              It sucks that Mesa will be another version behind, but keep in mind that OpenGL 4.x will only run on DirectX 11 hardware. So it certainly won't be required anytime soon as even most Windows games still support DirectX 9.

