OpenGL 4.0 Patches For Intel Ivy Bridge Revised

  • #11
    Though I don't agree with cRaZy-bisCuiT, I do understand where he's coming from. Ivy Bridge's GPU wasn't very powerful, so there are very few GL 4.0 titles that would be considered playable on it (which doesn't mean GL 3.x or indie titles aren't). However, I see no harm in adding the support. Even though I don't use the hardware myself, there's a nice cozy feeling knowing something is one step closer to feature completion. Considering Intel GPUs aren't meant for gaming but are perfectly acceptable for generic desktop use, performance optimizations (in my opinion) should be a lower priority than feature completion.

    As for those who complain about Haswell and Ivy Bridge not getting features when Broadwell, Skylake, etc. do, I understand that too. Haswell isn't that old, and it's discouraging to see it already going ignored when there is still plenty that needs to get done. Intel has a knack for sticking to a schedule, and if something is incomplete by the time a new product is released, it is often left behind. But their Linux team doesn't seem to strictly follow what the rest of the company does, which is nice.
    Last edited by schmidtbag; 14 February 2017, 02:43 PM.

    Comment


    • #12
      It's also fun sometimes to try to launch a recent AAA game on an old IGP; while Haswell doesn't support DirectX 11 well enough to start Deus Ex: Mankind Divided on Windows, I can happily start it on Linux and actually try to walk through the slideshow at low details @ 1280x720!

      Comment


      • #13
        As someone still on a Bay Trail* laptop (and not upgrading until I get something ARM-based to replace it with): I greatly appreciate any work still done for it, whether OpenGL or Vulkan.


        *Bay Trail's GPU is based on Ivy Bridge's graphics architecture, for those wondering.

        Comment


        • #14
          Does Ivy Bridge support OpenGL ES 3.0 and/or WebGL 2.0?

          EDIT: Nevermind, answered my own question: http://www.phoronix.com/scan.php?pag...tem&px=MTMwMDg
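
          For anyone who'd rather probe their own machine than trust a chart, here's a minimal sketch (assuming GLFW 3.x and a desktop Mesa driver; the real ES support is exposed through EGL, so this only checks the GL_ARB_ES3_compatibility extension as a rough proxy on a desktop context):

          /* Minimal sketch: probe the desktop GL version and ES 3.0 compatibility.
           * Assumes GLFW 3.x; build with: cc es3probe.c -lglfw -lGL */
          #include <stdio.h>
          #include <GLFW/glfw3.h>

          int main(void)
          {
              if (!glfwInit())
                  return 1;

              /* Hidden window just to get a current GL context for querying. */
              glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
              GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
              if (!win) {
                  glfwTerminate();
                  return 1;
              }
              glfwMakeContextCurrent(win);

              printf("GL_VERSION : %s\n", glGetString(GL_VERSION));
              printf("ES3 compat : %s\n",
                     glfwExtensionSupported("GL_ARB_ES3_compatibility") ? "yes" : "no");

              glfwDestroyWindow(win);
              glfwTerminate();
              return 0;
          }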
          Last edited by DanL; 14 February 2017, 12:39 PM.

          Comment


          • #15
            Originally posted by cRaZy-bisCuiT View Post
            Still I wonder why so much work is put in old Intel GPUs. I don't think they could ever play any recent Games. The only exception might be Intel Iris Pro.
            I'm wondering why people buy anything slower than a GTX 1080. I mean, come on, if you're so poor shouldn't you just hang yourself? It's GTX 1080 or GTFO. I understand a 4-way SLI Titan X might be out of reach for many, and there are laptops that only have room for one GPU. But less than a GTX 1080, jeez..

            Comment


            • #16
              Originally posted by kirgahn View Post

              Hey, I'm on an Ivy Bridge notebook, and those patches enable running Unity games without downgrading to OpenGL 2.1 (--force-opengl or something). I can assure you that I do run games on this notebook, even if it's generally indie games with low system requirements. If I could run Motorsport Manager with the 3D view at minimum details, that would be really appreciated.
              Not to mention that Unity dropped GL 2.1 in version 5.5, and many games have already upgraded to it.

              Comment


              • #17
                Originally posted by cRaZy-bisCuiT View Post
                I wonder why so much work is put in old Intel GPUs. I don't think they could ever play any recent Games. The only exception might be Intel Iris Pro.
                I don't know the business reasons why Intel funded this, but I'll tell you why I like it as a developer. When I'm maintaining an OpenGL application for work, the OpenGL version we target is dictated by a reasonable lowest common denominator across FreeBSD, Linux, MacOS X and Windows. Until last year that was 2.1; once Mesa supported it, we upgraded to 3.3 Core Profile. We can't move to 4.x until it's supported by Mesa on common hardware, and older Intel GPUs definitely count here. Improvements like these push the lowest common denominator to a higher OpenGL version, and for that I'm very happy.
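
                To make that concrete, here is a rough sketch of my own of what "targeting 3.3 Core Profile as the baseline" means in practice (assuming GLFW, though any windowing layer works the same way): request exactly that version and profile, and treat a failed context creation as the machine being below our lowest common denominator.

                /* Sketch: request our lowest common denominator, a 3.3 Core Profile
                 * context. Assumes GLFW 3.x; build with: cc gl33.c -lglfw -lGL */
                #include <stdio.h>
                #include <GLFW/glfw3.h>

                int main(void)
                {
                    if (!glfwInit())
                        return 1;

                    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
                    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
                    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
                    /* Required on MacOS X to get anything above 2.1. */
                    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);

                    GLFWwindow *win = glfwCreateWindow(640, 480, "gl33", NULL, NULL);
                    if (!win) {
                        /* The driver can't provide 3.3 core: below our baseline. */
                        fprintf(stderr, "OpenGL 3.3 Core Profile not available\n");
                        glfwTerminate();
                        return 1;
                    }
                    glfwMakeContextCurrent(win);

                    /* The driver may hand back something newer and compatible. */
                    printf("Context: %s\n", glGetString(GL_VERSION));

                    glfwDestroyWindow(win);
                    glfwTerminate();
                    return 0;
                }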

                The next sticking point is going to be MacOS X, since they've been stuck at 4.1 for years now. We might have to drop it as a supported platform at that point if they don't pull their finger out.

                Comment


                  • #19
                    Originally posted by mattst88 View Post
                    It amazes me that no matter what the news, there is someone who thinks it's stupid. [...] People think it's a waste of time.
                    Originally posted by starshipeleven View Post
                    [...] someone will complain.
                    [...] cRaZy-bisCuiT is posting bullshit, these GPUs are still very damn common and still in use and will likely remain in use for quite a bit of time. If we were talking of VIA's GPUs on the other hand...
                    Please read more carefully before reacting so strongly. cRaZy-bisCuiT didn't say that it was stupid, nor that it was a waste of time, and he didn't complain. Look:

                    Originally posted by cRaZy-bisCuiT View Post
                    Still I wonder why so much work is put in old Intel GPUs. I don't think they could ever play any recent Games. The only exception might be Intel Iris Pro.
                    It wasn't bullshit; it's legitimate to question why a company would fund work on previous-generation hardware. After all, we often see the opposite: hardware being forgotten and becoming obsolete sooner due to lack of software support.

                    I was wondering the same thing [1], and the answers were very informative, so I'm happy that someone asked this question.

                    [1] As was a friend of mine, when I told them that there was work in progress on GL 4.0 for Ivy Bridge.

                    Comment


                    • #20
                      Cool, Ivy Bridge is what I have on this very PC right now. It's not what's driving the display, but still, this is nice.

                      Comment
