
Ivy Bridge Patches For OpenGL 4.0 In Mesa Updated


  • #11
    Better OpenGL support means that I can test whether my application works properly on more drivers/devices. And it doesn't really matter whether I get 60 fps or 5 fps when I can still see whether a scene is rendered properly. So yes, I will be glad to get full GL 4.5 support for IVB.
    At 5 fps you can basically check five frames per hour for pixel-perfectness. If you put the wasted 45 minutes of each hour into paid work on a fast machine instead, and repeated that for a while, how long would it take to earn enough money for one or two current machines?



    • #12
      It is irrelevant only if Linux is mainly a gaming platform.



      • #13
        Originally posted by dungeon View Post

        4.1 should be fine too, to at least match what the blob offered on hardware released 5+ years ago

        http://www.phoronix.com/scan.php?pag...n_hd6450&num=1

        But really I don't understand people wanting this for old and weak hardware, as the vast majority of it is too slow for GL4, especially for point releases 3 and up... I might only understand those with a 6850/6870, to some extent.
        It's a matter of having feature parity between the proprietary and the open-source driver. Besides that, the hardware, even if old, is still good enough to play some current games at low resolutions (on Windows).



        • #14
          Originally posted by dungeon View Post
          But really I don't understand people wanting this for old and weak hardware, as the vast majority of it is too slow for GL4, especially for point releases 3 and up... I might only understand those with a 6850/6870, to some extent.
          Future-proofing is the main reason I like to see these features get in. The current Ivy Bridge parts are likely to last for decades and still be perfectly capable of running a desktop and even light 3D programs (there is hardware from around 2008 that can claim the same even now, by the way).

          Even the best Ivy Bridge iGPU always sucked as far as gaming goes, so for games it is of course pointless, but who knows what I will need to run a good Linux desktop in ten years. We already have quite a few DEs that don't work properly without OpenGL 2.0, and that is not going to get better in the future.



          • #15
            Heh, heh, 15 years is way too much; nothing is future-proof about old hardware... who uses a GeForce 1 with 32 MB of VRAM to run a good Linux desktop today?

            Originally posted by starshipeleven View Post
            but who knows what I will need to run a good Linux desktop in ten years.
            Generally, in 10 years what is not common now will be very common... let's imagine a then-common but low 4K@60Hz resolution, which neither TeraScale nor that Ivy Bridge can drive in the first place today. And the first GCN was released 5 years ago... time really flies.

            Now I would really like to see a real-world example of even a lower-point GL4 app... maybe Trine 3, since it seems to use GL 4.1: how does it play on Ivy Bridge, even at the lowest settings, and at what resolution?
            Last edited by dungeon; 18 January 2017, 12:05 PM.



            • #16
              Originally posted by dungeon View Post
              Heh, heh, 15 years is way too much; nothing is future-proof about old hardware... who uses a GeForce 1 with 32 MB of VRAM to run a good Linux desktop today?
              Fact check (easy-mode bullshit detection system):
              0. I talked about 10 years, not 15.
              1. The GeForce 1 was released in 1999-2000, which is 17-18 years ago; most trolls on this forum, say debianxfce or Master5000, were not even born by then.
              2. The GeForce 5, or FX, came out on AGP in 2003 and on PCIe in 2004, and they support OpenGL 2.1. So, can you still run modern DEs on a card that is around 13 years old? Yes, you can.

              Generally, in 10 years what is not common now will be very common... let's imagine a then-common but low 4K@60Hz resolution, which neither TeraScale nor that Ivy Bridge can drive in the first place
              4K is still facing an uphill battle, and it has been around (in commercial products) since at least 2009. I'm ready to bet that laptops will still have the same totally crap HD-ready resolution even in this idyllic future of yours.



              • #17
                Originally posted by starshipeleven View Post
                Fact check (easy-mode bullshit detection system):
                0. I talked about 10 years, not 15.
                True, but you talked about Ivy Bridge, which is 5 years old; add those 10 and you get 15.

                Originally posted by starshipeleven View Post
                1. The GeForce 1 was released in 1999-2000, which is 17-18 years ago; most trolls on this forum, say debianxfce or Master5000, were not even born by then.
                I am not sure when those trolls were born. On the GeForce 1... yeah, then I can say GeForce 2 MX instead, a low-end fake which was actually the same as the GeForce 1, just with TV-out added.

                2. The GeForce 5, or FX, came out on AGP in 2003 and on PCIe in 2004, and they support OpenGL 2.1. So, can you still run modern DEs on a card that is around 13 years old? Yes, you can.
                The FX does not support GL2 in hardware; it was faked during the vendor extension wars, in DX8.x times... just like the RPi's vc4 GPU driver fakes GL2 although the hardware does not actually support it.

                Currently I am reading Michael's newest article about Vivante similarly faking an occlusion extension:

                https://www.phoronix.com/forums/foru...2-run-ioquake3

                And I have in mind the open-source zealot named duby229, who likes to talk about how the Mesa drivers are the most proper drivers.

                These vendors have pulled so much bullshit throughout history, then said and claimed something else, and Mesa has too... so I have no intention of defending any of them.
                Last edited by dungeon; 18 January 2017, 06:04 PM.



                • #18
                  Originally posted by dungeon View Post
                  True, but you talked about Ivy Bridge, which is 5 years old; add those 10 and you get 15.
                  I didn't specify 10 years from now.

                  The FX does not support GL2 in hardware; it was faked during the vendor extension wars, in DX8.x times... just like the RPi's vc4 GPU driver fakes GL2 although the hardware does not actually support it.
                  So does the GMA 3150, but that does not stop it from running a DE that needs OpenGL 2.0. Sure, on an Atom it's noticeable that some actions load the processor to 100%, but on an Ivy Bridge processor much less so.
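                  As an aside, for anyone who wants to check whether their context is real hardware acceleration or one of these software fallbacks: a minimal sketch that parses `glxinfo`-style output. The sample strings below are made up for illustration, and the list of software renderer names is an assumption based on Mesa's software rasterizers.

```python
import re

# Renderer names Mesa uses for its software rasterizers (assumed list).
SOFTWARE_RENDERERS = ("llvmpipe", "softpipe", "swrast")

def parse_gl_info(glxinfo_output):
    """Pull the renderer/version strings out of glxinfo output and
    flag whether the context looks like a software fallback."""
    info = {}
    for key, label in (("renderer", "OpenGL renderer string"),
                       ("version", "OpenGL version string")):
        m = re.search(r"^" + re.escape(label) + r": (.+)$",
                      glxinfo_output, re.MULTILINE)
        if m:
            info[key] = m.group(1).strip()
    info["software"] = any(name in info.get("renderer", "").lower()
                           for name in SOFTWARE_RENDERERS)
    return info

# Canned sample; on a real system, feed it the output of `glxinfo`.
sample = """\
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
OpenGL version string: 3.3 (Core Profile) Mesa 13.0.3"""
print(parse_gl_info(sample))
```

                  A renderer string containing "llvmpipe" would mean the DE is being drawn in software on the CPU, which is exactly the case being discussed above.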



                  • #19
                    Originally posted by starshipeleven View Post

                    So does the GMA 3150, but that does not stop it from running a DE that needs OpenGL 2.0.
                    Exactly, that one is faked too. You see, that is practicality over properness... the rule should be that features which are not fully supported by the hardware but are part of a particular GL version must not be advertised, and later versions cannot be advertised either.

                    They should let the user fake something if he wants to, but not make the fake the default.
                    Last edited by dungeon; 19 January 2017, 05:32 AM.



                    • #20
                      Originally posted by dungeon View Post
                      They should let the user fake something if he wants to, but not make the fake the default.
                      In general you are right. But these days, when compositors require OGL2/GLES2 as the minimum level, I think faking OGL2 or GLES2 on such old hardware is not such a big deal. It allows people to run modern software, after all.

