Igalia Posts Intel vertex_attrib_64bit Mesa Driver Patches, Close To OpenGL 4.1+


  • #11
    Originally posted by Mystro256 View Post
    Is there a way to override it without recompiling? Perhaps an environment variable or something?


    MESA_GL_VERSION_OVERRIDE
    MESA_GLSL_VERSION_OVERRIDE
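    For instance, a minimal sketch (the "./game" binary here is hypothetical) that sets both overrides and then launches a program; the override only changes what is reported, the driver still has to actually handle whatever the application ends up calling:

        /* Set the Mesa override variables, then exec the target program. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>

        int main(void)
        {
            setenv("MESA_GL_VERSION_OVERRIDE", "4.2", 1);   /* reported GL version   */
            setenv("MESA_GLSL_VERSION_OVERRIDE", "420", 1); /* reported GLSL version */
            execl("./game", "./game", (char *)NULL);        /* hypothetical binary   */
            perror("execl");                                /* only reached on error */
            return 1;
        }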


    • #12
      Originally posted by davidbepo View Post

      True, but some games check the version rather than extensions, so an override is necessary. Also, the missing OpenGL 4.3 extension is hardly needed anyway.
      You can always fake the version, though. Anyway, since OpenGL 3 each extension should be queried and loaded at runtime to make sure it actually exists, and querying by profile version is deprecated. Not everybody got the memo, though, and it's easier to say a game needs "OpenGL 4.2" than to list the 40 extensions it actually requires.
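      A rough sketch of that kind of runtime check (assuming a current GL 3.0+ context already exists and libepoxy resolves the entry points):

          /* Enumerate the extension strings the driver actually exposes
           * and look for one specific extension by name. */
          #include <stdbool.h>
          #include <string.h>
          #include <epoxy/gl.h>

          static bool has_extension(const char *name)
          {
              GLint count = 0;
              glGetIntegerv(GL_NUM_EXTENSIONS, &count);
              for (GLint i = 0; i < count; i++) {
                  const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, i);
                  if (ext && strcmp(ext, name) == 0)
                      return true;
              }
              return false;
          }

          /* e.g. has_extension("GL_ARB_vertex_attrib_64bit") before using fp64 attributes */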



      • #13
        Shit! I have a Radeon 6770 (Barts) without fp64, and I have an Intel Haswell without fp64 as well... ((
        Can Mesa expose the higher OpenGL version without working fp64 (automatically, without an OVERRIDE)?
        Something like FAKE_FP64_SUPPORT=1?



        • #14
          There's a way of overriding individual extensions, but I can't remember how.

          I'm still waiting on compute shaders on my Skylake laptop too; I got them on my Tonga discrete card first.



          • #15
            Originally posted by davidbepo View Post

            True, but some games check the version rather than extensions, so an override is necessary. Also, the missing OpenGL 4.3 extension is hardly needed anyway.
            Yeah, let's invest dozens of man-hours into writing code that is not actually executed, just so a byte in a string somewhere changes its value.

            Bottom line is, if an application asks for 4.2 even though it doesn't require it, that's a bug. And we've always had to work around bugs, e.g. when a game declares GLSL extensions mid-shader, a DRI-conf entry is needed to hack around it. This is no different; you just have to work around the buggy app.
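            For reference, that kind of DRI-conf workaround is just a per-application entry in ~/.drirc along these lines (the application name and executable below are made up):

                <driconf>
                  <device>
                    <application name="Some Game" executable="somegame">
                      <!-- accept #extension directives that appear mid-shader -->
                      <option name="allow_glsl_extension_directive_midshader" value="true"/>
                    </application>
                  </device>
                </driconf>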

            If a game said on its package it needs 8GB RAM to run, but everyone on the internet confirms that it really needs less than 2GB, you wouldn't go into a store and buy more RAM just so the game sees a higher value; you'd instead tell it a fake RAM size.



            • #16
              Originally posted by Ancurio View Post

              Yeah, let's invest dozens of man-hours into writing code that is not actually executed, just so a byte in a string somewhere changes its value.

              Bottom line is, if an application asks for 4.2 even though it doesn't require it, that's a bug. And we've always had to work around bugs, e.g. when a game declares GLSL extensions mid-shader, a DRI-conf entry is needed to hack around it. This is no different; you just have to work around the buggy app.

              If a game said on its package it needs 8GB RAM to run, but everyone on the internet confirms that it really needs less than 2GB, you wouldn't go into a store and buy more RAM just so the game sees a higher value; you'd instead tell it a fake RAM size.
              Yes, but each version of OpenGL brings a lot of functions; if even one is missing, the game cannot start or will crash...

              So basically there are two choices:

              - the developer names every function used and the system checks them one by one (hundreds? thousands?)
              or
              - you put version numbers on the requirements and reject any system with an inferior version

              I don't really like the first option...
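              For what it's worth, the first option is less tedious than it sounds when a loader library does the bookkeeping; a rough sketch with libepoxy (the extension list below is only an illustrative guess, not any real game's requirements):

                  /* Check a hand-written list of required extensions one by one
                   * (needs a current GL context). */
                  #include <stdio.h>
                  #include <epoxy/gl.h>

                  static const char *required_exts[] = {
                      "GL_ARB_vertex_attrib_64bit",  /* illustrative picks only */
                      "GL_ARB_gpu_shader_fp64",
                  };

                  static int check_required_extensions(void)
                  {
                      int ok = 1;
                      for (size_t i = 0; i < sizeof(required_exts) / sizeof(required_exts[0]); i++) {
                          if (!epoxy_has_gl_extension(required_exts[i])) {
                              fprintf(stderr, "missing %s\n", required_exts[i]);
                              ok = 0;
                          }
                      }
                      return ok;
                  }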



              • #17
                Originally posted by FireBurn View Post
                There's a way of overriding individual extensions, but I can't remember how.

                I'm still waiting on compute shaders on my Skylake laptop too; I got them on my Tonga discrete card first.
                I posted a full example of forcing it on a while back, but I never got it to enable a higher GL version.



                • #18
                  Thanks!!



                  • #19
                    Originally posted by Passso View Post
                    So basically there are two choices:

                    - the developer names every function used and the system checks them one by one (hundreds? thousands?)
                    or
                    - you put version numbers on the requirements and reject any system with an inferior version

                    I don't really like the first option...
                    There's a third option, which AFAIK is the most commonly used: find the highest GL version that covers most of the functions you need, "require" that version, and then check individually for the (typically small) number of functions required on top of that version.
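                    A rough illustration of that hybrid approach with libepoxy (the 3.3 baseline and the extra extension are just placeholders):

                        /* Require a baseline GL version, then probe only the extras on top of it
                         * (needs a current GL context). */
                        #include <stdbool.h>
                        #include <epoxy/gl.h>

                        static bool renderer_is_usable(void)
                        {
                            /* epoxy_gl_version() returns e.g. 33 for OpenGL 3.3 */
                            if (epoxy_gl_version() < 33)
                                return false;

                            /* only the handful of features beyond the baseline is checked individually */
                            return epoxy_has_gl_extension("GL_ARB_vertex_attrib_64bit");
                        }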


                    • #20
                      It's not used that way because you see the same "required" specs for lots of ports done by the same people. Since it often runs a bit slow on Intel, nobody seems to test there. At least for Broadwell/Skylake this should be no problem, apart from speed, within a few months. But it is a bit disappointing that Haswell already counts as too outdated these days; Vulkan is in better shape for Broadwell+ as well.

