Intel GLSL On-Disk Shader Cache Enabled By Default


    Phoronix: Intel GLSL On-Disk Shader Cache Enabled By Default

    New in Mesa 18.0 is the initial Intel shader cache support for archiving compiled GLSL shaders on disk to speed up load times on subsequent game launches, among other benefits. For the Mesa 18.0 release the functionality isn't enabled by default, but it will be for Mesa 18.1...


  • #2
    Are Intel still going to be making their own graphics in a few generations' time? Will AMD parts replace it all, or only their higher-end Iris parts?



    • #3
      Which one is more efficient?

      Steam's built-in cache support for OGL and Vulkan games, or the solutions from Mesa and the Nvidia blob?

      Or are they exactly the same?



      • #4
        Right now if you want to use the Intel i965 Mesa driver's GLSL on-disk shader cache you need to have the MESA_GLSL_CACHE_DISABLE=0 environment variable set.
        I very often see these very stupid variable names. Just set the default to 0 instead and name it MESA_GLSL_CACHE or even MESA_GLSL_CACHE_ENABLED to avoid the double negation. Or maybe I'm just too stupid or too sloppy of a reader to get used to double negations everywhere, but even if I am I would appreciate if people could consider stupid people like me when designing their software.

        EDIT: I just realized that I'm stupid for not realizing that bash defaults its variables to empty string which is equal to false, so it makes sense after all. I'll have to redirect my hate towards bash instead. Please ignore my stupidity.
        Last edited by johanb; 21 February 2018, 09:44 AM. Reason: Added edit section and fixed proper quote



        • #5
          Originally posted by Leopard:
          Which one is more efficient?

          Steam's built-in cache support for OGL and Vulkan games, or the solutions from Mesa and the Nvidia blob?

          Or are they exactly the same?
          Steam's cache stores the cache generated by Mesa and other drivers.



          • #6
            Originally posted by Leopard:
            Which one is more efficient?

            Steam's built-in cache support for OGL and Vulkan games, or the solutions from Mesa and the Nvidia blob?

            Or are they exactly the same?
            Once you have a cache, they should serve the same purpose. However, the one provided by Steam could, in theory, download a precompiled cache before you ever need the shaders. (Can't confirm that this actually happens as of now.) The one provided by the drivers is built as you go, and only helps once a shader is encountered for the second time and beyond.
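
            To illustrate that "built as you go" behaviour, here is a minimal sketch of an on-disk shader cache keyed by a hash of the GLSL source. It is not Mesa's actual implementation; compile_glsl(), hash_source() and the /tmp cache directory are just stand-ins. The first call for a given source misses and compiles, and every later call is served from the file written on the first run.

            Code:
              #include <stdio.h>
              #include <stdlib.h>
              #include <string.h>

              /* Stand-in for the real compiler back-end (the slow path). */
              static char *compile_glsl(const char *src)
              {
                 return strdup(src); /* pretend the source text is the "binary" */
              }

              /* Cache key: a simple hash of the GLSL source text. */
              static unsigned long hash_source(const char *src)
              {
                 unsigned long h = 5381;
                 while (*src)
                    h = h * 33 + (unsigned char)*src++;
                 return h;
              }

              static char *get_cached_or_compile(const char *src, const char *cache_dir)
              {
                 char path[512];
                 snprintf(path, sizeof(path), "%s/%lx.bin", cache_dir, hash_source(src));

                 FILE *f = fopen(path, "rb");
                 if (f) { /* hit: the second run and beyond skips the compile */
                    char buf[4096];
                    size_t n = fread(buf, 1, sizeof(buf) - 1, f);
                    fclose(f);
                    buf[n] = '\0';
                    return strdup(buf);
                 }

                 char *bin = compile_glsl(src); /* miss: the first run pays the cost */
                 f = fopen(path, "wb");
                 if (f) {
                    fwrite(bin, 1, strlen(bin), f);
                    fclose(f);
                 }
                 return bin;
              }

              int main(void)
              {
                 char *first  = get_cached_or_compile("void main() {}", "/tmp");
                 char *second = get_cached_or_compile("void main() {}", "/tmp"); /* from disk */
                 printf("%s\n%s\n", first, second);
                 free(first);
                 free(second);
                 return 0;
              }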



            • #7
              Originally posted by FireBurn:
              Are Intel still going to be making their own graphics in a few generations' time? Will AMD parts replace it all, or only their higher-end Iris parts?
              AMD does not replace anything. If you're referring to the Intel GPU and AMD GPU on one die announced recently, take a second look and notice that the Intel iGPU is still there. There's a market for laptops with two GPUs - one low-power integrated and a second high-performance dedicated - so they recognized it and squeezed everything together for easier integration by manufacturers, working together in this field against nVidia.



              • #8
                Originally posted by johanb:
                EDIT: I just realized that I'm stupid for not realizing that bash defaults its variables to empty string which is equal to false, so it makes sense after all. I'll have to redirect my hate towards bash instead. Please ignore my stupidity.
                Bash doesn't really matter here, I guess; Mesa probably just calls getenv(3) and atoi(3), and it makes little difference to the code logic either way.

                Edit:
                Code:
                  /* env_var_as_boolean(): parses a boolean environment variable.
                   * The includes below are what it needs to build standalone. */
                  #include <stdbool.h>
                  #include <stdlib.h>
                  #include <string.h>
                  #include <strings.h>

                  bool
                  env_var_as_boolean(const char *var_name, bool default_value)
                  {
                     const char *str = getenv(var_name);
                     if (str == NULL)
                        return default_value;
                 
                     if (strcmp(str, "1") == 0 ||
                         strcasecmp(str, "true") == 0 ||
                         strcasecmp(str, "yes") == 0) {
                        return true;
                     } else if (strcmp(str, "0") == 0 ||
                                strcasecmp(str, "false") == 0 ||
                                strcasecmp(str, "no") == 0) {
                        return false;
                     } else {
                        return default_value;
                     }
                  }



                • #9
                  Originally posted by johanb:

                  I very often see these very stupid variable names. Just set the default to 0 instead and name it MESA_GLSL_CACHE or even MESA_GLSL_CACHE_ENABLED to avoid the double negation. Or maybe I'm just too stupid or too sloppy of a reader to get used to double negations everywhere, but even if I am I would appreciate if people could consider stupid people like me when designing their software.
                  But that's the whole point. They want the cache on by default, with MESA_GLSL_CACHE_DISABLE=1 as the way to turn it off. The awkward double negative was always a temporary issue until the default was flipped. The alternative would be to change the variable name when the default changes, which would just break everyone's scripts and be annoying.
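
                  A tiny sketch of that (a hypothetical call site, not the literal Mesa source; the real driver goes through the env_var_as_boolean() helper quoted in #8, and this simplified reader treats any value other than "0" as disabling): only the compiled-in default flips between releases, while the MESA_GLSL_CACHE_DISABLE name that people already have in their scripts stays the same.

                  Code:
                    #include <stdbool.h>
                    #include <stdio.h>
                    #include <stdlib.h>
                    #include <string.h>

                    /* Simplified reading of MESA_GLSL_CACHE_DISABLE: unset means
                     * "use the built-in default", "0" means enabled, anything else
                     * means disabled. */
                    static bool cache_disabled(bool default_disabled)
                    {
                       const char *str = getenv("MESA_GLSL_CACHE_DISABLE");
                       if (str == NULL)
                          return default_disabled;
                       return strcmp(str, "0") != 0;
                    }

                    int main(void)
                    {
                       /* Mesa 18.0 behaviour: opt-in, the cache stays off unless
                        * the user sets MESA_GLSL_CACHE_DISABLE=0. */
                       bool off_180 = cache_disabled(true);

                       /* Mesa 18.1 behaviour: opt-out, the cache stays on unless
                        * the user sets MESA_GLSL_CACHE_DISABLE=1. */
                       bool off_181 = cache_disabled(false);

                       printf("18.0: cache %s, 18.1: cache %s\n",
                              off_180 ? "disabled" : "enabled",
                              off_181 ? "disabled" : "enabled");
                       return 0;
                    }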



                  • #10
                    Originally posted by FireBurn:
                    Are Intel still going to be making their own graphics in a few generations' time? Will AMD parts replace it all, or only their higher-end Iris parts?
                    AMD GPUs are much more expensive than normal Intel graphics even when embedded on the same die, and they are not really necessary for people who don't need a lot of 3D power.

                    Iris Pro, on the other hand, might very well be more expensive or not as cost-effective. AMD does have a lead in GPUs, after all.

