A Lot Of The OpenGL Shader Cache Code Has Landed In Mesa

  • phoronix
    Administrator
    • Jan 2007
    • 67373

    Phoronix: A Lot Of The OpenGL Shader Cache Code Has Landed In Mesa

    Timothy Arceri, who is now working for Valve (on the open-source AMD driver stack after leaving Collabora), has landed significant portions of his work built upon others for providing an on-disk shader cache within Mesa...

  • theghost
    Senior Member
    • Sep 2013
    • 346

    #2
    Finally. So in Mesa 17.1 we might have a working shader cache.

    • uid313
      Senior Member
      • Dec 2011
      • 6921

      #3
      Does Microsoft Windows use a shader cache?

      Will the cache be located in /tmp/ or ~/.cache/?
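
      For the question above: disk caches like this usually follow the XDG base-directory convention rather than /tmp. A minimal Python sketch of that lookup plus a hash-based cache key, hedged heavily: the MESA_GLSL_CACHE_DIR override name and the "mesa" subdirectory are assumptions about how the driver might do it, not a description of the actual Mesa code.

      ```python
      import hashlib
      import os

      def shader_cache_dir(env=None):
          """Pick a cache directory the XDG way: an explicit override wins,
          then $XDG_CACHE_HOME, then ~/.cache as the last resort.
          MESA_GLSL_CACHE_DIR is an assumed override knob for illustration."""
          env = os.environ if env is None else env
          override = env.get("MESA_GLSL_CACHE_DIR")
          if override:
              return override
          xdg = env.get("XDG_CACHE_HOME")
          base = xdg if xdg else os.path.join(os.path.expanduser("~"), ".cache")
          return os.path.join(base, "mesa")

      def cache_key(shader_source, driver_id):
          """Key the cache on a hash of the shader source plus a driver
          identifier, so a driver upgrade never serves stale binaries."""
          h = hashlib.sha1()
          h.update(driver_id.encode())
          h.update(shader_source.encode())
          return h.hexdigest()
      ```

      The point of folding a driver identifier into the key is that a cached binary is only meaningful for the exact compiler that produced it.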

      • AsuMagic
        Senior Member
        • Apr 2016
        • 573

        #4
        Originally posted by uid313 View Post
        Does Microsoft Windows use a shader cache?

        Will the cache be located in /tmp/ or ~/.cache/?
        The NVIDIA driver does it independently; I remember seeing a directory mentioning a shader cache (plus it's an option in the driver).
        Dunno about AMD, but I'd assume they have one too. I doubt this is done OS-side, though; rather driver-side.

        • Nightbane112
          Junior Member
          • Aug 2015
          • 9

          #5
          Originally posted by AsuMagic View Post

          The NVIDIA driver does it independently; I remember seeing a directory mentioning a shader cache (plus it's an option in the driver).
          Can confirm on my laptop: I have a ~/.nv directory containing ComputeCache and GLCache subdirectories, created when using the NVIDIA driver.

          • AsuMagic
            Senior Member
            • Apr 2016
            • 573

            #6
            Originally posted by Nightbane112 View Post

            Can confirm on my laptop: I have a ~/.nv directory containing ComputeCache and GLCache subdirectories, created when using the NVIDIA driver.
            I was referring to Windows, but yeah, their proprietary driver supports it on Linux too.

            • Azrael5
              Senior Member
              • Jul 2012
              • 1954

              #7
              It should be useful for Chrome/Chromium browsers.

              • L_A_G
                Senior Member
                • Oct 2015
                • 1612

                #8
                Originally posted by AsuMagic View Post
                I was referring to Windows, but yeah, their proprietary driver supports it on Linux too.
                Why wouldn't it when most of the code base is shared between them?
                "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

                • VikingGe
                  Senior Member
                  • Nov 2015
                  • 371

                  #9
                  Does anyone know if this will be wired up to the GL_ARB_get_program_binary extension so that we will (finally) get a supported binary format? Does any game actually use that extension?

                  I'm not a huge fan of fully driver-side implementations of the shader cache anyway, but since D3D11 AFAIK doesn't even allow a client-side implementation, things just turned out that way, and it's certainly better than nothing.
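
                  The client-side approach mentioned above is exactly what GL_ARB_get_program_binary enables: glGetProgramBinary hands back an opaque, driver-specific blob plus a format token, glProgramBinary reloads it later, and the driver is free to reject the blob (e.g. after an upgrade), so a recompile fallback is mandatory. A minimal Python sketch of that protocol, with stand-in callbacks in place of the real GL calls (the function names and cache layout here are illustrative, not any real API):

                  ```python
                  import hashlib
                  import os
                  import pickle

                  def cached_program(source, compile_fn, load_binary_fn, get_binary_fn,
                                     cache_dir="/tmp/glcache"):
                      """Client-side shader cache in the GL_ARB_get_program_binary style:
                      reuse a stored binary when the driver accepts it, otherwise
                      recompile from source and refresh the cache. The *_fn callbacks
                      stand in for compiling a program, glProgramBinary and
                      glGetProgramBinary respectively; load_binary_fn must return
                      None when the driver rejects the blob."""
                      os.makedirs(cache_dir, exist_ok=True)
                      path = os.path.join(cache_dir,
                                          hashlib.sha1(source.encode()).hexdigest())
                      if os.path.exists(path):
                          with open(path, "rb") as f:
                              fmt, blob = pickle.load(f)
                          prog = load_binary_fn(fmt, blob)  # blob is driver-specific
                          if prog is not None:
                              return prog
                      prog = compile_fn(source)  # cache miss or rejected blob
                      with open(path, "wb") as f:
                          pickle.dump(get_binary_fn(prog), f)  # (format, blob) pair
                      return prog
                  ```

                  The essential design point is that the binary format is opaque: the application must treat rejection as a normal event and keep the source around to recompile.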

                  • cj.wijtmans
                    Senior Member
                    • Mar 2016
                    • 1404

                    #10
                    Originally posted by VikingGe View Post
                    Does anyone know if this will be wired up to the GL_ARB_get_program_binary extension so that we will (finally) get a supported binary format? Does any game actually use that extension?

                    I'm not a huge fan of fully driver-side implementations of the shader cache anyway, but since D3D11 AFAIK doesn't even allow a client-side implementation, things just turned out that way, and it's certainly better than nothing.
                    With OpenGL you can do it yourself.
