Intel's Mesa Vec4 Code Now Unconditionally Uses NIR

  • Intel's Mesa Vec4 Code Now Unconditionally Uses NIR

    Phoronix: Intel's Mesa Vec4 Code Now Unconditionally Uses NIR

    As of a change yesterday to Intel's i965 Mesa driver, the Vec4 back-end is unconditionally using NIR rather than GLSL IR with the option being removed...

    http://www.phoronix.com/scan.php?pag...-NIR-Permanent

  • #2
    What is vec4?



    • #3
      Originally posted by orzel View Post
      What is vec4?
      4D floating-point vectors, I think.
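A hedged illustration of that answer (plain Python, not Mesa or GLSL code): a `vec4` is just four floats that arithmetic operates on componentwise, with operations like the dot product collapsing them back to a scalar.

```python
# Illustrative sketch of vec4 semantics: four packed floats with
# componentwise arithmetic. Plain Python, not actual GLSL/Mesa code.

def vec4(x, y, z, w):
    return (float(x), float(y), float(z), float(w))

def add(a, b):
    # componentwise addition, like `a + b` on two vec4s in a shader
    return tuple(ai + bi for ai, bi in zip(a, b))

def dot(a, b):
    # dot product: multiply componentwise, then sum to one scalar
    return sum(ai * bi for ai, bi in zip(a, b))

a = vec4(1, 2, 3, 4)
b = vec4(5, 6, 7, 8)
print(add(a, b))  # (6.0, 8.0, 10.0, 12.0)
print(dot(a, b))  # 70.0
```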



      • #4
        Originally posted by orzel View Post
        What is vec4?
        Intel GPU hardware has two modes it can operate in, "scalar" and "vec4". Depending on the generation, different shader stages have to run in different modes, and each mode has its own instruction set I think, so you need separate code-generation paths.
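        A very rough sketch of the distinction (plain Python; real Gen register layouts and instruction sets are far more involved): in vec4 mode one instruction operates across the four components of a single invocation's vector, while in scalar mode one instruction operates on the same single component across many invocations at once.

```python
# Simplified sketch of the two execution styles. Illustrative only --
# not actual Intel Gen ISA behavior, just the data-layout idea.

# vec4 mode: one "instruction" touches all 4 components of ONE
# shader invocation's vector (array-of-structures layout).
def vec4_mode_add(a, b):
    return [x + y for x, y in zip(a, b)]        # 4 lanes, 1 invocation

# scalar mode (e.g. SIMD8): one "instruction" touches the same
# component of 8 DIFFERENT invocations (structure-of-arrays layout).
def scalar_mode_add(xs_a, xs_b):
    return [x + y for x, y in zip(xs_a, xs_b)]  # 8 invocations, 1 component

# vec4: one invocation's position plus an offset, in one go
print(vec4_mode_add([1, 2, 3, 4], [10, 10, 10, 10]))  # [11, 12, 13, 14]

# scalar: the x components of 8 invocations, in one go
print(scalar_mode_add(list(range(8)), [10] * 8))      # [10, 11, 12, 13, 14, 15, 16, 17]
```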



        • #5
          Now the interesting question is: how much of a performance improvement does this change bring?



          • #6
            Originally posted by jf33 View Post
            Now the interesting question is: how much of a performance improvement does this change bring?
            See http://cgit.freedesktop.org/mesa/mes...4c1c7840c97339 for basic info.



            • #7
              Originally posted by Serafean View Post
              That's awesome, thanks for the link.



              • #8
                Originally posted by jf33 View Post
                Now the interesting question is: how much of a performance improvement does this change bring?
                It only affects vertex shaders on Haswell and prior hardware. More recent hardware is scalar only.

                The improvements for vertex shaders are pretty nice; however, it's important to note that vertex shaders generally don't matter very much. It's the fragment shaders that usually limit game performance. So I wouldn't expect to actually notice much of a difference in real-world use.

                Still, every little bit helps, and it's nice to just have everything running through the same optimizing system rather than maintaining a separate codebase that gets tested much less often and is thus more likely to contain hard-to-track-down bugs.
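                A back-of-envelope illustration of why fragment shaders dominate (my numbers, not from the thread): drawing a single full-screen quad at 1080p runs the vertex shader only a handful of times, but the fragment shader roughly once per covered pixel.

```python
# Back-of-envelope invocation counts for one full-screen quad at
# 1080p (two triangles sharing four vertices). Illustrative only.
vertices = 4                 # vertex shader: once per vertex
fragments = 1920 * 1080      # fragment shader: roughly once per covered pixel
print(fragments)             # 2073600
print(fragments // vertices) # 518400 fragment invocations per vertex invocation
```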

