LLVMpipe With Intel's GLSL2 Compiler


  • LLVMpipe With Intel's GLSL2 Compiler

    Phoronix: LLVMpipe With Intel's GLSL2 Compiler

    Last month we tested Intel's new GLSL compiler for Mesa with the ATI Radeon classic Mesa and Gallium3D drivers to see how this GL Shading Language compiler, designed by Intel employees for their own hardware and open-source driver, works with the other open-source drivers, since all of the Mesa drivers will be affected once this "GLSL2" compiler is merged into the Mesa code-base by month's end. Using Intel's new shader compiler with the ATI Radeon graphics driver worked fine except in Warsow, where serious regressions were visible; in the other games capable of running off Mesa, the experience was fine. What we have been curious to test since then with this new OpenGL shader compiler is the LLVMpipe driver -- a Gallium3D driver we have been very excited about, as it finally provides a better software rasterizer for Linux by leveraging Gallium3D and the Low-Level Virtual Machine (LLVM) compiler infrastructure to accelerate the Mesa state tracker atop a modern multi-core CPU supporting SSE4 instructions. We have now finished running tests of Intel's GLSL2 branch with the most recent LLVMpipe driver code.

    http://www.phoronix.com/vr.php?view=15197
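For anyone wanting to reproduce this kind of run, a minimal sketch of routing an OpenGL application through LLVMpipe via Mesa's environment variables. `LIBGL_ALWAYS_SOFTWARE` and `GALLIUM_DRIVER` are real Mesa debug variables; `glxinfo` here is only a stand-in for whatever GL program you want to test.

```python
import os
import shutil
import subprocess

# Build an environment that forces Mesa's software path and selects
# LLVMpipe among the Gallium software drivers.
env = dict(os.environ)
env["LIBGL_ALWAYS_SOFTWARE"] = "1"   # skip the hardware DRI driver
env["GALLIUM_DRIVER"] = "llvmpipe"   # pick the LLVMpipe rasterizer

# Launch a GL program with that environment, if one is installed;
# "glxinfo" is an illustrative stand-in for a game or benchmark.
if shutil.which("glxinfo"):
    subprocess.run(["glxinfo"], env=env, capture_output=True, text=True)
```

The same two variables can simply be exported in a shell before launching the Phoronix Test Suite.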

  • #2
    Actually, regarding the texture-from-pixmap glx extension, that is NOT NECESSARILY REQUIRED for compiz to operate! The 0.9 branch of compiz uses the "copytex" plugin in its place.

    http://forum.compiz.org/viewtopic.ph...t=10402#p75616

    In other words, you can build the 0.9 branch and test again... you might get compiz-on-cpu.

    Comment


    • #3
      Originally posted by droidhacker View Post
      Actually, regarding the texture-from-pixmap glx extension, that is NOT NECESSARILY REQUIRED for compiz to operate! The 0.9 branch of compiz uses the "copytex" plugin in its place.

      http://forum.compiz.org/viewtopic.ph...t=10402#p75616

      In other words, you can build the 0.9 branch and test again... you might get compiz-on-cpu.
      Oh yeah, didn't think about Compiz 0.9 yet. Will give that a shot.
      Michael Larabel
      http://www.michaellarabel.com/

      Comment


      • #4
        So it's basically up to a 30% performance loss for only 1-5% less energy used? Eureka!
        Is it normalized for the same number of frames, i.e. effectively frames/joule or joules/frame?
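The normalization this post asks about is simple arithmetic; a small sketch with hypothetical numbers (the frame count, wattage, and run length below are made up for illustration):

```python
# Normalizing performance against energy: frames per joule.
# All numbers here are hypothetical, for illustration only.
frames = 5000          # frames rendered over the run
avg_watts = 41.0       # average power draw during the run, in watts
seconds = 120.0        # run length

joules = avg_watts * seconds            # total energy consumed
frames_per_joule = frames / joules      # higher is better
joules_per_frame = joules / frames      # lower is better
```

A run that is 30% slower but only draws 5% less power will therefore score clearly worse on frames per joule.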

        Comment


        • #5
          You realize Open Arena doesn't even use GLSL, right? Or I assume it doesn't, as it uses a fairly stock ioQuake3 engine, and ioQuake3 doesn't use GLSL without being patched, as of last I checked. Which was a few minutes ago. Also, it doesn't *look* like it uses GLSL.

          In fact, do any of those games use GLSL? I'm not sure that any of them do. (Anyone should feel free to correct me on this.)

          Maybe this compiler is also involved in the other types of shaders, or it has some kind of "idle overhead"; that's the only way I can explain the performance differences measured in the other games.

          Comment


          • #6
            Originally posted by MaxToTheMax View Post
            You realize Open Arena doesn't even use GLSL, right? Or I assume it doesn't, as it uses a fairly stock ioQuake3 engine, and ioQuake3 doesn't use GLSL without being patched, as of last I checked. Which was a few minutes ago. Also, it doesn't *look* like it uses GLSL.
            I grabbed the timedemo and config PTS uses and ran it with MESA_GLSL=log, which dumps all shaders used to files, and ended up with nothing - so OpenArena doesn't seem to use shaders...
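The check described here can be sketched as follows. MESA_GLSL=log is a real Mesa debug variable; the game binary name, demo arguments, and dump-file pattern below are illustrative assumptions, not the exact PTS invocation.

```python
import glob
import os
import shutil
import subprocess

# Run the timedemo with MESA_GLSL=log so Mesa dumps each compiled GLSL
# shader to a file, then see whether any dumps appeared.
env = dict(os.environ, MESA_GLSL="log")

# Hypothetical invocation: binary name and demo args are illustrative.
if shutil.which("openarena"):
    subprocess.run(["openarena", "+timedemo", "1"], env=env, timeout=300)

# An empty list means the game compiled no GLSL shaders at all.
dumped = glob.glob("shader*")
```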

            Regarding the breakage in Warsow, I still can't find a bug filed about it, which is kind of a disappointment.

            Comment


            • #7
              I can't seem to find any GLSL usage in Tremulous either?

              Comment


              • #8
                This bug will break shaders in gallium - probably a lot of them.

                https://bugs.freedesktop.org/show_bug.cgi?id=29490

                While the article is interesting, the code tested is from 10 days ago and might already be out of date. Intel has been doing tons of work fixing bugs and optimizing GLSL2 to get it ready for merging this Friday.

                Comment


                • #9
                  low cpu usage

                  Why wouldn't you want the CPU to be pegged at 100%? Something's wrong if it isn't, especially since rendering is supposed to be easily parallelizable. Every game I've ever played uses as much CPU as it can. In this case of software rendering, it seems like something's going wrong; maybe data isn't moving around as fast as it needs to be and is clogging the pipes.
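The "easily parallelizable" point can be illustrated with a toy sketch: the framebuffer splits into independent tiles with no shared state, so a software rasterizer can hand one tile to each core. (Real LLVMpipe uses native threads and JIT-compiled shaders; the trivial per-pixel function here is purely illustrative.)

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT, TILE = 64, 64, 16   # a tiny 64x64 framebuffer, 16x16 tiles

def shade_tile(origin):
    """Shade one tile; touches no state outside its own pixels."""
    ox, oy = origin
    # Trivial per-pixel "shader": a checkerboard pattern.
    return [((ox + x) ^ (oy + y)) & 1
            for y in range(TILE) for x in range(TILE)]

# Tile origins covering the framebuffer: a 4x4 grid of tiles.
tiles = [(x, y) for y in range(0, HEIGHT, TILE)
                for x in range(0, WIDTH, TILE)]

# Each tile is independent work, so they can all be shaded concurrently.
with ThreadPoolExecutor() as pool:
    shaded = list(pool.map(shade_tile, tiles))
```

With work structured like this, an idle core during rendering really does suggest a stall somewhere else in the pipeline.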

                  Comment


                  • #10
                    I'm pessimistic about GLSL from Intel, because it's from Intel, who for years couldn't even make their Linux drivers perform decently compared to their Windows counterparts. I wish the Nvidia devs were doing this work instead. No flames intended; I really think so.

                    Comment
