
Thread: LLVMpipe With Intel's GLSL2 Compiler

  1. #1
    Join Date
    Jan 2007
    Posts
    15,636

    Default LLVMpipe With Intel's GLSL2 Compiler

    Phoronix: LLVMpipe With Intel's GLSL2 Compiler

    Last month we tested out Intel's new GLSL compiler for Mesa when running the classic Mesa and Gallium3D ATI Radeon drivers, to see how this GL Shading Language compiler, designed by Intel employees for their own hardware and open-source driver, works with the other open-source drivers, since all of the Mesa drivers will be affected once this "GLSL2" compiler is merged into the Mesa code-base by month's end. The experience using Intel's new shader compiler with the ATI Radeon graphics driver was fine except for Warsow, where serious regressions were visible; in the other games capable of running off Mesa, the experience was fine. What we have been curious to test since then with this new OpenGL shader compiler is the LLVMpipe driver -- a Gallium3D driver we have been very excited about, as it finally provides a better software rasterizer for Linux by leveraging Gallium3D and the Low-Level Virtual Machine (LLVM) compiler infrastructure to accelerate the Mesa state tracker atop a modern multi-core CPU supporting SSE4 instructions. We have now finished running tests of Intel's GLSL2 branch with the most recent LLVMpipe driver code.

    http://www.phoronix.com/vr.php?view=15197

  2. #2
    Join Date
    Oct 2009
    Posts
    2,145

    Default

    Actually, regarding the texture-from-pixmap GLX extension: it is NOT NECESSARILY REQUIRED for compiz to operate! The 0.9 branch of compiz uses the "copytex" plugin in its place.

    http://forum.compiz.org/viewtopic.ph...t=10402#p75616

    In other words, you can build the 0.9 branch and test again... you might get compiz-on-cpu.
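
    For what it's worth, here is a rough sketch of the copytex idea: instead of binding a window pixmap directly as a texture via GLX_EXT_texture_from_pixmap, you copy the window's pixels out of the GL read buffer into an ordinary texture using only core GL calls. The function below is my own illustration, not the actual plugin code; the real thing lives in the compiz 0.9 tree.

    /* Illustrative sketch only -- assumes a current GL context and a texture
       whose storage was already allocated with glTexImage2D. */
    #include <GL/gl.h>

    /* Copy a window's on-screen pixels at (x, y, w, h) into texture 'tex'.
       glCopyTexSubImage2D reads from the current GL read buffer, so no
       pixmap-to-texture GLX extension is required. */
    static void copy_window_to_texture(GLuint tex, int x, int y, int w, int h)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, x, y, w, h);
    }

    The trade-off is an extra copy per window per frame, which is exactly the kind of thing you'd notice on a software rasterizer, but it keeps the compositor off the missing extension.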

  3. #3

    Default

    Quote Originally Posted by droidhacker View Post
    Actually, regarding the texture-from-pixmap GLX extension: it is NOT NECESSARILY REQUIRED for compiz to operate! The 0.9 branch of compiz uses the "copytex" plugin in its place.

    http://forum.compiz.org/viewtopic.ph...t=10402#p75616

    In other words, you can build the 0.9 branch and test again... you might get compiz-on-cpu.
    Oh yeah, I hadn't thought about Compiz 0.9 yet. Will give that a shot.

  4. #4
    Join Date
    Jul 2009
    Posts
    261

    Default

    So it's basically up to a 30% performance loss for 1-5% less energy used? Eureka!
    Is it normalized to the same number of frames, i.e. effectively frames per joule (or joules per frame)?
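
    To make the units concrete, a toy calculation (the fps and wattage numbers below are made up, not taken from the article):

    /* Toy frames-per-joule calculation; the inputs are hypothetical. */
    #include <stdio.h>

    int main(void)
    {
        double fps   = 30.0;  /* average frames per second (hypothetical) */
        double watts = 65.0;  /* average power draw in W = J/s (hypothetical) */

        /* (J/s) / (frames/s) = J/frame; the reciprocal is frames per joule. */
        printf("%.3f J/frame, %.3f frames/J\n", watts / fps, fps / watts);
        return 0;
    }

    If the benchmark only reports average power over the run rather than over the same frame count, a slower run can look "greener" while actually costing more joules per frame.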

  5. #5
    Join Date
    Sep 2009
    Posts
    129

    Default

    You realize OpenArena doesn't even use GLSL, right? Or I assume it doesn't: it uses a fairly stock ioquake3 engine, and ioquake3 doesn't use GLSL without being patched, at least as of a few minutes ago when I last checked. It also doesn't *look* like it uses GLSL.

    In fact, do any of those games use GLSL? I'm not sure that any of them do. (Anyone should feel free to correct me on this.)

    Maybe this compiler is also involved in the other types of shaders, or it has some kind of "idle overhead"; that's the only way I can explain the performance differences measured in the other games.

  6. #6
    Join Date
    Oct 2008
    Location
    Sweden
    Posts
    983

    Default

    Quote Originally Posted by MaxToTheMax View Post
    You realize OpenArena doesn't even use GLSL, right? Or I assume it doesn't: it uses a fairly stock ioquake3 engine, and ioquake3 doesn't use GLSL without being patched, at least as of a few minutes ago when I last checked. It also doesn't *look* like it uses GLSL.
    I grabbed the timedemo and config PTS uses and ran it with MESA_GLSL=log, which dumps all shaders used to files, and ended up with nothing - so OpenArena doesn't seem to use shaders at all.
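
    If anyone wants to repeat that: MESA_GLSL is a real Mesa debug variable, and the "log" value makes the GLSL compiler write every shader it sees to shader_* files in the current directory. The little wrapper below is just my own illustration of one way to run a game under it; the program name is an example.

    /* Minimal wrapper sketch: set MESA_GLSL=log, then exec the given program
       so Mesa dumps each compiled GLSL shader to shader_* files.
       Example usage: ./glsl-log openarena */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <program> [args...]\n", argv[0]);
            return 1;
        }
        setenv("MESA_GLSL", "log", 1);  /* read by Mesa at shader-compile time */
        execvp(argv[1], &argv[1]);
        perror("execvp");               /* reached only if exec fails */
        return 1;
    }

    No shader_* files after a run means the app never handed Mesa any GLSL.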

    Regarding the breakage in Warsow, I still can't find a bug filed about it, which is kind of a disappointment.

  7. #7
    Join Date
    Oct 2008
    Location
    Sweden
    Posts
    983

    Default

    I can't seem to find any GLSL usage in Tremulous either?

  8. #8
    Join Date
    Oct 2008
    Posts
    3,244

    Default

    This bug will break shaders in Gallium3D - probably a lot of them.

    https://bugs.freedesktop.org/show_bug.cgi?id=29490

    While the article is interesting, the code is from 10 days ago and might already be out of date. Intel has been doing tons of work fixing bugs and optimizing GLSL2 to get it ready for merging this Friday.

  9. #9
    Join Date
    Jan 2009
    Posts
    88

    Default low cpu usage

    Why wouldn't you want the CPU pegged at 100%? Something's wrong if it isn't, especially since rendering is supposed to be easily parallelizable. Every game I've ever played uses as much CPU as it can. In the case of software rendering it seems like something is going wrong; maybe data isn't moving around as fast as it needs to and is clogging the pipes.
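
    To illustrate what "easily parallelizable" means here, a toy sketch (this is not LLVMpipe's actual binning scheme, just the general idea): split the framebuffer into horizontal bands and shade each band on its own thread, which should keep every core busy.

    /* Toy parallel-shading sketch, NOT LLVMpipe's real architecture:
       each thread fills one horizontal band of the framebuffer.
       Build with: cc -pthread bands.c */
    #include <pthread.h>
    #include <stdint.h>
    #include <stdio.h>

    #define WIDTH   1024
    #define HEIGHT  768
    #define THREADS 4

    static uint32_t framebuffer[WIDTH * HEIGHT];

    struct band { int y0, y1; };

    static void *shade_band(void *arg)
    {
        struct band *b = arg;
        for (int y = b->y0; y < b->y1; y++)
            for (int x = 0; x < WIDTH; x++)
                framebuffer[y * WIDTH + x] = (uint32_t)(x ^ y); /* stand-in "shader" */
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[THREADS];
        struct band bands[THREADS];
        int rows = HEIGHT / THREADS;

        for (int i = 0; i < THREADS; i++) {
            bands[i].y0 = i * rows;
            bands[i].y1 = (i == THREADS - 1) ? HEIGHT : (i + 1) * rows;
            pthread_create(&tid[i], NULL, shade_band, &bands[i]);
        }
        for (int i = 0; i < THREADS; i++)
            pthread_join(tid[i], NULL);

        printf("shaded %d pixels on %d threads\n", WIDTH * HEIGHT, THREADS);
        return 0;
    }

    If a real renderer can't keep its threads fed like this, something upstream of the rasterizer (vertex work, synchronization, data movement) is the bottleneck, which would match the low CPU usage seen in the tests.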

  10. #10
    Join Date
    Oct 2009
    Posts
    353

    Default

    I'm pessimistic about a GLSL compiler from Intel, because it's from Intel, who for years couldn't even make their Linux drivers perform decently compared to their window$ counterparts. I wish the Nvidia devs were doing this work instead. No flames intended; I really think so.
