How Valve Made L4D2 Faster On Linux Than Windows

  • How Valve Made L4D2 Faster On Linux Than Windows

    Phoronix: How Valve Made L4D2 Faster On Linux Than Windows

    Following this morning's Here Is Valve's Source Engine Left 4 Dead 2 On Linux article, here are most of the details that were shared during yesterday's SIGGRAPH presentation about Left 4 Dead 2 running natively on Linux with OpenGL and outperforming the Windows version...

    http://www.phoronix.com/vr.php?view=MTE1NzE

  • #2
    Here's my report on the talk:

    http://www.forceflow.be/2012/08/09/v...l-anniversary/

    • #3
      What's stopping Valve now from creating a gaming console based on Linux? It would beat the pants off the Xbox and PS3 even with a slightly older graphics card. Ouya needs competition!

      • #4
        Dynamic translator vs Preprocessor

        I'm wondering whether this work on the dynamic API translator between D3D and OpenGL is worth the effort.
        Maybe they could use it as a preprocessor before compiling their game engine, so they would avoid the overhead involved (a rough sketch of that idea is below).
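
        A minimal sketch of what that build-time approach could look like, assuming a thin inline wrapper header (the names here are hypothetical, not Valve's actual code):

            // Hypothetical build-time shim: a D3D9-style call becomes an inline
            // GL call, so no dynamic translation layer runs per frame.
            #include <GL/gl.h>

            struct D3DViewport { int x, y, w, h; };  // stand-in for a D3D9 viewport struct

            // The compiler folds this straight into glViewport, so the only
            // runtime cost left is the GL call itself.
            inline void SetViewportGL(const D3DViewport& vp)
            {
                glViewport(vp.x, vp.y, vp.w, vp.h);
            }

        The trade-off is that a build-time shim only helps calls that map one-to-one; anything that needs runtime state tracking would still want the dynamic layer.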

        • #5
          So the game isn't truly OpenGL? It just dynamically translates Direct 3D to OpenGL? That's still impressive, but couldn't they get a bigger performance boost by replacing Direct 3D with OpenGL?

          Also, couldn't WINE do this for a performance boost?

          • #6
            I would imagine they did this to port their existing games and will do it properly in Source 2. Maybe that's even one of the reasons for jumping to a new major version.

            • #7
              Originally posted by Dukenukemx View Post
              So the game isn't truly OpenGL? It just dynamically translates Direct 3D to OpenGL? That's still impressive, but couldn't they get a bigger performance boost by replacing Direct 3D with OpenGL?

              Also, couldn't WINE do this for a performance boost?
              No, because an arbitrary Direct3D application isn't a fixed target the way the Source engine is internally.

              • #8
                Originally posted by Dukenukemx View Post
                So the game isn't truly OpenGL? It just dynamically translates Direct 3D to OpenGL? That's still impressive, but couldn't they get a bigger performance boost by replacing Direct 3D with OpenGL?

                Also, couldn't WINE do this for a performance boost?
                From what I understand, the bottleneck would be CPU-bound instead of GPU-bound.
                Multiple threads have to be used to translate D3D to OpenGL so that there isn't a huge bottleneck (see the sketch below).
                Last edited by SolidSteel144; 08-09-2012, 05:42 PM.
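
                A rough sketch of that kind of split, assuming one game thread producing already-translated GL work and one render thread that owns the GL context (hypothetical code, not Valve's implementation):

                    // Game thread: records translated GL work.  Render thread: drains
                    // the queue and issues the actual GL calls on its own context.
                    #include <condition_variable>
                    #include <functional>
                    #include <mutex>
                    #include <queue>

                    class GLCommandQueue {
                    public:
                        // Called from the game thread with already-translated GL work.
                        void Submit(std::function<void()> cmd) {
                            {
                                std::lock_guard<std::mutex> lock(mutex_);
                                commands_.push(std::move(cmd));
                            }
                            cv_.notify_one();
                        }

                        // Called on the render thread, which holds the GL context current.
                        void Drain() {
                            std::unique_lock<std::mutex> lock(mutex_);
                            cv_.wait(lock, [this] { return !commands_.empty(); });
                            while (!commands_.empty()) {
                                auto cmd = std::move(commands_.front());
                                commands_.pop();
                                lock.unlock();
                                cmd();  // issues the real glBind*/glDraw* calls
                                lock.lock();
                            }
                        }

                    private:
                        std::mutex mutex_;
                        std::condition_variable cv_;
                        std::queue<std::function<void()>> commands_;
                    };

                The point is just that the translation and submission cost can overlap with game logic instead of all landing on one core.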

                • #9
                  Maybe with the new Source engine they could go the other way around: translate from OpenGL to Direct3D dynamically. There are only two systems using Direct3D: Xbox and Windows. Everyone else is using OpenGL or an equivalent (Sony, Nintendo, AMD, NVIDIA, Intel, Apple, Google ...)

                  • #10
                    Originally posted by Filiprino View Post
                    Everyone else is using OpenGL or an equivalent (Sony, Nintendo, AMD, NVIDIA, Intel, Apple, Google ...)
                    Sony PS3 uses low-level libGCM (no one is using PSGL derived from GL ES 1.1+Cg), Nintendo Wii uses proprietary GX API (GX2 in Wii U).

                    • #11
                      Originally posted by kwahoo View Post
                      Sony PS3 uses low-level libGCM (no one is using PSGL derived from GL ES 1.1+Cg), Nintendo Wii uses proprietary GX API (GX2 in Wii U).
                      And what do you think those libraries are based on? PS3 uses a FreeBSD OS derivative too.

                      • #12
                        - Yes, the way the Source Engine is hitting OpenGL right now is through a non-deferring, locally-optimizing abstraction layer that basically converts their longstanding Direct3D calls into OpenGL. However, it's not the same way that Wine does Direct3D-to-OpenGL conversion. The Source Engine targets a D3D9-like API with extensions, which is translated into GL calls dynamically. This also works for Shader Model 2.0b, with Shader Model 3.0 support coming soon. Valve's implementation is nearly a 1:1 mapping between D3D and GL concepts.
                        When they say it also works for SM2.0b and soon SM3.0, does that mean it's currently translating the vanilla SM2.0 codepath? I wonder if the Linux/OpenGL performance advantage will fall off as they move on to the more complex SM2.0b and SM3.0 paths?
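
                        To illustrate the "nearly 1:1 mapping" claim, here is a toy sketch of the kind of D3D9-style entry point that can be backed directly by GL state calls (hypothetical names, not Valve's actual abstraction layer):

                            // The engine keeps calling a D3D9-style device; a thin layer maps
                            // each render state onto the matching GL state change.
                            #include <GL/gl.h>

                            enum RenderState { RS_ZENABLE, RS_ALPHABLENDENABLE };

                            class GLDevice9 {  // looks like IDirect3DDevice9 to the engine code
                            public:
                                void SetRenderState(RenderState state, bool enable) {
                                    switch (state) {
                                    case RS_ZENABLE:
                                        enable ? glEnable(GL_DEPTH_TEST) : glDisable(GL_DEPTH_TEST);
                                        break;
                                    case RS_ALPHABLENDENABLE:
                                        enable ? glEnable(GL_BLEND) : glDisable(GL_BLEND);
                                        break;
                                    }
                                }
                            };

                        States like these translate almost directly; it is the less symmetric parts of the two APIs where a non-deferring layer has to do real work.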

                        • #13
                          Originally posted by Dukenukemx View Post
                          So the game isn't truly OpenGL? It just dynamically translates Direct 3D to OpenGL? That's still impressive, but couldn't they get a bigger performance boost by replacing Direct 3D with OpenGL?

                          Also, couldn't WINE do this for a performance boost?
                          No, they're using OpenGL directly. What they mean is that Valve's Source Engine API calls just "look" like the DirectX calls to the game/renderer code, so none of that code has to be changed for them to port to OpenGL platforms. There are some areas where DirectX and OpenGL don't match up perfectly, so, for cross-platform code, it's usually best to write API-agnostic "higher level" render methods which can fine-tune the direct DX/GL calls to each API's "best practice". This is most likely what they meant by saying that certain hot loops and interfaces were rewritten to accommodate the differences in graphics APIs. A sketch of that layering is below.
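
                          A small sketch of that layering, assuming a hypothetical engine-level interface (not actual Source engine code):

                              // Game/renderer code only ever sees the engine-level call; each
                              // backend implements it with that API's preferred sequence.
                              #include <GL/gl.h>

                              struct Color { float r, g, b, a; };

                              class IRenderDevice {
                              public:
                                  virtual ~IRenderDevice() = default;
                                  virtual void ClearRenderTarget(const Color& c, float depth) = 0;
                              };

                              class GLRenderDevice : public IRenderDevice {
                              public:
                                  void ClearRenderTarget(const Color& c, float depth) override {
                                      glClearColor(c.r, c.g, c.b, c.a);
                                      glClearDepth(depth);  // GL takes a double here
                                      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                                  }
                              };

                              // A D3D9 backend would implement the same method with
                              // IDirect3DDevice9::Clear(), so the calling code never changes.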

                          • #14
                            Gamepad?

                            What about gamepad support?

                            • #15
                              Turn off -fPIC?

                              Does this mean they use one huge static binary with no shared libraries?

                              Or do they resolve dynamic links at runtime, resulting in SELinux violations from writing relocation fixups?

                              Will we have to disable SELinux to run their apps?
