
How Valve Made L4D2 Faster On Linux Than Windows


  • #31
    Originally posted by BO$$ View Post
    Seriously, I was expecting a better design from Valve. If you have seen the irrlicht or ogre renderers, for example, you would know that a renderer should be independent of OpenGL or DX. You write abstract classes that you then implement for OpenGL 1.5 or 2.0, or DX 9 or DX 11, and so on. If Valve had done so, there would be no need to translate DX calls to OpenGL. I mean, WTF kind of design is that? Are they using any kind of OOP design at all? It just shows that sometimes even the "best" programmers fuck it up. Valve, you disappoint me...
    Using an even higher level API incurs a performance penalty when translating to OGL/DX/WhateverGraphicalAPI. For games, where performance matters, this is unacceptable.



    • #32
      Originally posted by BO$$ View Post
      No, not really. Implementing some abstract classes won't kill performance. Do you actually think irrlicht and ogre are so inefficient? Any reasonable developer wraps the API in something else. In their case they didn't wrap it well enough to be able to rewrite the implementation without rewriting the higher-level API. That means bad design. And implementing a translator from D3D to OpenGL really does incur a performance penalty. More than doing it right the first time with abstract classes...
      My guess is that the Source engine wasn't meant to be cross platform. Thus there was no point in wrapping Win API calls.



      • #33
        Originally posted by log0 View Post
        My guess is that the Source engine wasn't meant to be cross platform. Thus there was no point in wrapping Win API calls.

        Exactly... so their options were to do something like they did, or rewrite everything from the ground up. That last option was not economically sensible.

        And if they achieved the kind of performance they seem to have gotten, it really doesn't matter.



        • #34
          Originally posted by BO$$ View Post
          No, not really. Implementing some abstract classes won't kill performance. Do you actually think irrlicht and ogre are so inefficient?
          Well, I have to wonder how on earth an XBOX 360, which comes with an ancient GPU of power roughly equivalent to a Radeon X1950 or Intel HD 3000, can pull off those "OMG WOW!" graphics it does, while there's no fscking way to do those graphics on an actual X1950 or a Sandy Bridge with HD 3000.

          It's a natural response to blame the PC APIs for this.

          Edit:
          In other words, can you use Ogre/Irrlicht to implement Skyrim, or the latest Assassin's Creed, or Mass Effect 3, or even Race Driver: Grid (which runs at a constant 60 FPS on the XBOX 360 even with 20 cars on screen) on an Intel HD 3000 GPU? The XBOX can do those on its old hardware.
          Last edited by RealNC; 08-10-2012, 12:07 PM.



          • #35
            Originally posted by gamerk2 View Post
            Using an even higher level API incurs a performance penalty when translating to OGL/DX/WhateverGraphicalAPI. For games, where performance matters, this is unacceptable.
            You clearly have no clue what you're talking about. Please don't go around telling people "facts" if you aren't even remotely an expert in the field. Then other people just have to go and clean up the misinformation you spread.

            Every modern high-end engine has an internal graphics interface. Neither D3D nor OpenGL is suitable as the direct interface the graphics programmer uses for rendering; they are both just low-level hardware abstraction layers. The high-end engines abstract away shader and effects systems, render queues, etc.

            If you think that an extra layer on vertex buffer creation and the like is a bottleneck, you are wrong. The bottlenecks are in places that are called tens to hundreds of thousands of times per frame. The graphics API is called only a few hundred times, maybe a couple thousand times, if you're doing your job right as a graphics programmer.



            • #36
              Originally posted by BO$$ View Post
              I don't really see what you're trying to prove. You can play those on the Xbox because you program closer to the metal there and don't have to worry about the many layers that exist on PC. On PC you don't target a specific hardware configuration, which is why they can optimize better for consoles. But don't worry, they still abstract the graphics API, especially if they write the engine for both the Xbox 360 and the PS3.

              So this is why I'm saying Valve did a poor job abstracting the API. They shouldn't have gotten to the point where they translate D3D calls to OpenGL.
              My point was a reply to whoever claimed that abstractions upon abstractions don't hurt performance. It seems they do. OpenGL and D3D seem to be too high-level, and performance seems to suffer greatly as a result.



              • #37
                Originally posted by elanthis View Post
                You clearly have no clue what you're talking about. Please don't go around telling people "facts" if you aren't even remotely an expert in the field. Then other people just have to go and clean up the misinformation you spread.

                Every modern high-end engine has an internal graphics interface. Neither D3D nor OpenGL is suitable as the direct interface the graphics programmer uses for rendering; they are both just low-level hardware abstraction layers. The high-end engines abstract away shader and effects systems, render queues, etc.

                If you think that an extra layer on vertex buffer creation and the like is a bottleneck, you are wrong. The bottlenecks are in places that are called tens to hundreds of thousands of times per frame. The graphics API is called only a few hundred times, maybe a couple thousand times, if you're doing your job right as a graphics programmer.
                Maybe you can help me out.

                I am getting confused.

                Valve has claimed for years that their game engine supports OpenGL and DirectX.

                Why do they need to translate DirectX calls, then?
                Or is this just simpler than adapting the game to OpenGL?

                I have built some maps for a game; I can't imagine it being that hard.
                You wouldn't have to rebuild the levels, would you?



                • #38
                  Why has no one suggested smart preprocessor usage to eliminate the abstraction layer completely without harming portability? Why make an abstract class that has to decide, every time it's called, "what platform am I running on?" That's what a preprocessor is for.

                  If you want to compile once for all platforms, you're doing it wrong.



                  • #39
                    Originally posted by elg2001 View Post
                    Why has no one suggested smart preprocessor usage to eliminate the abstraction layer completely without harming portability? Why make an abstract class that has to decide, every time it's called, "what platform am I running on?" That's what a preprocessor is for.

                    If you want to compile once for all platforms, you're doing it wrong.
                    This. I'm confused by everything that's being said here. Direct3D translators? What is all this crazy talk?

                    Code:
                    void DrawSomething()
                    {
                        #if defined(_WIN32)
                            D3D9Blahblah();
                        #elif defined(__linux__)
                            glBlahblah();
                        #elif defined(__APPLE__)
                            glBlahblahAPPLE();
                        #endif
                    }
                    Certainly this is what most devs do, right?



                    • #40
                      Originally posted by log0 View Post
                      My guess is that the Source engine wasn't meant to be cross platform. Thus there was no point in wrapping Win API calls.

                      AFAIK the leaked 2003 Source build (the Half-Life 2 beta) had a D3D/OpenGL/software renderer switch.



                      • #41
                        Originally posted by elanthis View Post
                        For the love of... SONY AND NINTENDO DO NOT USE OPENGL!! At all, period. There is a _proprietary_ heavily-modified "kinda sorta" GL|ES wrapper library for PS3 which essentially nobody uses because making proper use of the PS3 does not fit into the D3D/GL API models. Nintendo has never released any hardware that supports shaders (the Wii U will be their first when it comes out later this year) and their past hardware instead relies on a TEV unit, and also has other various very weird and different hardware behavior that does not at all fit into the D3D/GL model, and they also use a proprietary custom API.

                        NVIDIA, AMD, and Intel all produce Direct3D drivers last I checked. They don't "use OpenGL" any more than they use Direct3D.

                        Google also does not solely use OpenGL as you think. Chrome on Windows exclusively uses Direct3D, despite having an entire OpenGL layer. They went out of their way to engineer ANGLE specifically to solve the massive real-world problems with OpenGL driver quality from the fine, fine folks at NVIDIA, AMD, and Intel.

                        That leaves Apple as the only company in your list that is solely consuming OpenGL and not Direct3D. That alone does not mean anything, of course, but don't go around claiming that somehow it's only Microsoft -- and not practically the entire industry -- that uses Direct3D.

                        Especially when it comes to things like gaming and consumer devices, where Microsoft's platforms still make up a majority, with Apple and Google picking up the brunt of the weight in the low-end mobile space.
                        There's a lot of poop in that post.

                        NVIDIA, AMD and Intel produce Direct3D drivers ONLY for Microsoft OSes. They produce OpenGL drivers for Windows and THE REST. But I don't think you meant to say that.

                        Yes, Google uses DirectX for Chrome on Windows. But why was that again? Due to performance and future compatibility (MS was claiming to remove OpenGL, etc.). So far it still holds that D3D only works on, and is used on, Windows. And to say it again, OpenGL works on pretty much every platform. Even if the PS3 ES version is slow, it still exists. D3D only exists on Windows and Xbox (and maybe on some Windows Phone thing at some point). So most if not all of the original statements hold, and you, sir, talk poop.



                        • #42
                          Originally posted by Chaosenemy View Post
                          This. I'm confused by everything that's being said here. Direct3D translators? What is all this crazy talk?

                          Code:
                          void DrawSomething()
                          {
                              #if defined(_WIN32)
                                  D3D9Blahblah();
                              #elif defined(__linux__)
                                  glBlahblah();
                              #elif defined(__APPLE__)
                                  glBlahblahAPPLE();
                              #endif
                          }
                          Certainly this is what most devs do, right?
                          Assuming those are inlineable (word?) functions during compilation, that's exactly what I was talking about.



                          • #43
                            Originally posted by oliver View Post
                            They produce OpenGL drivers for Windows and THE REST.
                            No, they don't. They produce OpenGL drivers for Windows and Linux (and a few other Linux-like OSes). Nothing else. OS X's drivers are written by Apple. The other platforms, again, do not use OpenGL (or D3D).

                            Windows, incidentally, is one of the few platforms that actually gives its developers a choice of which API to use. For a long time, they made the choice of OpenGL, because it was much better than the early D3D crap that Microsoft was peddling. Microsoft -- unlike Khronos -- actually iterated on and improved their API. Microsoft -- unlike NVIDIA, AMD, or Intel -- put a huge amount of time and effort into engineering fixes for the stability and security problems that NVIDIA's, AMD's, and Intel's (and the other players in the video card market that have mostly disappeared) drivers all had and still have today.

                            That's why Linux users get to go on and on about how Linux is a more secure OS while their fucking video drivers insert local root exploits directly into ring-0, and Windows users have an OS that recognizes that video drivers are scary, huge, complex beasts and completely isolates the driver from the app. On Linux, frequent video driver problems cause the entire desktop to boot out, the machine to seemingly lock up with no way to restore graphics, or just cause a kernel oops or a reboot. On Windows, your game gets a signal that it lost connection to the device, your screen goes black for a few seconds while the driver is restarted, and then you're either back in the game or at an otherwise unmolested desktop (with all your other apps still open, no data lost, etc.).

                            Think about it. Linux's DRI is all about letting userspace code directly submit command streams to the kernel. It essentially has to implement a helluva-complex security whitelisting/blacklisting model in order to ensure that apps can't do anything not-nice. A saner approach forces the API layer (OpenGL, Direct3D, whichever; doesn't matter) over IPC to another, secured process that is the only one allowed to talk to the hardware/kernel. But hey, Linux OpenGL gets higher FPS in L4D2 than Direct3D because its draw call overhead is lower due to a lack of extra context switches! Let's talk about that instead, because it makes Linux look better.

                            I'm not generally on the micro-kernel bandwagon, but certainly some things deserve that model more than others do.

                            And this is the reason why I keep bitching about these things. There are no magical unicorn-fairy properties to D3D vs OpenGL. It's just engineering and tech. These are 100% solvable problems. But people keep making decisions based on politics, emotions, and cluelessness rather than technical merit. And so Linux continues to be a nerd-hobbyist OS on the desktop, a Web server in the enterprise, and a barely-recognizable mutated Android on mobile devices, despite being so awesome in so many other ways.

                            Instead of being a defensive jerk, you could actually _do_ something to close the gap between Windows and Linux in this space. God knows I'd be thrilled if there were an effort with enough mass behind it to actually succeed. Instead, you want to reflexively rage against anyone daring to imply that Microsoft managed to do something right that Linux and OpenWhatever do wrong. The community could quite probably put together an API better than D3D11 by a long shot. It's not like D3D11 is perfect; it's just that it's far, far less shitty than OpenGL. But nope; most of you have no freaking clue what OpenGL even does, how it works, or how to use it, but you'll be damned if you're gonna let some uppity dude tell you that an API isn't well designed or implemented.

                            Yes, Google uses DirectX for Chrome on Windows. But why was that again? Due to performance and future compatibility (MS was claiming to remove OpenGL, etc.).
                            WebGL and ANGLE were not written until years after the Vista OpenGL scare was long past. Performance was also not a major concern for this use case. You're making wild assumptions rather than stating facts. I, on the other hand, have very direct knowledge of what went into the decision to use D3D/ANGLE for Chrome and Mozilla on Windows.

                            Talking poop indeed, sir.

                            So far it still holds that D3D only works on, and is used on, Windows.
                            D3D is actually used on more platforms if you want to talk about translation layers that let developers code to D3D. That is what Transgaming has allowed for years, and apparently is what Valve did for Source. You obviously don't want to include that option, though, so let's just ignore it, and say D3D is only for Windows/XBox.

                            Wait, there have been efforts on writing native Gallium D3D drivers... but nah, the community rejects those because Microsoft is so so very evil, so let's ignore those, too. D3D is only for Windows/XBox.

                            I suppose I could bring up how Gallium internally is built on an API very similar to D3D10/11's, because APIs designed for actual real modern hardware are going to be pretty similar, and very different than OpenGL, but that's an entirely different tangent. D3D is only for Windows/XBox.

                            And to say it again, OpenGL works on pretty much every platform. Even if the PS3 ES version is slow, it still exists.
                            Except that it doesn't. There's an OpenGL ES derived API layer that is incompatible with OpenGL ES code you write for every other modern OpenGL ES based platform. It uses a different shading language than OpenGL ES, exposes features comparable to OpenGL ES 2.0 but does so in a non-standard way over an OpenGL ES 1.1 derived API, requires the use of non-standard proprietary extensions, and is generally about as much like OpenGL as Budweiser is like beer.

                            You're also still conveniently ignoring the other platforms that don't have anything that even remotely smells like OpenGL, which were listed as users of OpenGL.

                            Really though, even if PS3 had a complete desktop OpenGL 2.1 stack, the fact remains that it's unused. It's still not truthful to mention Sony as a "user" of OpenGL in a meaningful capacity. One might as well say that Microsoft is a user of Bob even though nobody external or internal to the company uses the damn thing and it's abandoned and unsupported.



                            • #44
                              Where can I find more information about porting games from Windows to Linux? I am trying to understand all these terms (such as abstraction layer, dynamic translation, wrapper, API, etc.).

                              PS: Please don't link me to Google... Links or books would be a better option. If you like, send them via PM.

                              Thanks



                              • #45
                                Originally posted by elanthis View Post
                                That's why Linux users get to go on and on about how Linux is a more secure OS while their fucking video drivers insert local root exploits directly into ring-0, and Windows users have an OS that recognizes that video drivers are scary, huge, complex beasts and completely isolates the driver from the app. On Linux, frequent video driver problems cause the entire desktop to boot out, the machine to seemingly lock up with no way to restore graphics, or just cause a kernel oops or a reboot. On Windows, your game gets a signal that it lost connection to the device, your screen goes black for a few seconds while the driver is restarted, and then you're either back in the game or at an otherwise unmolested desktop (with all your other apps still open, no data lost, etc.).
                                BS!!!

                                I stopped reading right there!!!

                                You, Sir, clearly didn't play enough Windows games!!! (And I'm not talking about indie games, I'm talking about AAA games!)

                                I can assure you that on Windows, too, I have had complete freezes of games and self-reboots because of video drivers!!!

