
Valve's L4D2 Is Faster On Linux Than Windows


  • #81
    From what I read, Left 4 Dead 2 uses DirectX 9 and OpenGL 2.x, so the speed may not be very representative of DirectX 11 vs. OpenGL 4.2. And we already have Unigine Heaven with both renderers in a quite modern state...



    • #82
      Originally posted by ChrisXY View Post
      From what I read, Left 4 Dead 2 uses DirectX 9 and OpenGL 2.x, so the speed may not be very representative of DirectX 11 vs. OpenGL 4.2. And we already have Unigine Heaven with both renderers in a quite modern state...
      There is a comment from the Valve Linux team that talks about DX10 and OpenGL 3.2.



      • #83
        Originally posted by RussianNeuroMancer View Post
        Valve already confirmed two or three times (in comments to the blog post) that render quality is the same between Direct3D and OpenGL.
        They also wrote that they kept the default settings, and that they tested the game not only with Unity. Read all their comments.
        That info about desktops besides Unity came after my post...

        It is important to note that the test was run under Unity, bearing in mind that Unity's performance (as in: speed) is inferior to XFCE's or LXDE's.



        • #84
          3.9% difference? People are raving over 3.9%? :-/ Scaled down to normal frame rates (60 FPS), 3.9% translates to about 2 FPS (0.039 × 60 ≈ 2.3)...

          People, seriously. That's what you're so psyched up about? About 2 FPS?



          • #85
            Originally posted by RealNC View Post
            3.9% difference?
            Tell me, how did you count that?



            • #86
              Originally posted by Dukenukemx View Post
              I really doubt that OpenGL looks worse than DirectX. If anything it would probably look better.
              ... why? There's no reason to think either would look better. The hardware/drivers and the quality of the shaders and source content are what make something look good, not the API.

              Originally posted by Dukenukemx View Post
              Microsoft makes it pretty easy to develop for Windows, and most game developers only focus on Windows and Xbox. The PS3 is the only OpenGL console, and even then, not worth the effort for developers.
              Repeat after me: THE PS3 DOES NOT USE OPENGL. I'm really sick of that myth. There is a GL|ES-derived proprietary API available on the PS3, but essentially nobody uses it. The only console that even comes close to using any PC graphics API is the Xbox, but while we're on the subject, even many Xbox games partially bypass D3D today for extra efficiency on the seven-year-old locked-in-stone hardware design.

              Originally posted by Linuxxx View Post
              I wonder if he will ever reply...
              You know, that Phoronix forum member who thinks he is a graphics API god but is really nothing more than a Microsoft fanboy.
              Hi darling.

              I'm really touched that you remembered me after my not posting here in... how many weeks? And then decided to write a post just about me, and how much you miss me, and how much you wish you could angrily tell me off, if only you could just see me one more time. Creepy, dude. ... I really hope you don't have pictures of me taped up to a wall inside your closet or something. I know I look great in a bathing suit and all, but boundaries, man.

              > Always saying how much better and faster Direct3D is compared to OpenGL

              Still true. D3D is much better. The API is also still faster. One engine's benchmarks across three completely different OS+driver stacks do not prove much, especially when they're at odds with many other engines' cross-API benchmarks. That's like concluding that because one person's favorite color is red, everyone's favorite color must be red, while a whole group of people is screaming at you that they prefer blue.

              > saying that you can't really have multi-threading with OpenGL...

              OpenGL's API really does not allow multi-threading. Some optional platform-specific OpenGL bindings (the EGL/WGL/AGL/GLX bits) do kind of allow (with a great huge pain-in-the-ass set of kludges) the ability to create an extra GL context per thread, share resources between them, and marshal rendering commands through the main thread. Which is awkward, because OpenGL is still retarded and still combines both the device context and the output surface into the same hidden magic global object, unlike a sane API where each is a distinct API-exposed interface. OpenGL's threading possibilities are non-portable and very likely to crash your driver if you look at it wrong; possibly your entire OS, too.
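
              Here's roughly what that kludge looks like with GLX plus pthreads. A sketch only: the globals and loader_thread are made-up names, and real code needs actual error handling. The worker context shares resources with the main one, uploads a texture, and then the id has to be handed back to the main thread by some means GL itself doesn't provide:

                #include <stdint.h>
                #include <pthread.h>   /* spawn with pthread_create(&tid, NULL, loader_thread, NULL); */
                #include <GL/glx.h>

                Display     *dpy;       /* opened elsewhere */
                XVisualInfo *vis;       /* chosen elsewhere */
                GLXContext   main_ctx;  /* the main rendering thread's context */
                GLXDrawable  drawable;  /* window/pbuffer, created elsewhere */

                static void *loader_thread(void *unused)
                {
                    (void)unused;
                    /* Second context sharing textures/buffers with main_ctx. */
                    GLXContext load_ctx = glXCreateContext(dpy, vis, main_ctx, True);
                    glXMakeCurrent(dpy, drawable, load_ctx);

                    GLuint tex;
                    glGenTextures(1, &tex);
                    glBindTexture(GL_TEXTURE_2D, tex);
                    /* ... glTexImage2D() uploads go here ... */
                    glFinish();  /* make the upload visible to the other context */

                    /* Handing 'tex' back (queue, atomic, whatever) is entirely
                     * your problem; the API gives you nothing for it. */
                    glXMakeCurrent(dpy, None, NULL);
                    glXDestroyContext(dpy, load_ctx);
                    return (void *)(uintptr_t)tex;
                }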

              If you're a company with Valve's pull and have the ability to tell the hardware vendors to fix their broken drivers, that's one thing. If you're the other 99% of small/indie/hobbyist developers and you run into one of those problems, you're stuck bent over a table. Which is the actual industry experience. Google some interviews with indie devs about OpenGL; user support costs due to driver bugs are huge, far larger than they are with D3D. Or heck, just ask id about Rage and the large amount of negative user feedback caused by the shoddy drivers and the issues they caused, especially for those poor saps who don't update their video drivers the second a new one comes out.

              > Probably too embarrassed now to see why he just isn't competent enough to work for ValvE.

              That seems to be in contradiction to direct evidence.

              > Have fun convincing yourself that OpenGL isn't worth it, while real programmers use it just fine instead of whining on the Phoronix forums

              You are both whining on Phoronix forums _and_ you're not a real programmer at Valve. Just sayin'.

              Originally posted by Nevertime View Post
              'This experience led to the question: why does an OpenGL version of our game run faster than Direct3D on Windows 7? It appears that it's not related to multitasking overhead. We have been doing some fairly close analysis and it comes down to a few additional microseconds overhead per batch in Direct3D which does not affect OpenGL on Windows. Now that we know the hardware is capable of more performance, we will go back and figure out how to mitigate this effect under Direct3D.'
              This is a well-known thing. D3D on Windows has a higher overhead per draw call due to the separate-process WDDM model. The D3D driver is very well insulated from user space (shared state) and the kernel (for stability). This alone is the primary reason why BSODs are so incredibly rare on Windows Vista/7 compared to XP (or, in my experience, kernel panics on Linux). Microsoft found that the NVIDIA and AMD drivers caused over 90% of all BSODs and explicitly designed WDDM to work around the problem, since it didn't have the ability to force the vendors to write quality drivers (the same problem Linux has with the vendors).

              The GL drivers use some of the WDDM infrastructure for general integration, but as they are not forced into the entirety of the WDDM model, the driver vendors are free to strip away some layers. The result can be less latency in draw calls... at the cost of greatly reduced stability and safety. Sure, the Linux DRI/DRM stack has less overhead, but then the Linux GL drivers are actually capable of causing kernel oopses quite easily (back when r600g was still fairly new, I was getting around 2-3 per week with just basic desktop usage). +10% performance on one specific engine benchmark that already runs at well over the minimum necessary speed is maybe not worth the loss in stability and security to most people who actually think about it.

              In practice, this draw call latency doesn't matter a whole lot unless you're making a _lot_ of draw calls. Which under-utilizes the hardware anyway. So don't do that.
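
              To put a shape on "a lot", here's a sketch of the difference, assuming a hypothetical set_crate_transform() helper for the per-object uniforms (glDrawElementsInstanced is core since GL 3.1):

                /* Pays the per-call API/driver overhead 10,000 times: */
                for (int i = 0; i < 10000; ++i) {
                    set_crate_transform(i);  /* hypothetical per-object setup */
                    glDrawElements(GL_TRIANGLES, num_indices, GL_UNSIGNED_SHORT, 0);
                }

                /* Pays it once; per-crate transforms come out of a buffer
                 * indexed with gl_InstanceID in the vertex shader: */
                glDrawElementsInstanced(GL_TRIANGLES, num_indices,
                                        GL_UNSIGNED_SHORT, 0, 10000);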

              Also, recall that this difference in draw call latency is entirely an _implementation_ issue, not an API issue. If you slapped a D3D11 state tracker onto Gallium, you would not find any slowdown or draw-call overhead on D3D11. Quite the opposite, since the D3D11 state tracker would be a fraction of the size of the OpenGL one; just like Gallium itself was designed based on how hardware works and what drivers need to do internally, so was D3D10/11.

              Point being, comparing implementations of OpenGL and D3D is not a direct API-to-API comparison, especially when on two different OSes. Huge parts of the stack underneath the API can be implemented totally differently, and extra speed does not always come for free.

              Originally posted by darkbasic View Post
              Is it faster on the Intel linux driver compared to windows? I don't care for blobs...
              I was under the impression that the Linux driver was already way faster than the Windows one, no? I mean, it's not hard; the Intel Windows driver is the largest pile of shit ever produced by mankind.

              Originally posted by artivision View Post
              OpenGL has extensions(ARB) of the protocol build inside GPUs.
              As a nit, extensions have nothing to do with a "protocol built inside GPUs." Maybe it's a language barrier thing, but saying that makes it sound like you don't understand how a GPU actually works, nor a graphics API for that matter.

              Most developers _hate_ extensions. They don't want to have to deal with them. They don't want to have to sprinkle their code with tons of "if feature A then X, else if feature B then Y, else if version C then Z, else tell the user to stop using Intel and go buy a video card with supported drivers." It's a pain in the ass, it takes more time to develop, it takes WAAAY more time to test and QA, etc.
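
              For anyone lucky enough never to have written that code, here's a sketch of the dance, assuming a GL 3.0+ context and an extension loader that gives you glGetStringi, with anisotropic filtering as the example feature:

                #include <string.h>

                static int has_extension(const char *name)
                {
                    GLint i, n = 0;
                    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
                    for (i = 0; i < n; ++i)
                        if (!strcmp((const char *)glGetStringi(GL_EXTENSIONS, i), name))
                            return 1;
                    return 0;
                }

                /* ...and then, per feature, forever: */
                if (has_extension("GL_EXT_texture_filter_anisotropic"))
                    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);
                else
                    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                                    GL_LINEAR_MIPMAP_LINEAR);  /* settle for trilinear */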

              Developers want to say, "you need a year 2010 or later video card" and then write against a stable, well-known, always-available API for that class of hardware. That's essentially what D3D gives them. OpenGL botched it with OpenGL 3.x, given that it was meant to compete with D3D10 yet didn't have most of the major features until 3.2. You can't rely on OpenGL 3 actually meaning anything in terms of support. And then there's Intel, who only just recently added GL 4.2 support to their newest D3D11 hardware, and left all their older chips stuck at GL 3.1 (or worse) despite the hardware having support for D3D 10.1. Yay Intel.

              Funnily enough, I just got into this very argument with a couple of devs from another local studio about this yesterday. They were totally on the anti-Win8, pro-Steam-on-Linux bandwagon and yet still hate OpenGL and its god-awful API. I recall the phrase "**** extensions, who the **** wants to deal with that ****ing ****" being drunkenly shouted at me in an otherwise quiet Kirkland bar last night, by a guy who I'm pretty sure has never written a line of D3D code in his life.

              There are certainly a lot of nerds in the game industry, and a lot of them have at least played around with Linux and are interested in it. I end up being the local Linux expert given my many, many years of being a hardcore Windows-hating Linux nerd, so I always get dragged into conversations about it. I obviously am not a huge fan of desktop Linux at all these days, but Valve has put a lot of new industry focus on Linux with their announcement. Interesting times.

              As it stands right now, Linux is still a pile of crap for the desktop. Valve may or may not help to change that. Given the last 13 years of "next year will finally be the Year of the Linux Desktop" and all the big huge rich companies that have come and gone in desktop Linux's history, I personally have long since given up on Linux ever being anything besides a hobbyist toy on the desktop and a kick-ass server/embedded platform.

              > Important No. 3: OpenGL is "Open". So you can find it inside GPU drivers on all operating systems, and you can develop libraries for it: Imagination has OpenRL, while NVIDIA develops a voxel ray-tracing solution for OpenGL (id Tech 6).

              OpenGL is not Open. I know it's confusing with the name in there and all. It's a trademarked API that has patented features, same as D3D. There are docs on the API that you can use to implement an OpenGL-workalike. There are docs on D3D that you can use to implement a D3D-workalike. Exact same boat.

              As far as NVIDIA developing things goes, you do realize that NVIDIA, AMD, Intel, Imagination, etc. have all been involved with the D3D specs for a very long time, right? Microsoft doesn't just make some random guess about what hardware will do and then invent a spec for it. Hardware takes years to come to fruition, and if any vendor ended up being unable to implement part of a new spec, it would kill the API dead. The vendors were involved in the design of D3D11 over half a decade ago, and they're very much involved in D3D12's design now. A number of higher-profile game developers are also involved, not to mention the ridiculously huge internal userbase of Microsoft itself (95,000 employees, over 150,000 if you count contractors and part-timers) and the subsidiary game studios owned by Microsoft. There is _significantly_ more feedback from actual API users put into D3D than there ever has been in OpenGL. The OpenGL specification process is completely non-transparent and closed off to anyone outside the Khronos GL committee.

              The only major company that has ever pulled an API out of their ass with no input from or collaboration with hardware vendors was... wait for it... 3DLabs, the wonderful folks who designed GLSL and large parts of what was supposed to be OpenGL 2. Every time you deal with some awkward, screwed-up part of GLSL and how it completely mismatches how any piece of hardware ever released has ever worked, you can thank them for that. When you deal with glBindAttribLocation and the (until very recently) non-separable shader model in OpenGL, you can thank them for that. When you look longingly at HLSL and the low-overhead state model of Direct3D that directly maps between what a graphics engine actually does and how hardware actually works, you should remember how much vendor communication matters compared to a self-assigned "Open"-prefixed name.
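
              For anyone who hasn't had the pleasure, a sketch of the glBindAttribLocation dance (the shader handles are assumed to be created elsewhere); the call-order trap is the classic gotcha:

                GLuint prog = glCreateProgram();
                glAttachShader(prog, vertex_shader);
                glAttachShader(prog, fragment_shader);

                /* Must happen BEFORE glLinkProgram; called afterwards, it
                 * silently does nothing until the *next* link. */
                glBindAttribLocation(prog, 0, "in_position");
                glBindAttribLocation(prog, 1, "in_normal");

                glLinkProgram(prog);

              GL 3.3's layout(location = N) qualifiers finally moved this into the shader itself, which is roughly where HLSL's input semantics had been all along.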

              Originally posted by Scali View Post
              I can add another one:
              4: Poor driver support from all vendors except nVidia
              True: NVIDIA's OpenGL drivers are the best.

              Sad fact: NVIDIA's OpenGL drivers are still far less stable and buggier than their D3D drivers.

              I can still, two years later, totally fubar the rendering performance of any machine (Windows or Linux) with NVIDIA drivers, thanks to an unfixed bug in their GL FBO management. Which, incidentally, you can trigger with their own demos.

              Sigh.



              • #87
                ^^^ I missed posts like that.



                • #88
                  I want to see some benchmarks for the HD 4000 and AMD graphics chips. I know that NVIDIA's OpenGL is basically at the same level, a few percent more or less between Windows and Linux, though you usually don't test 100% the same release anyway. Of course 300 FPS is nice, but you only need 60 FPS for current TFTs. Of course, when you have too much money, you can certainly get an NVIDIA GTX 680.



                  • #89
                    So...

                    Windows + super drivers vs. Linux with shitty drivers = Linux wins by 2 FPS.

                    Still a success, IMO.

                    Linux can gain more FPS in the future; it just needs some stuff and drivers fixed. Windows can't gain more FPS.



                    • #90
                      Originally posted by RealNC View Post
                      3.9% difference? People are raving over 3.9%? :-/ Scaled down to normal frame rates (60 FPS), 3.9% translates to about 2 FPS (0.039 × 60 ≈ 2.3)...

                      People, seriously. That's what you're so psyched up about? About 2 FPS?
                      I won't discuss the percentage per se, or your numbers... the point is that Linux is FASTER... can you imagine what the MS fanboys would say if it were the OTHER WAY AROUND?

                      They would start an endless round of Linux bashing...

                      There was a thread on a news site announcing the start of Windows 8 mass production, with some bitter discussion between Mac, Windows and Linux supporters... I simply copied and pasted the Phoronix article and it dropped like a tactical nuke... for some time there was only silence... when discussion restarted, it restarted in a completely different tone: MS fans were having a hard(er) time defending Windows, and the prospect of going Linux started to look more attractive.

