Valve's L4D2 Is Faster On Linux Than Windows


  • #91
    Since Source is already ported to Mac, I would like to see a comparison between it and Linux; it would also be interesting to know whether Source on Mac benefited from the Linux port in any great way.



    • #92
      Originally posted by elanthis View Post
      ... why? There's no reason to think either would look better. The hardware/drivers and the quality of the shaders and source content is what makes something look good, not the API.

      ...

      Sigh.
      What Valve shows is that (no matter how much you hate OpenGL) it is still possible to write code that performs as fast as, if not faster than, code using the MS API. So it is more a proficiency issue.

      Yes, OpenGL might not be noob-friendly and carries some old cruft around. But it gets the job done. Noobs/indies should not have to touch OpenGL at all; use an engine that abstracts the renderer, imho.

      Btw, driver quality is a chicken-and-egg issue: when only a few are really taxing the API, how can you expect top-notch driver support for it?



      • #93
        Originally posted by elanthis View Post
        And then there's Intel, who only just recently added GL 4.2 support to their newest D3D11 hardware, and left all their older chips stuck at GL 3.1 (or worse) despite the hardware having support for D3D 10.1.
        Does it actually work? I mean the gaming experience.
        Originally posted by AJSB View Post
        There was a thread on a news site announcing the start of Windows 8 mass production, and there was some bitter discussion between Mac, Windows, and Linux supporters... I simply copied and pasted the Phoronix article and it dropped like a tactical nuke... for some time there was only silence... when the discussion restarted, it restarted in a completely different tone: MS fans were having a hard(er) time defending Windows, and the prospect of going Linux started to look more attractive.
        So, where can we read it?



        • #94
          Originally posted by RussianNeuroMancer View Post
          Does it actually work? I mean the gaming experience.
          So, where can we read it?
          Do you understand Portuguese?!?



          • #95
            Originally posted by RussianNeuroMancer View Post
            Does it actually work? I mean the gaming experience.
            I have a laptop from ~2007, which has a DX10-capable Intel IGP, the X3100.
            I have been able to run Crysis on it in DX10 mode. The framerate was poor and not very playable, but technically it was able to handle everything.
            On the OpenGL side it only supports OpenGL 2.0 (which is equivalent to base-level Direct3D 9/SM2.0, rather than the D3D10/SM4.0 that the chip is capable of). I can run neither Doom 3 nor Rage on it, despite both being considerably less graphically challenging than Crysis in DX10 mode.
            So it's purely being held back by the OpenGL driver, not the hardware.
            If I had proper drivers (or if the games were written with D3D), I'm sure it would be fast enough to play Doom 3 and Rage.
            But sadly, I get good DX10 drivers and no OpenGL.



            • #96
              Originally posted by NomadDemon View Post
              Windows + super drivers vs Linux with shitty drivers = Linux wins by 2 fps.
              No, they used the nVidia Linux drivers, which are highly optimized. Aside from that, the OpenGL code in nVidia's binary drivers is shared between all OSes, so there is no quality difference in the OpenGL implementation. The only difference is in the OS-specific parts of the driver, and nVidia has gone to great lengths to optimize their Linux drivers (even reimplementing a large part of Xorg's resource handling in the driver).
              Last edited by Scali; 03 August 2012, 07:50 AM.



              • #97
                Originally posted by Scali View Post
                No, they used the nVidia Linux drivers, which are highly optimized. Aside from that, the OpenGL code in nVidia's binary drivers is shared between all OSes, so there is no quality difference in the OpenGL implementation. The only difference is in the OS-specific parts of the driver, and nVidia has gone to great lengths to optimize their Linux drivers (even reimplementing a large part of Xorg's resource handling in the driver).
                They also used NVIDIA drivers on Windows 7... and they are also highly optimised on Windows (maybe even more so than on Linux, almost for sure).



                • #98
                  Originally posted by AJSB View Post
                  They also used NVIDIA drivers on Windows 7... and they are also highly optimised on Windows (maybe even more so than on Linux, almost for sure).
                  I never said otherwise. I just said that 'shitty drivers' is not quite correct.
                  It's comparing apples and oranges, but at the least we can say both drivers are optimized quite well.
                  I'm quite sure that we won't see the same results on AMD hardware. My experience has always been that AMD's OpenGL performance is good on Windows, near the level of nVidia's OpenGL, and in line with D3D performance on the same hardware as well... But on Linux you take a considerable hit with AMD (I assume AMD also shares the OpenGL codebase between Windows and Linux, so the difference is mostly caused by the other optimizations nVidia put in, where AMD just sticks to standard Xorg stuff, etc.).



                  • #99
                    Originally posted by RussianNeuroMancer View Post
                    Tell me, how do you count?
                    315 * 100 / 303 = 103.96%

                    So if we take Windows OGL as the base (100%), then Linux OGL performance is 103.96%. That's 3.96% higher.

                    With a 60FPS base, a 3.96% difference translates to:

                    60 * 0.0396 = 2.37FPS

                    You get 2.3-2.4 more FPS on Linux. It's elementary school math ^_^

                    Edit:
                    Oh, and obviously the lower the overall performance (for example, extreme graphics settings that drop to 30FPS or lower), the smaller the FPS difference. In 30FPS scenes you only get about 1FPS more on Linux.
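
The arithmetic above can be sketched as a small Python helper (the function name and the 60FPS/30FPS baselines are illustrative, not taken from any benchmark tool):

```python
def fps_gain(windows_fps: float, linux_fps: float, base_fps: float = 60.0) -> float:
    """Absolute FPS gain at a given baseline framerate.

    windows_fps and linux_fps are the benchmark averages (e.g. 303 vs 315);
    the relative speedup between them is applied to base_fps to get the
    absolute FPS delta at that framerate.
    """
    ratio = linux_fps / windows_fps      # 315 / 303 ≈ 1.0396, i.e. ~3.96% faster
    return base_fps * (ratio - 1.0)

# At a 60FPS baseline the ~3.96% speedup is worth roughly 2.4 FPS;
# at a 30FPS baseline it shrinks to roughly 1.2 FPS.
print(round(fps_gain(303, 315), 2))
print(round(fps_gain(303, 315, base_fps=30.0), 2))
```

This is just the percentage calculation from the post restated as code; it assumes the FPS delta scales linearly with the baseline framerate, which is what the "30FPS scenes" remark above implies.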
                    Last edited by RealNC; 03 August 2012, 08:01 AM.



                    • 2.374 fps are at least 10K points _more_ in the 3DMark 2014 Benchmark :-)
                      Last edited by fritsch; 03 August 2012, 08:02 AM. Reason: forget the _more_ while doing a joke

