
Windows 8 Outperforming Ubuntu Linux With Intel OpenGL Graphics


  • #71
    Originally posted by elanthis View Post
    The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.

    A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.

    This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.

    That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.

    Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.
    This is very interesting info for me! Could you provide a link to the LightsMark broken geometry issue? Thanks
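    A minimal, purely hypothetical sketch of the app-detection tactic elanthis describes, assuming the driver simply matches the executable name against a hard-coded list (the names below are illustrative, not taken from any real driver):

    ```python
    # Hypothetical sketch: a driver special-casing known benchmarks by
    # matching the running executable's name. A driver doing this would
    # then enable app-specific shortcuts, e.g. more aggressive culling
    # that only looks correct from the benchmark's scripted camera.
    def is_known_benchmark(exe_name: str) -> bool:
        targets = ("lightsmark", "3dmark")  # illustrative names only
        return any(t in exe_name.lower() for t in targets)

    print(is_known_benchmark("Lightsmark.exe"))  # True
    print(is_known_benchmark("mygame.exe"))      # False
    ```

    This is also why modified benchmarks expose the trick: renaming the executable or moving the camera takes the application off the special-cased path.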

    Comment


    • #72
      Would the AF quality count as cheating?

      Earlier Intel Windows drivers had terrible AF quality, but with driver tweaks they got it to filter properly (and more slowly). So it was either a real oversight, or an attempted cheat that was removed when detected.

      Comment


      • #73
        ...

        The title of this disgusting brown-nosing article is very misleading. It should be "Windows 8 CLOSED SOURCE DRIVERS perform better than LINUX OPEN SOURCE DRIVERS, but Linux CLOSED SOURCE DRIVERS PERFORM THE SAME AS WINDOWS 8 DRIVERS with OpenGL". That should be the title, clueless shill. BTW, Valve games run much better for me than on my wife's shitty Windows 8 computer.

        Comment


        • #74
          Originally posted by flooby View Post
          The title of this disgusting brown-nosing article is very misleading. It should be "Windows 8 CLOSED SOURCE DRIVERS perform better than LINUX OPEN SOURCE DRIVERS, but Linux CLOSED SOURCE DRIVERS PERFORM THE SAME AS WINDOWS 8 DRIVERS with OpenGL". That should be the title, clueless shill. BTW, Valve games run much better for me than on my wife's shitty Windows 8 computer.
          That remark
          a) is insulting beyond bounds
          b) suggests adjusting the title for nuance

          There is no reason to adjust the title. Windows GPU drivers are closed source. Linux GPU drivers are either open source or closed source. In Intel's case, they are open source only. So there is no need to
          a) insult the author of the article
          b) suggest adjusting the title

          I hope you don't ever reproduce...

          Comment


          • #75
            Originally posted by Rexilion View Post
            That remark
            a) is insulting beyond bounds
            b) suggests adjusting the title for nuance

            There is no reason to adjust the title. Windows GPU drivers are closed source. Linux GPU drivers are either open source or closed source. In Intel's case, they are open source only. So there is no need to
            a) insult the author of the article
            b) suggest adjusting the title

            I hope you don't ever reproduce...

            +1

            Lots of people do not like the data, so they try to downplay its importance.

            You CAN compare OpenGL performance on just one GPU :P

            Comment


            • #76
              Originally posted by elanthis View Post
              The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.

              A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.

              This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.

              That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.

              Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.

              I'm really glad that someone understands. Games and benchmarks speak to the driver, not directly to the hardware. If the driver wants to cheat, it will cheat; there is no technology available to measure the quality of the picture. In fact, when you have 2x GPUs you only get +50% FPS; that's because the driver goes into quality and precision mode, and the same with double the shaders. My opinion is this:

              NV-Kepler= [email protected]_(Intel comparison) = [email protected]_(AMD comparison) = [email protected]=trioperant_(AMD HD2000-6000, G80-300, PS3, XBOX360 comparison).

              AMD-HD7000= [email protected] = [email protected]=trioperant.

              Intel-4000= [email protected] = [email protected] [email protected]=trioperant.

              Also, there is no exact way to compare open source drivers with the closed ones, because the closed ones cheat. If you ask me, Intel's open and Intel's closed drivers are equal. They also share the same OpenGL code. How the hell did some of you figure out that they are different? Make your brain think!

              Comment


              • #77
                Originally posted by artivision View Post
                I'm really glad that someone understands. Games and benchmarks speak to the driver, not directly to the hardware. If the driver wants to cheat, it will cheat; there is no technology available to measure the quality of the picture. In fact, when you have 2x GPUs you only get +50% FPS; that's because the driver goes into quality and precision mode, and the same with double the shaders.
                The title of the article mentioned testing the drivers, not the cards. So what's the point?

                Furthermore, FPS is only one of many metrics defining 'performance'. So a 50% increase in FPS will not increase performance by 50%.
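                As a side note on why FPS alone is a shaky performance metric: FPS is the reciprocal of frame time, so averages of it can hide stutter. A small illustration (all numbers invented):

                ```python
                # Average FPS over a run = frames delivered / total elapsed time (ms).
                def avg_fps(frame_times_ms):
                    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

                smooth = [20.0, 20.0, 20.0, 20.0]    # steady 50 FPS
                spiky  = [10.0, 10.0, 10.0, 100.0]   # mostly 100 FPS, one big hitch

                print(avg_fps(smooth))  # 50.0
                print(avg_fps(spiky))   # ~30.8, and the hitch dominates perceived smoothness
                ```

                Two runs can share a headline FPS figure yet feel completely different to the user, which is one reason a single FPS number understates (or overstates) real-world performance.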

                Originally posted by artivision View Post
                My opinion is this:

                NV-Kepler= [email protected]_(Intel comparison) = [email protected]_(AMD comparison) = [email protected]=trioperant_(AMD HD2000-6000, G80-300, PS3, XBOX360 comparison).

                AMD-HD7000= [email protected] = [email protected]=trioperant.

                Intel-4000= [email protected] = [email protected] [email protected]=trioperant.
                I have no idea what this is about.

                Originally posted by artivision View Post
                Also there is not an exact way to compare open source drivers with the closed ones, because the closed ones cheat.
                The fact that drivers are closed source does not make them cheat by definition.

                Originally posted by artivision View Post
                If you ask me Intels_open and Intels_closed are equals. Also they share the same OpenGL code.
                First you assert that closed source drivers cheat, and now you say that in Intel's case they don't, because they share some code for rendering.

                I don't think the Intel DRM driver part is shared with Windows.

                Originally posted by artivision View Post
                How the hell some of you figure out that are different? Make your brain think!
                I just did, and you speak nonsense.

                Comment


                • #78
                  Originally posted by artivision View Post
                  Also, there is no exact way to compare open source drivers with the closed ones, because the closed ones cheat. If you ask me, Intel's open and Intel's closed drivers are equal. They also share the same OpenGL code. How the hell did some of you figure out that they are different? Make your brain think!
                  The Intel open and closed source OpenGL drivers share no code whatsoever. They're completely independent.

                  Comment


                  • #79
                    Originally posted by Rexilion View Post
                    The title of the article mentioned testing the drivers, not the cards. So what's the point?

                    Furthermore, FPS is only one of many metrics defining 'performance'. So a 50% increase in FPS will not increase performance by 50%.

                    I have no idea what this is about.

                    The fact that drivers are closed source does not make them cheat by definition.

                    First you assert that closed source drivers cheat, and now you say that in Intel's case they don't, because they share some code for rendering.

                    I don't think the Intel DRM driver part is shared with Windows.

                    I just did, and you speak nonsense.

                    The numbers above are the real Gflops of today's GPUs. I'm sure the Radeon losers don't like it. How is it possible for a 3.8 Tflops card to be near a 6.4 Tflops one? Obviously closed = cheating.

                    Intel's closed and Intel's open drivers don't have the same source, but they share lines of code and technology. They are not two different departments. When the closed one gets something, the open one gets the same afterwards. They probably have different configurations; that's why the 10% difference in frames.

                    Comment


                    • #80
                      Originally posted by artivision View Post
                      The numbers above are the real Gflops of today's GPUs. I'm sure the Radeon losers don't like it. How is it possible for a 3.8 Tflops card to be near a 6.4 Tflops one? Obviously closed = cheating.
                      I still don't understand your point.

                      Originally posted by artivision View Post
                      Intel's closed and Intel's open drivers don't have the same source, but they share lines of code and technology. They are not two different departments. When the closed one gets something, the open one gets the same afterwards. They probably have different configurations; that's why the 10% difference in frames.
                      Doubt it. Did it ever occur to you that the Linux driver has to do things differently with the Linux kernel, and the Windows driver with the Windows kernel? It's not just the driver itself that matters.

                      And to be honest, a 10% performance decrease is perfectly okay if it means I get to run Linux instead of Windows XP.

                      Comment
