Nvidia's x87 PhysX anti-CPU cheat


  • Nvidia's x87 PhysX anti-CPU cheat

    http://www.realworldtech.com/page.cf...WT070510142143

    "In particular, through our experiments we found that PhysX uses an exceptionally high degree of x87 code and no SSE, which is a known recipe for poor performance on any modern CPU. "

    Nvidia is just cheating like hell to sell more Nvidia GPUs!
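
    For context on what the quoted claim means in practice (a generic illustration, not PhysX source or its actual disassembly): the same one-line C computation turns into very different instruction sequences depending on whether the compiler targets the x87 FPU stack or SSE registers.

    Code:
    /* Generic illustration; not PhysX source or its actual disassembly. */
    float madd(float a, float b, float c)
    {
        return a * b + c;
    }

    /* Typical 32-bit x87 output (stack-based FPU, operands pushed and popped):
     *     fld   dword ptr [a]
     *     fmul  dword ptr [b]
     *     fadd  dword ptr [c]     ; result left on the x87 register stack
     *
     * Typical scalar SSE output (flat registers, no stack juggling):
     *     movss xmm0, [a]
     *     mulss xmm0, [b]
     *     addss xmm0, [c]         ; result in xmm0
     *
     * The article's point is that CPU PhysX was shipping code like the former
     * on processors where the latter (and packed SSE) has been available for years. */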

  • #2
    A German source on the Nvidia cheat: http://www.heise.de/newsticker/meldu...s-1036153.html

    • #3
      / care

      • #4
        Originally posted by RealNC
        / care
        ???? How can you not care?

        • #5
          We don't have any goddam PhysX client/driver/game/whatever on Linux, right? That makes your point moot.

          • #6
            Of course there's no CPU-based PhysX client on Linux. That's even more proof of Nvidia crippling PhysX on CPUs.

            • #7
              Originally posted by FunkyRider
              We don't have any goddam PhysX client/driver/game/whatever on Linux, right? That makes your point moot.
              You can play Windows games in Wine, and PhysX works on the CPU in Wine.

              But Nvidia slows the CPU path down so much that you can't.

              • #8
                I think that claim is only half true. Forcing the compiler to create inferior code is certainly true.

                But as a GPU company, it would not be sensible to spend resources on creating a multithreaded PhysX CPU runtime. It might even be quite difficult to do that. Certainly not a "half-hour job by a competent dev" as is implied in there.
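
                To gesture at why it is more than a trivial job (a minimal sketch, assuming a made-up body_t type and nothing from the actual PhysX SDK): the integration step parallelises trivially across threads, but the contact/constraint solver does not, because touching bodies have to be solved together, and building and batching those "islands" is where a real multithreaded CPU runtime costs engineering effort.

                Code:
                #include <pthread.h>
                #include <stddef.h>

                /* Hypothetical rigid-body state (an illustration, not the PhysX API). */
                typedef struct { float px, py, pz, vx, vy, vz; } body_t;
                typedef struct { body_t *bodies; size_t begin, end; float dt; } slice_t;

                /* Integration is embarrassingly parallel: each body touches only its
                 * own state, so disjoint slices can run on separate threads.          */
                static void *integrate_slice(void *arg)
                {
                    slice_t *s = arg;
                    for (size_t i = s->begin; i < s->end; ++i) {
                        body_t *b = &s->bodies[i];
                        b->vy -= 9.81f * s->dt;   /* gravity */
                        b->px += b->vx * s->dt;
                        b->py += b->vy * s->dt;
                        b->pz += b->vz * s->dt;
                    }
                    return NULL;
                }

                void integrate_parallel(body_t *bodies, size_t n, float dt, int nthreads)
                {
                    pthread_t tid[8];
                    slice_t   sl[8];
                    if (nthreads < 1) nthreads = 1;
                    if (nthreads > 8) nthreads = 8;
                    size_t chunk = (n + nthreads - 1) / nthreads;
                    for (int t = 0; t < nthreads; ++t) {
                        size_t lo = t * chunk, hi = lo + chunk;
                        sl[t] = (slice_t){ bodies, lo, hi < n ? hi : n, dt };
                        pthread_create(&tid[t], NULL, integrate_slice, &sl[t]);
                    }
                    for (int t = 0; t < nthreads; ++t)
                        pthread_join(tid[t], NULL);
                }

                /* The part that does NOT split this cleanly is the contact/constraint
                 * solver: bodies in the same contact island must be solved together,
                 * so a proper multithreaded runtime needs island building, batching
                 * and load balancing on top of this; real work, but hardly impossible. */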

                • #9
                  Originally posted by FunkyRider
                  We don't have any goddam PhysX client/driver/game/whatever on Linux, right? That makes your point moot.
                  Shadowgrounds and Shadowgrounds: Survivor

                  • #10
                    Originally posted by curaga
                    I think that claim is only half true. Forcing the compiler to create inferior code is certainly true.

                    But as a GPU company, it would not be sensible to spend resources on creating a multithreaded PhysX CPU runtime. It might even be quite difficult to do that. Certainly not a "half-hour job by a competent dev" as is implied in there.
                    I think it's a full week's job for a competent developer to port and test all of the automatically/manually SSE-optimized code to an x87 path.

                    Nowadays, x87 code isn't something compilers produce by default.
                    You have to force it (a concrete flags example is at the end of this post).

                    Further optimizing the SSE code is yet another thing, and would probably have taken about the same resources as working out a crippled code path.

                    Nvidia crippled its own software to milk old and new customers.
                    Remember that there are games that only work with all their features when run on Nvidia hardware.
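
                    To make the "you have to force it" point concrete (assuming a stock GCC or MSVC toolchain, nothing specific to Nvidia's actual build system): the very same scalar C code comes out as x87 or as SSE depending only on the flags it is compiled with.

                    Code:
                    /* vec_scale.c: illustration only, not PhysX source. */
                    void vec_scale(float *v, int n, float s)
                    {
                        for (int i = 0; i < n; ++i)
                            v[i] *= s;   /* x87 build: fld/fmul/fstp   SSE build: movss/mulss/movss */
                    }

                    /* 32-bit GCC, x87 path (the old default for plain i686 targets):
                     *     gcc -m32 -O2 -mfpmath=387 -S vec_scale.c
                     * 32-bit GCC, scalar SSE path:
                     *     gcc -m32 -O2 -msse2 -mfpmath=sse -S vec_scale.c
                     * 32-bit MSVC switches from x87 to SSE2 with /arch:SSE2.
                     * On x86-64, both compilers use SSE for float math by default. */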

                    • #11
                      Originally posted by Loris
                      Nvidia crippled its own software to milk old and new customers. Remember that there are games that only work with all their features when run on Nvidia hardware.
                      Then again, there is no concept of standardized, bloated "extensions" on GPUs; their developers may have been pushed into going for maximum compatibility rather than highly optimized x86 code, especially considering that the CPU port just isn't a priority.

                      Who knows how this came to be. All I hear is FUD. In closed-source development, politics play a huge role. Most project heads will trim the work for dev teams to the absolute minimum needed to get the feature out the door. Optimization and bugfixing come later, and only when people (customers) complain about it.

                      • #12
                        There are no games that are crippled without PhysX support. PhysX has always been, and will likely always be, cosmetic. ATI has a partner program and Nvidia has TWIMTBP. Both amount to the same thing, and both encourage developers to optimise code paths and features which take advantage of vendor-specific bullet points. Nvidia isn't doing anything illegal; they are just concentrating their expenditure where it will gain them the most income. It makes sense for them to supply optimised AltiVec codepaths to game console SDKs, because it earns them money, just as it makes no sense to do the same for non-Nvidia GPU owners, because it may cost them money. Ageia made (and, now that they are rolled into Nvidia, still makes) the PhysX SDK, and they never sought to add SSE codepaths, so why should Nvidia?

                        The simple answer is that they shouldn't, and the person who made that technical analysis knows this too.

                        There is no conspiracy going on, but there certainly is a reason for an attempted smear campaign.

                        • #13
                          Originally posted by IsawSparks
                          they never sought to add SSE codepaths
                          Yeah, because they wrote the library in 1995 when Pentium was all the rage.

                          • #14
                            I think the person who wrote that article is not saying they should optimise their code paths for SSE2(+) instructions, but that they should stop forcing the compiler to generate x87 code. The compiler may not produce great SSE code, but even naive SSE2 code should beat x87 (see the sketch at the end of this post).

                            The way I read the article, he is not commenting so much on x87 vs SSE as on GPU vs CPU. nVidia claim the GPU is the way of the future, but their "2-4x faster" benchmarks lean on a crippled CPU path, and that is what the article quite clearly demonstrates. He emphasises that in crippling the CPU path nVidia have done nothing wrong.

                            However, this is at best misleading marketing, and probably is crossing the line into outright lying!
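
                            As a rough sketch of what even naive SSE2 buys you (a toy dot-product kernel, nothing to do with the real PhysX code): just building scalar code with SSE enabled already replaces the x87 stack shuffling, and a few intrinsics on top of that process four floats per instruction.

                            Code:
                            #include <xmmintrin.h>   /* SSE intrinsics */

                            /* Toy kernel, not PhysX code. Built with SSE enabled, even this
                             * scalar loop compiles to mulss/addss instead of x87 fmul/fadd.  */
                            float dot_scalar(const float *a, const float *b, int n)
                            {
                                float sum = 0.0f;
                                for (int i = 0; i < n; ++i)
                                    sum += a[i] * b[i];
                                return sum;
                            }

                            /* Hand-written packed SSE: four multiply-adds per iteration.
                             * Assumes n is a multiple of 4 and 16-byte aligned inputs, to
                             * keep the sketch short.                                        */
                            float dot_sse(const float *a, const float *b, int n)
                            {
                                __m128 acc = _mm_setzero_ps();
                                for (int i = 0; i < n; i += 4) {
                                    __m128 va = _mm_load_ps(a + i);
                                    __m128 vb = _mm_load_ps(b + i);
                                    acc = _mm_add_ps(acc, _mm_mul_ps(va, vb));
                                }
                                float tmp[4];
                                _mm_storeu_ps(tmp, acc);
                                return tmp[0] + tmp[1] + tmp[2] + tmp[3];
                            }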
