PhysX SDK Support Comes Back To Linux

  • #11
    Knowing that Unreal Tournament 3 relies on PhysX for all its physics needs, and even has a map pack exclusive to owners of PhysX cards, this is a good sign.

    Comment


    • #12
      PhysX is a bad joke anyway. Every decent CPU would be faster if NVIDIA didn't cripple the CPU path - and it is pretty hard to force compilers to generate x87 code instead of SSE...

      Comment


      • #13
        Originally posted by droidhacker View Post
        VDPAU is most DEFINITELY proprietary. They may have released an API, but they have NOT released the necessary HARDWARE PROGRAMMING DOCUMENTATION to actually implement it. As long as the secret sauce in their blob remains secret, VDPAU remains proprietary.
        You're right that NVIDIA's VDPAU implementation is proprietary, but that doesn't make VDPAU *itself* proprietary.

        Originally posted by droidhacker View Post
        As for APU's not having enough graphics horsepower... To do what exactly? All that they *NEED* to do to virtually WIPE NVIDIA OUT is DRAW THE SCREEN. To top that though, have you actually TRIED an AMD APU?
        I'm sorry, I should have clarified... people also like having games not be a slideshow. I haven't tried an AMD APU because I'd have to replace my whole computer, whereas I can just swap in a different GPU for a fraction of the cost. Also they don't exist yet.

        Really good physics processing absolutely does require a GPU, despite what you might read on SemiAccurate or wherever.

        Comment


        • #14
          and it is pretty hard to force compilers to generate x87 code instead of SSE...
          gcc -mfpmath=387
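          For anyone who wants to see the difference, here's a minimal sketch (the file name and the toy kernel are made up for illustration, not anything from PhysX itself) that you can build both ways and then diff the generated assembly - the 387 build comes out full of fld/fmul/faddp stack ops, while the SSE build uses mulss/addss:

          // dot.cpp - tiny floating-point kernel for comparing x87 vs. SSE code generation
          //
          // 32-bit x87 build (needs multilib): g++ -O2 -m32 -mfpmath=387 -S dot.cpp -o dot_387.s
          // 32-bit SSE build:                  g++ -O2 -m32 -mfpmath=sse -msse2 -S dot.cpp -o dot_sse.s
          #include <cstdio>

          float dot(const float *a, const float *b, int n)
          {
              float acc = 0.0f;
              for (int i = 0; i < n; ++i)
                  acc += a[i] * b[i];   // this is the math the compiler lowers to x87 or SSE
              return acc;
          }

          int main()
          {
              float a[4] = {1, 2, 3, 4};
              float b[4] = {4, 3, 2, 1};
              std::printf("%f\n", dot(a, b, 4));
              return 0;
          }

          (On x86-64, GCC already defaults to SSE for scalar float math, so the comparison is easiest to see in a 32-bit build.)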

          Comment


          • #15
            Even John Carmack stated that PhysX is just a gimmick that doesn't serve much purpose. Here's a video of him discussing that: link

            I'm glad AMD decided against licensing PhysX from Nvidia. It would add unnecessary cost to their GPUs (however small it may be) and have minimal benefit.

            Comment


            • #16
              Originally posted by BlueJayofEvil View Post
              Even John Carmack stated that PhysX is just a gimmick that doesn't serve much purpose. Here's a video of him discussing that: link
              The company was a joke, but a hardware-accelerated physics engine is not. See the real-time cloth and better-looking bullet hits/glass breaking in Mafia II:
              http://physxinfo.com/news/2967/mafia...ects-hands-on/

              Comment


              • #17
                Originally posted by yogi_berra View Post
                The company was a joke, but a hardware-accelerated physics engine is not. See the real-time cloth and better-looking bullet hits/glass breaking in Mafia II:
                http://physxinfo.com/news/2967/mafia...ects-hands-on/
                All things that a modern quad-core processor can't do... or can it?

                It's very funny to see that here, on a prominent Linux hardware/software site, people still care about such a bunch of crippled, proprietary code.

                Go and give your support to the open-source Bullet Physics engine instead of celebrating this marketing-driven, x87-compiled blob...
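
                For anyone who hasn't looked at Bullet, here's a minimal sketch of what getting started with it looks like - a dynamics world, a static ground plane, and a falling sphere. The file name, constants and build line are just illustrative (header/library paths vary by distro):

                // bullet_drop.cpp - minimal Bullet Physics example: a sphere dropped onto a plane
                // build (paths vary): g++ bullet_drop.cpp -I/usr/include/bullet -lBulletDynamics -lBulletCollision -lLinearMath
                #include <btBulletDynamicsCommon.h>
                #include <cstdio>

                int main()
                {
                    // standard Bullet setup: collision config, dispatcher, broadphase, solver, world
                    btDefaultCollisionConfiguration config;
                    btCollisionDispatcher dispatcher(&config);
                    btDbvtBroadphase broadphase;
                    btSequentialImpulseConstraintSolver solver;
                    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
                    world.setGravity(btVector3(0, -9.8f, 0));

                    // static ground plane at y = 0
                    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
                    btDefaultMotionState groundMotion;
                    btRigidBody::btRigidBodyConstructionInfo groundInfo(0.0f, &groundMotion, &groundShape);
                    btRigidBody ground(groundInfo);
                    world.addRigidBody(&ground);

                    // dynamic 1 kg sphere starting 10 units above the plane
                    btSphereShape sphereShape(1.0f);
                    btVector3 inertia(0, 0, 0);
                    sphereShape.calculateLocalInertia(1.0f, inertia);
                    btDefaultMotionState sphereMotion(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
                    btRigidBody::btRigidBodyConstructionInfo sphereInfo(1.0f, &sphereMotion, &sphereShape, inertia);
                    btRigidBody sphere(sphereInfo);
                    world.addRigidBody(&sphere);

                    // step at 60 Hz and watch the sphere fall and settle on the plane
                    for (int i = 0; i < 180; ++i) {
                        world.stepSimulation(1.0f / 60.0f);
                        btTransform t;
                        sphere.getMotionState()->getWorldTransform(t);
                        std::printf("t=%.2fs  y=%.3f\n", i / 60.0f, t.getOrigin().getY());
                    }

                    world.removeRigidBody(&sphere);
                    world.removeRigidBody(&ground);
                    return 0;
                }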

                Comment


                • #18
                  Originally posted by yogi_berra View Post
                  The company was a joke, but a hardware-accelerated physics engine is not. See the real-time cloth and better-looking bullet hits/glass breaking in Mafia II:
                  http://physxinfo.com/news/2967/mafia...ects-hands-on/
                  The first, the "clothing module," will realistically model flowing clothes on what are designated as "primary characters."
                  This could be done in software and processed on a GPU without the need for PhysX. PhysX may help accelerate this processing, but that doesn't mean it can't be done otherwise (see the CPU cloth sketch at the end of this post).

                  The second APEX module, the "destruction module," models realistic damage and deformation on exploding objects, as well as the concussive force of a powerful explosion with an "invisible force field" that realistically sends any sufficiently light objects (or characters) flying.
                  This has been done before with and without PhysX. Again, it may help accelerate it and/or make it easier to program, but that doesn't mean it's necessary.
                  For example, the MMORPGs City of Heroes/Villains implemented PhysX support shortly after AGEIA came to market. You can enable the effects without a PhysX card/Nvidia GPU and let the hardware you do have (CPU/GPU) do the work.

                  It also offers Mafia II an enhanced particle system that creates discrete and unique, procedurally generated debris when you or other characters destroy any of the game's deformable objects.
                  (emphasis theirs)

                  Many games have done this without PhysX, or at least without specifically needing PhysX-enabled hardware. City of Heroes/Villains, Half-Life 2 and Doom 3 are just a few examples.

                  I'm not going to bother quoting more of that article, as I'd just be repeating the points I've already made.

                  As multi-core CPU and multi-GPU setups are becoming more common, I don't see the need for special hardware or vendor lock-in (Nvidia) for some added special effect acceleration. PhysX would likely do better in a console.
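
                  To back up the point about the clothing module: cloth has been done on the CPU for years with the classic Verlet mass-spring approach. Here's a rough sketch of that technique - the grid size, constants and output are made up for illustration, and a real game would add shear/bend constraints, collision and rendering, but the core is this small:

                  // cloth_cpu.cpp - toy CPU cloth: Verlet particles on a grid with distance constraints
                  #include <cmath>
                  #include <cstdio>
                  #include <vector>

                  struct V3 { float x, y, z; };

                  int main()
                  {
                      const int   W = 16, H = 16;        // particle grid; row 0 is pinned in place
                      const float rest = 0.1f;           // rest length between neighbouring particles
                      const float dt = 1.0f / 60.0f;
                      std::vector<V3> pos(W * H), prev(W * H);

                      // lay the cloth out flat, hanging from the pinned top row
                      for (int y = 0; y < H; ++y)
                          for (int x = 0; x < W; ++x)
                              pos[y * W + x] = prev[y * W + x] = {x * rest, 0.0f, y * rest};

                      for (int step = 0; step < 300; ++step) {
                          // 1) Verlet integration with gravity; skip the pinned row 0
                          for (int i = W; i < W * H; ++i) {
                              V3 p = pos[i];
                              pos[i].x += p.x - prev[i].x;
                              pos[i].y += p.y - prev[i].y - 9.8f * dt * dt;
                              pos[i].z += p.z - prev[i].z;
                              prev[i] = p;
                          }
                          // 2) relax the structural constraints (each particle to its right and lower neighbour)
                          for (int pass = 0; pass < 4; ++pass)
                              for (int y = 0; y < H; ++y)
                                  for (int x = 0; x < W; ++x)
                                      for (int k = 0; k < 2; ++k) {
                                          int nx = x + (k == 0 ? 1 : 0), ny = y + (k == 1 ? 1 : 0);
                                          if (nx >= W || ny >= H) continue;
                                          int i = y * W + x, j = ny * W + nx;
                                          float dx = pos[j].x - pos[i].x;
                                          float dy = pos[j].y - pos[i].y;
                                          float dz = pos[j].z - pos[i].z;
                                          float len = std::sqrt(dx * dx + dy * dy + dz * dz);
                                          if (len < 1e-6f) continue;
                                          float s = 0.5f * (len - rest) / len;   // half-correction per endpoint
                                          if (y != 0) {                          // never move pinned particles
                                              pos[i].x += dx * s; pos[i].y += dy * s; pos[i].z += dz * s;
                                          }
                                          if (ny != 0) {
                                              pos[j].x -= dx * s; pos[j].y -= dy * s; pos[j].z -= dz * s;
                                          }
                                      }
                      }
                      // after 5 simulated seconds, see how far a bottom corner has sagged
                      std::printf("bottom-left corner y = %.3f\n", pos[(H - 1) * W].y);
                      return 0;
                  }

                  Scale the grid up and thread the constraint passes and a modern quad-core handles a lot of cloth - which is the point being made here: dedicated PhysX hardware makes this faster, not possible.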

                  Comment


                  • #19
                    Originally posted by blackiwid View Post
                    I doubt that; the most common combination is surely netbooks or notebooks with Intel GPUs.

                    And at the moment AMD sells more graphics cards than Nvidia. I don't think most Linux users buy graphics cards only for Linux. I'll give you that Nvidia was the best option for Linux in the past, but the proprietary AMD drivers have received a lot of love since the ATI days, and the open ones are getting better day by day.

                    So if you start a game or program today that uses PhysX, you can't look only at the current state; you have to look to the future. Even if you ignore integrated solutions, Nvidia is losing its dominance of the whole market, and I think it would be strange to assume that trend will be completely ignored on Linux.

                    And again, even if you were right and there were enough Nvidia cards to justify writing software specifically for them under Linux, why would you want to use Linux if you want proprietary vendor lock-in solutions? Why not use Windows if you don't care about that?
                    Dude, you are so f..n wrong.
                    I don't know about the "general" public, but the Linux users around me tend to have not just more than one computer, but 6-10 cores total (4+2+2+2 or any other combination) with at least 2GB per machine - and of course an Nvidia card, for one simple reason: it does what it's supposed to:
                    • 3D acceleration
                    • HW Video de/encoding
                    • Games!!! The very few native ones available, and many more through WINE

                    And of course, it works, and not ONLY on paper!!!
                    Most of the Linux users around me, myself included, have been using it this way for something like 10 or more years. It has always been this way, and it will stay this way for some time.
                    The worst VGA experience I ever had was an HD 4850: barely bearable under Windows, and useless under Linux. Actually, I lied... it was useful during winter, since it made enough heat to warm the space beneath the table.
                    I won't even mention laptop graphics.

                    Comment


                    • #20
                      Originally posted by blackiwid View Post
                      So to use this under Linux, you must be one of the few Linux users who have a newer Nvidia card and use the binary Nvidia drivers. I don't think that makes much sense. Why use Linux if you like proprietary vendor-specific solutions? I don't get it.
                      Lol, do you live under a rock or something? Nvidia is the single most used GPU according to the Phoronix Graphics Survey from '07-'09, and most likely will be again this year.

                      Originally posted by blackiwid View Post
                      And again, even if you were right and there were enough Nvidia cards to justify writing software specifically for them under Linux, why would you want to use Linux if you want proprietary vendor lock-in solutions? Why not use Windows if you don't care about that?
                      Mmmm, I'm going to use whatever solution performs best for what I need it to do... open or closed...

                      Comment
