PhysX SDK Support Comes Back To Linux


  • PhysX SDK Support Comes Back To Linux

    Phoronix: PhysX SDK Support Comes Back To Linux

    Back in 2006, a start-up company known as AGEIA launched the PhysX PPU, the first Physics Processing Unit (PPU): games and applications that use the PhysX API could offload their physics calculations onto this discrete processor to boost overall system performance...


  • #2
    I hope only a few use this proprietary technology

    I hope not many people make software for Linux with this, because it is unfree, you need an unfree driver, and you can only use it with Nvidia graphics cards.

    So to use this under Linux, you must be one of the few people who use Linux, have a newer Nvidia card, and use the binary Nvidia drivers. I don't think that makes much sense. Why use Linux if you like proprietary, vendor-specific solutions? I don't get it.

    I usually like it when companies support Linux, but not when they only do it to force you to buy their hardware (and nothing from a competing graphics vendor) to use it.

    With Intel selling the most graphics chips, AMD in second place, and AMD selling the most discrete graphics cards, I don't expect Nvidia can keep PhysX alive in the mid to long term.

    Nvidia, go to gnu.org or even opensource.org, read for 15 minutes about the ideas behind Linux, and then come back with open drivers, specs, and other open stuff.

    • #3
      Originally posted by blackiwid:
      I don't expect Nvidia can keep PhysX alive in the mid to long term.
      This is an interesting thought, because it is strongly related to the *real* reason behind this move from nvidia... and believe me, they aren't interested in supporting open source or doing ANYTHING that will make your life easier.

      Basically, nvidia is watching the market and beginning to realize that they are slowly getting squeezed out. The most lucrative part of the GPU business is selling the low to mid-range parts in HUGE VOLUMES. The high-end parts may carry a hefty price tag, but there isn't enough VOLUME there to amount to much. As both intel and amd move toward combined CPU+GPU=APU components, this will effectively cut nvidia out of this most lucrative part of the business, so HOW will nvidia cope? The way they have chosen is by trying to get their proprietary tentacles into everything (CUDA, PHYSX, VDPAU, etc.).... If developers USE these proprietary nvidia tentacles, then it is nvidia's HOPE that it results in VENDOR LOCK-IN, which would necessitate the use of a discrete nvidia graphics card. In my opinion, this is the only way that nvidia CAN continue since they don't have much of an x86 CPU to compete with the likes of intel or amd. In other words, they SIMPLY CAN'T SURVIVE in a world dominated by APUs unless there is *some reason* why the average person NEEDS something from them.

      So out comes support for their proprietary junk that nobody needs *now*, in the hope that *everybody* will need it before it's too late.

      • #4
        VDPAU is not proprietary. Also, combined CPU+GPU chips can never perform well; they'll barely have enough graphics horsepower to draw the screen, let alone process physics.

        • #5
          Originally posted by droidhacker:
          This is an interesting thought, because it is strongly related to the *real* reason behind this move from nvidia... and believe me, they aren't interested in supporting open source or doing ANYTHING that will make your life easier.

          Basically, nvidia is watching the market and beginning to realize that they are slowly getting squeezed out. The most lucrative part of the GPU business is selling the low to mid-range parts in HUGE VOLUMES. The high-end parts may carry a hefty price tag, but there isn't enough VOLUME there to amount to much. As both intel and amd move toward combined CPU+GPU=APU components, this will effectively cut nvidia out of this most lucrative part of the business, so HOW will nvidia cope? The way they have chosen is by trying to get their proprietary tentacles into everything (CUDA, PHYSX, VDPAU, etc.).... If developers USE these proprietary nvidia tentacles, then it is nvidia's HOPE that it results in VENDOR LOCK-IN, which would necessitate the use of a discrete nvidia graphics card. In my opinion, this is the only way that nvidia CAN continue since they don't have much of an x86 CPU to compete with the likes of intel or amd. In other words, they SIMPLY CAN'T SURVIVE in a world dominated by APUs unless there is *some reason* why the average person NEEDS something from them.

          So out comes support for their proprietary junk that nobody needs *now*, in the hope that *everybody* will need it before it's too late.
          I thought at one time Nvidia offered to let AMD use CUDA on their GPUs as well. I don't know if they wanted anything from AMD in exchange or if the offer is still open.

          • #6
            Originally posted by blackiwid:
            So to use this under Linux, you must be one of the few people who use Linux, have a newer Nvidia card, and use the binary Nvidia drivers.
            That pretty much describes the most common hardware combination among Linux users.

            • #7
              I doubt that; the most common combination is surely netbooks or notebooks with Intel GPUs.

              And at the moment AMD sells more graphics cards than Nvidia. I don't think most Linux users buy graphics cards only for Linux. I'll give you that in the past Nvidia was the best option for Linux, but the proprietary AMD drivers have received a lot of love since the ATI days, and the open ones are getting better day by day.

              So if you start a game or program today that uses PhysX, you cannot only look at the current state; you have to look into the future. Even if you ignore integrated solutions, Nvidia is losing its dominance of the overall market, and I think it would be strange to assume that trend will be completely ignored under Linux.

              And again, even if you were right and there were enough Nvidia cards to justify writing Linux software specifically for them, why would you want to use Linux if you also want proprietary vendor lock-in solutions? Why not use Windows if you don't care about that?

              • #8
                So the SDK is back AND they put out a 64-bit version too this time. Good news for anyone developing on Linux who's interested in implementing PhysX. Now the question is whether they'll ever put out a hardware driver for PhysX on Linux.
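
                For anyone wondering what "implementing PhysX" looks like on the code side, here is a rough, hedged sketch of the 2.8.x-era C++ API (names recalled from memory, so check them against the headers that actually ship with the Linux SDK): create the SDK object, create a scene, and step the simulation each frame.

                #include <NxPhysics.h>  // umbrella header of the 2.8.x SDK; include path may differ per install

                int main()
                {
                    // Create the SDK object; returns NULL if the runtime is missing.
                    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
                    if (!sdk)
                        return 1;

                    // Describe and create a scene with ordinary gravity.
                    NxSceneDesc sceneDesc;
                    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
                    NxScene* scene = sdk->createScene(sceneDesc);

                    // One simulation step: start the solver, then collect the results.
                    // The Linux SDK runs this on the CPU; hardware acceleration is the part that is still missing here.
                    scene->simulate(1.0f / 60.0f);
                    scene->flushStream();
                    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

                    sdk->releaseScene(*scene);
                    NxReleasePhysicsSDK(sdk);
                    return 0;
                }

                Actor setup and the cooking library are omitted; this is just the skeleton of the per-frame loop.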

                • #9
                  Originally posted by md1032:
                  VDPAU is not proprietary. Also, combined CPU+GPU chips can never perform well; they'll barely have enough graphics horsepower to draw the screen, let alone process physics.
                  VDPAU is most DEFINITELY proprietary. They may have released an API, but they have NOT released the necessary HARDWARE PROGRAMMING DOCUMENTATION to actually implement it. As long as the secret sauce in their blob remains secret, VDPAU remains proprietary.
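
                  To make the open-API-versus-closed-implementation point concrete, here is a rough sketch (untested, assuming a working libvdpau install and an X display): the one documented entry point in the open wrapper library hands you a table of function pointers, and every one of those pointers resolves into the vendor's backend driver.

                  #include <X11/Xlib.h>
                  #include <vdpau/vdpau.h>
                  #include <vdpau/vdpau_x11.h>
                  #include <cstdio>

                  int main()
                  {
                      Display* dpy = XOpenDisplay(NULL);
                      if (!dpy)
                          return 1;

                      // The open part: libvdpau's documented entry point.
                      VdpDevice device;
                      VdpGetProcAddress* get_proc_address = NULL;
                      if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &device,
                                                &get_proc_address) != VDP_STATUS_OK)
                          return 1;

                      // Everything past this point is a function pointer handed back
                      // by the backend driver (in Nvidia's case, the closed blob).
                      VdpGetInformationString* get_info = NULL;
                      get_proc_address(device, VDP_FUNC_ID_GET_INFORMATION_STRING,
                                       (void**)&get_info);

                      const char* info = NULL;
                      if (get_info && get_info(&info) == VDP_STATUS_OK)
                          std::printf("VDPAU backend: %s\n", info);

                      XCloseDisplay(dpy);
                      return 0;
                  }

                  Built with something like g++ vdpau_check.cpp -lvdpau -lX11, it prints whichever backend string the driver reports; the point is that the string, and everything behind it, comes out of the blob rather than anything you can rebuild yourself.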

                  As for APUs not having enough graphics horsepower... To do what exactly? All that they *NEED* to do to virtually WIPE NVIDIA OUT is DRAW THE SCREEN. To top that though, have you actually TRIED an AMD APU?

                  As for processing physics... you're missing the point. You don't NEED to process "physics". Especially not in the Nvidia way! This is what we want to AVOID and is what they are TRYING to lock us into. Either way though, its use will remain a fringe market.

                  BTW: AMD's APUs are looking like they will seriously beat intel's parts to death -- at least in graphics power.... http://www.xbitlabs.com/news/cpu/dis...rocessors.html

                  • #10
                    Originally posted by pvtcupcakes:
                    I thought at one time Nvidia offered to let AMD use CUDA on their GPUs as well. I don't know if they wanted anything from AMD in exchange or if the offer is still open.
                    Yeah, all they wanted in exchange for it was AMD's silicon specs... so that nvidia could build a blob to run it on AMD chips at a far lower level of performance than it runs on their own (and steal all the competitive advantages AMD has at the same time).

                    Sound like a good deal to you?
