PhysX SDK Support Comes Back To Linux


  • nanonyme
    replied
    Well, I suppose it might, as he said, make sense not to assume SSE exists on other platforms, e.g. PowerPC (which uses AltiVec instead of SSE).
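    A minimal sketch of what that runtime dispatch could look like (hypothetical function name, not anything from the PhysX SDK): on Linux the kernel exposes the CPU's feature flags, so code can pick an SSE path only when it actually exists and fall back to scalar elsewhere.

    ```python
    import platform

    def pick_simd_path():
        # Hypothetical dispatch sketch: query what the CPU can actually do
        # instead of assuming SSE exists on every platform.
        machine = platform.machine()
        try:
            # On Linux the kernel lists the CPU's feature flags here.
            with open("/proc/cpuinfo") as f:
                info = f.read()
        except OSError:
            return "scalar"  # no flag info available: safest fallback
        if machine in ("x86_64", "i686") and "sse2" in info:
            return "sse2"
        if machine.startswith("ppc") and "altivec" in info:
            return "altivec"  # PowerPC's vector unit is AltiVec, not SSE
        return "scalar"

    print(pick_simd_path())
    ```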



  • energyman
    replied
    Makes sense, because Crysis runs so well on a Pentium MMX.

    ...

    Nvidia is just a bag of lying shit.



  • deanjo
    replied
    Originally posted by Nille View Post
    Sources please.
    Here you go:

    http://www.tgdaily.com/hardware-feat...pu-performance



  • Nille
    replied
    Originally posted by md1032 View Post
    nvidia pointed out that it was up to the application to use more than one core, and that game developers specifically asked them to make sure it worked on CPUs without SIMD instructions.
    Sources please.



  • md1032
    replied
    Originally posted by blackshard View Post
    Back to PhysX, and for all PhysX aficionados: it has been shown that it uses no more than one CPU core (bad) and no SIMD at all (very bad):

    http://www.realworldtech.com/page.cf...0510142143&p=4

    The most likely reason is to let Nvidia graphics cards shine against the full-software mode.
    nvidia pointed out that it was up to the application to use more than one core, and that game developers specifically asked them to make sure it worked on CPUs without SIMD instructions.
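    If multi-core use really is left to the application, the application has to partition the physics work itself. A minimal sketch of that pattern (hypothetical names, nothing from the PhysX API) splitting a per-particle update across worker threads:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def step_chunk(x, v, dt, lo, hi):
        # each worker integrates its own slice of the particle arrays
        for i in range(lo, hi):
            x[i] += v[i] * dt

    def step_parallel(x, v, dt, workers=4):
        # the application, not the engine, decides how to split the scene
        n = len(x)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for w in range(workers):
                lo, hi = w * n // workers, (w + 1) * n // workers
                pool.submit(step_chunk, x, v, dt, lo, hi)
        return x  # leaving the with-block waits for all chunks to finish

    print(step_parallel([0.0] * 8, [float(i) for i in range(8)], 0.5))
    ```

    In CPython the GIL means threads here illustrate the decomposition pattern rather than a real speedup; a native engine would run the same chunking on OS threads across cores.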



  • deanjo
    replied
    More references to it.

    http://www.theinquirer.net/inquirer/...-havok-physics

    http://www.xbitlabs.com/news/multime...echnology.html



  • deanjo
    replied
    Originally posted by blackshard View Post
    mmmh, I don't know what Intel thinks about AMD accelerating its software (Havok is Intel property):

    http://en.wikipedia.org/wiki/Havok_%28software%29
    AMD licensed Havok from Intel quite a few years ago already.



  • blackiwid
    replied
    So if you think you will develop programs only for Nvidia (Linux) users and don't care about the negative social effects you produce, go on; free software is about freedom, not about profit or the better solution. In the mid to long term the free approach mostly produces the better solution too, and in the long term the other solutions mostly die. If you want to rip off some people with such proprietary solutions, go on, but don't expect the bigger part of Linux users to buy or use your software. Even the open-source guys have mostly got the idea behind free software; they may make compromises (which are bad in my opinion), but as soon as there is another program that does the same or nearly the same as your solution, most Linux users will use it instead.

    So free software or open source is about giving and taking, not about forcing people onto a vendor. So yes, I don't tell you what to do, but I do say what I think about it.

    But even without these free-software ideologies, I personally don't get using a solution that works only with cards from one vendor, whether that vendor has 30, 40 or 60% market share.



  • blackshard
    replied
    mmmh, I don't know what Intel thinks about AMD accelerating its software (Havok is Intel property):

    http://en.wikipedia.org/wiki/Havok_%28software%29

    Originally posted by deanjo View Post
    See my above comment about Bullet. Bottom line is that the more complex the scene gets, the faster you approach the CPU's limitations.
    OK, I agree. But since we have powerful CPUs with four cores, games currently barely use two of them, and physics is a perfect streaming application, wouldn't it be a good idea to use SIMD and parallel execution on current x86 multicore CPUs?

    Back to PhysX, and for all PhysX aficionados: it has been shown that it uses no more than one CPU core (bad) and no SIMD at all (very bad):

    http://www.realworldtech.com/page.cf...0510142143&p=4

    The most likely reason is to let Nvidia graphics cards shine against the full-software mode.
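    To illustrate why a physics step counts as a "perfect streaming application": the same per-particle update can be written one element at a time (the shape of the single-threaded x87 loop the linked analysis describes) or over whole arrays, where NumPy's vectorized kernels use SIMD internally. A toy sketch, not PhysX code:

    ```python
    import numpy as np

    def step_scalar(x, v, dt):
        # one particle at a time: the shape of a scalar x87-style loop
        out = x.copy()
        for i in range(len(x)):
            out[i] += v[i] * dt
        return out

    def step_streaming(x, v, dt):
        # whole-array form; NumPy dispatches to SIMD loops internally
        return x + v * dt

    x = np.zeros(8)
    v = np.arange(8.0)
    assert np.allclose(step_scalar(x, v, 0.5), step_streaming(x, v, 0.5))
    ```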



  • deanjo
    replied
    Originally posted by blackshard View Post
    http://www.bit-tech.net/news/hardwar...ysics-at-gdc/1


    See my above comment about Bullet. Bottom line is that the more complex the scene gets, the faster you approach the CPU's limitations.

