What NVIDIA's Linux Customers Want

  • #21
    Originally posted by Shining Arcanine
    There are several items that need attention that people are reporting at nvnews.net:
    • Overclocking on GeForce 8 series cards is broken
    • Twinview does not work properly with the 200 series and up drivers.
    • 2D Performance is awful on the GeForce 7 series (e.g. my GeForce Go 7900 GS)


    You could probably find more if you looked. Anyway, it seems that Nvidia's current development resources cannot support more architectures, add more features and fix all of these issues at once. With that in mind, it does not surprise me that they are asking for help.
    I have a 7900GS in my desktop and 2D/3D work fine with KDE. Maybe it's a bug in your system's configuration?



    • #22
      Originally posted by pingufunkybeat
      The regular Linux desktop users do not make Nvidia execs blink either.

      The binary drivers are there because it's relatively little work for them, since they have a cross-platform workstation driver anyway. Not because of enthusiast Linux-Wine gamers and their millions of bucks.

      As soon as anything Linux related actually requires proper Linux-related work (Optimus, xrandr, full XRender extension support, etc.) you get nothing.
      Is it really that cheap for them?
      I thought it would still be quite some work to get it to work with Xorg.



      • #23
        Originally posted by AnonymousCoward
        Is it really that cheap for them?
        I thought it would still be quite some work to get it to work with Xorg.
        Obviously only the people working on the closed driver would know this, but they replace most of the X functionality they need with their own in-house solution originally written for Windows.

        More than 95% of the driver is cross-platform code that has nothing to do with Linux.



        • #24
          In the past, I always bought Nvidia cards because "it's the way it's meant to be played"... on Linux, BUT...

          After I got my quad-core Q6600 CPU (with an 8600GTS), I ran into a problem where X pegged one core at 100% and everything froze. I did some research, and everyone on the Nvidia forum told me to use maxcpus=1 or nosmp as a simple fix... (They call that a fix...)

          After this bug and another one cost me a couple of "Internet Spaceships", I decided to buy an ATI HD4670... All the problems vanished (OK... 30 minutes of geeking to get the two screens working...) and time passed...

          A year later, I decided I wanted to run 3 EVE-Online clients at the same time with Wine, so... I bought an Nvidia 9600_something_big. When I got home, I found out that the card had only ONE DVI output and nothing else (no VGA port)... I did not know at the time that single-head cards from Nvidia still existed, so... a day later, I swapped it for another 9600_something with two heads...

          And guess what? The famous 100% CPU usage bug in Xorg still had not been fixed after all that time; on the Nvidia forum, they were still pushing the stupid magic nosmp boot parameter that supposedly fixes everything!

          So... 2 days later, I went to the shop with the card and explained the problem to the guy, and he exchanged it for an ATI HD4870 for free... Of course it took some geeking to make everything work, BUT I was able to use all my CPU cores. And now that the S3TC hooks have landed in r600g, I don't need the ATI blob anymore to play my internet spaceship game, and I can run Git versions of everything without any blob problems!

          So, no more Nvidia for a while (at least until nouveau works as well as or better than r600g).

          PS: A friend of mine had the same problem (the one that can be "fixed" easily with the magic nosmp boot parameter) with different hardware and a different distribution at his workplace. They replaced all the cards with ATI ones (on my advice) and the problem vanished!
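
          For anyone curious what the "magic nosmp boot parameter" workaround actually involves: nosmp and maxcpus=1 are stock kernel command-line parameters that disable all but one CPU, which only masks the Xorg 100%-CPU bug by throwing away the other cores. A minimal sketch of how they would typically be applied, assuming a GRUB 2 setup (file paths and update commands vary by distribution):

          ```shell
          # /etc/default/grub -- WORKAROUND ONLY, not a fix:
          # maxcpus=1 limits the kernel to a single CPU, hiding the
          # driver bug at the cost of every other core in the machine.
          GRUB_CMDLINE_LINUX_DEFAULT="quiet splash maxcpus=1"

          # Afterwards, regenerate the GRUB config and reboot, e.g.:
          #   sudo update-grub                               # Debian/Ubuntu
          #   sudo grub2-mkconfig -o /boot/grub2/grub.cfg    # Fedora/openSUSE
          ```

          Which is exactly why people in the thread don't consider it a fix: it trades a driver bug for losing SMP entirely.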

