What NVIDIA's Linux Customers Want


  • RavFX
    replied
    In the past, I always bought Nvidia cards because "it's the way it's meant to be played"... on Linux, BUT..

    After I got my quad-core Q6600 CPU (with an 8600GTS), I ran into a problem where X pinned one core at 100% and everything froze.. I did some research, and everyone on the Nvidia forum told me to use maxcpus=1 or nosmp as a simple fix.... (They call that a fix...)

    After this bug and another one cost me a couple of "Internet Spaceships", I decided to buy an ATI HD4670... All the problems vanished (OK... 30 minutes of geeking to get the two screens working...) and time passed...

    One year later, I decided I wanted to run 3 EVE-Online clients at the same time with Wine, so.... I bought a Nvidia 9600_something_big. When I got home, I found out that the card had only ONE DVI output and nothing else (no VGA port)... I did not know at that time that Nvidia still made single-head cards, so.. one day later, I swapped it for another 9600_something with two heads...

    Know what? The famous 100% xorg CPU usage bug still hadn't been fixed after all that time, and on the nvidia forum they still hit me with the stupid magic nosmp boot parameter that supposedly fixes everything!

    So.. 2 days later, I went back to the shop with the card and explained the problem to the guy, and he exchanged it for an ATI HD4870 for free... Of course it took some geeking to make everything work, BUT I was able to use all my CPU cores. And now that the S3TC hooks have landed in r600g, I don't need the ATI blob anymore to play my internet spaceship game, and I can run GIT everything without any blob problems!

    So no more Nvidia for a while (at least until nouveau works as well as, or better than, r600g).

    PS: A friend of mine hit the same problem (the one that is so "easily fixed" with the magic nosmp boot parameter, sketched below) with different hardware and a different distribution at his workplace; they replaced all the cards with ATI ones (on my advice) and the problem vanished!
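
    For reference, that "magic" workaround boils down to a kernel boot parameter that disables all but one CPU core; a minimal sketch of what it looked like in a GRUB-legacy menu.lst entry of the era (the kernel version and root device here are illustrative assumptions):

        title  Linux (single-CPU workaround for the 100% Xorg bug)
        # nosmp boots the kernel with a single CPU; maxcpus=1 has the
        # same effect here. Either one merely hides the driver bug by
        # throwing away the other three cores of a Q6600.
        kernel /boot/vmlinuz-2.6.35 root=/dev/sda1 ro nosmp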

  • pingufunkybeat
    replied
    Originally posted by AnonymousCoward
    Is it really that cheap for them?
    I thought it would still take quite some work to get it running with Xorg.
    Obviously only the people working on the closed driver would know for sure, but they actually replace most of the X infrastructure they need with their own in-house solution originally written for Windows.

    More than 95% of the driver is cross-platform code that has nothing to do with Linux.

  • AnonymousCoward
    replied
    Originally posted by pingufunkybeat
    The regular Linux desktop users do not make Nvidia execs blink either.

    The binary drivers are there because it's relatively little work for them, since they have a cross-platform workstation driver anyway. Not because of enthusiast Linux-Wine gamers and their millions of bucks.

    As soon as anything Linux-related actually requires proper Linux-specific work (Optimus, xrandr, full XRender extension support, etc.), you get nothing.
    Is it really that cheap for them?
    I thought it would still take quite some work to get it running with Xorg.

  • DeepDayze
    replied
    Originally posted by Shining Arcanine
    There are several items that need attention that people are reporting at nvnews.net:
    • Overclocking on GeForce 8 series cards is broken.
    • TwinView does not work properly with the 200-series and later drivers.
    • 2D performance is awful on the GeForce 7 series (e.g. my GeForce Go 7900 GS).


    You could probably find more if you looked. Anyway, it seems that Nvidia's current development resources cannot support more architectures, add more features and fix all of these issues at once. With that in mind, it does not surprise me that they are asking for help.
    I have a 7900GS in my desktop and 2D/3D work fine with KDE. Maybe it's a bug in your system's config?

  • pingufunkybeat
    replied
    Originally posted by AnonymousCoward
    As much as I agree with you, I'm pretty sure the few customers that think like that won't even make one Nvidia exec blink...
    The regular Linux desktop users do not make Nvidia execs blink either.

    The binary drivers are there because it's relatively little work for them, since they have a cross-platform workstation driver anyway. Not because of enthusiast Linux-Wine gamers and their millions of bucks.

    As soon as anything Linux-related actually requires proper Linux-specific work (Optimus, xrandr, full XRender extension support, etc.), you get nothing.

  • pingufunkybeat
    replied
    Originally posted by Thunderbird
    Optimus support will likely never come to the blob. Essentially, it would require Nvidia to integrate an Intel driver into the blob, since the way Optimus works is that the Nvidia GPU does all its rendering into the framebuffer of the Intel GPU (after a frame is done, a memcpy to the Intel framebuffer happens). The Intel GPU does all the presentation to the main display, so a full Intel 2D driver, including mode setting and all the other hell, is needed.
    This would be SO easy with open drivers. They would share most of the infrastructure, so no problem at all (the per-frame copy dance Thunderbird describes is sketched below).

    And people actually claim that a closed-source solution is technically better. If it makes writing drivers impossible, then it's not better.
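
    A minimal, self-contained sketch of the per-frame flow Thunderbird describes; all the types, names, and the buffer size are illustrative assumptions, not a real driver API:

        #include <string.h>

        #define FB_BYTES (1920 * 1080 * 4)   /* one 1080p RGBA framebuffer */

        struct gpu {
            unsigned char framebuffer[FB_BYTES];
        };

        /* Stand-in for the discrete NVIDIA GPU doing all the 3D rendering. */
        static void render_frame(struct gpu *discrete)
        {
            memset(discrete->framebuffer, 0xff, FB_BYTES);
        }

        /* Once a frame is done it is memcpy'd into the Intel GPU's
         * framebuffer; the Intel GPU alone scans out to the display,
         * which is why the blob would also need a full Intel 2D and
         * mode-setting driver on the presentation side. */
        static void present_frame(struct gpu *discrete, struct gpu *igp)
        {
            render_frame(discrete);
            memcpy(igp->framebuffer, discrete->framebuffer, FB_BYTES);
        }

        int main(void)
        {
            static struct gpu nvidia_gpu, intel_gpu;  /* static: too large for the stack */
            present_frame(&nvidia_gpu, &intel_gpu);
            return 0;
        }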

  • testerus
    replied
    Originally posted by Qaridarium
    From my point of view, paying 2-3 open-source devs is very cheap for a company like Nvidia.
    They would be sacrificing their premium workstation cards, which are differentiated from consumer hardware mainly through special drivers.

  • SciFiDude79
    replied
    Just give me a good open source driver. I hate going to test out a Linux distro in Live mode and having the screen look all wonky, or the desktop not work right, because I'd have to install a proprietary driver, something you can't do without installing the OS, which kind of defeats the purpose of Live distros.

    Once you get the proprietary drivers installed, Nvidia cards rock (except on KDE, but that's as much KDE's fault as it is Nvidia's); before that, they suck. Give me an open source driver with 3D support that works as well as the proprietary one, so you can truly test a Linux distro before you install it, and I'll be happy. You can keep some of those other features, IMO.

  • testerus
    replied
    2D performance still sucks.
    Just try to play freeciv_gtk with the nvidia driver. Nouveau is so much better there.

  • mikus
    replied
    I would kill to finally see multi-monitor support work. I've been using 4 monitors for years, and honestly all the vendor support for this sucks, as neither ATI nor NVidia can composite across all of them. Using more than one monitor kills SLI capability (?!), compositing only works across 2, and any more than that requires Xinerama to do its dirty work. I really can't find a straight answer on whether 3+ monitor compositing is a limitation of xorg or of the hardware, beyond the recent work toward Wayland to resolve xorg's windowing-system limitations (which nvidia has said they WON'T support - boo/hiss...).

    Out of frustration with Nvidia I finally bought an ATI Eyefinity 6, since most people say it can do 3+ monitors "as one", but that still means Xinerama, which kills compositing support. On top of that, there were so many bugs just running 4 monitors under Xinerama that it was almost unusable, with artifacting and tearing (and their 2GB card wouldn't even let me enable the "tear free" features, for "lack of memory"(?)). At least with 2 nvidia cards I could get reasonable stability and performance (old gtx7950's) at the cost of full desktop compositing (and I miss compiz terribly), so I'm just going to exchange it for 2 newer, higher-end nvidia cards. ATI's Eyefinity support is STILL completely worthless under Linux.

    Is it really too much to ask for compositing across more than 2 monitors under Linux from either vendor? I'd even take a crap Intel GPU if it could just do decent multi-monitor desktop compositing... (The Xinerama situation is sketched below.)
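
    For what it's worth, spanning one desktop across screens on multiple cards means turning Xinerama on in xorg.conf, and the X server disables the Composite extension while Xinerama is active - which is exactly the "compositing killed" behaviour above. A minimal fragment for a 4-head layout (the identifiers are illustrative assumptions):

        Section "ServerFlags"
            # Xinerama merges all screens into one desktop, but the X
            # server disables the Composite extension when it is active,
            # so no compositing across 3+ monitors.
            Option "Xinerama" "on"
        EndSection

        Section "ServerLayout"
            Identifier "QuadHead"
            Screen 0 "Screen0"
            Screen 1 "Screen1" RightOf "Screen0"
            Screen 2 "Screen2" RightOf "Screen1"
            Screen 3 "Screen3" RightOf "Screen2"
        EndSection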
