NVIDIA sent over a Palit GeForce GTX 460 768MB graphics card for our initial Fermi Linux testing.
NVIDIA GeForce GTX 460 On Linux
Originally posted by deanjo: "Well, it was still nice to see that happen for a change."
Originally posted by deanjo: "What's up with your AMD guys nowadays? They still haven't sent you a 5xxx series card yet?"

Michael Larabel
https://www.michaellarabel.com/
Nice article and good to see some benchmarks of Fermi on Linux.
Personally I have a GTX 470 and am very happy with it. Yes, it gets quite hot while gaming, but the noise levels are very normal; only when stressing it in benchmarks like Heaven does it become audible.
I overclocked my card with a BIOS flash and have been running @ 710/1700 for a while now without any stability problems. Too bad Coolbits isn't working yet, since it was also my method of choice, and I was surprised when I popped in my new card and the option had disappeared.
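For context, on pre-Fermi cards Coolbits is enabled with a single option in the Device section of xorg.conf, which unlocks the clock controls in nvidia-settings. A minimal sketch (the Identifier string is a placeholder); at the time of this thread the option had no effect on Fermi cards:

```
Section "Device"
    Identifier "nvidia-gpu"
    Driver     "nvidia"
    # "Coolbits" "1" exposes the clock frequency controls in nvidia-settings
    Option     "Coolbits" "1"
EndSection
```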
However, I must disagree with the review on one point: you say overclocking is the only thing not working at this point and that the driver is otherwise up to par with its Windows counterpart. What about enabling SLI options in the GUI and adding support for SLI profiles? Some people like to say you don't need two GPUs on Linux to max out games, but Unigine games are coming, Natural Selection 2 will come if Steam ever sees a Linux port, and let's not forget about Rage and Doom 4. And then there are also people who use SLI for CUDA.
The time has come to treat Linux as an equally supported platform and give us the same benefits as Windows: a more complete GUI, SLI profile updates, and SLI profile customization options.
Originally posted by Ulukai: "Some people like to say you don't need two GPUs on Linux to max out games, but Unigine games are coming, Natural Selection 2 will come if Steam ever sees a Linux port, and let's not forget about Rage and Doom 4. And then there are also people who use SLI for CUDA."
Originally posted by Apopas: "Why didn't you try the Radeon 5770 in the last two tests? As far as I know, the 5000 series does an amazing job in anything that has to do with heat and power consumption."

Michael Larabel
https://www.michaellarabel.com/
In my opinion it would be better practice to mature the GUI and profiles for SLI before everyone really wants to use it. If the work is only half done and untested by the time games need SLI for better results, and people start trying it, we're no better off.
Don't get me wrong, I'm not bashing NVIDIA in any way; I swear by Linux + NVIDIA for graphics, and they do a great job of supporting every new kernel and X.Org release really quickly. Everything is also documented very well on their site and in the driver release notes. But they claim to support SLI on Linux on their website, and it has been around for YEARS now, yet the support is still on a completely different level than it is for Windows. You can only enable or disable it in xorg.conf, with no further configuration, only one screen, only two cards, ... I was just expecting a bit more than that.
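The xorg.conf-level SLI toggle being referred to is a single option in the Device section. A minimal sketch (the Identifier string is a placeholder); the value names below are the rendering modes listed in the NVIDIA Linux driver README, and there is no per-application profile control at this level:

```
Section "Device"
    Identifier "nvidia-gpu"
    Driver     "nvidia"
    # Accepted values include "Auto", "AFR" (alternate frame rendering),
    # "SFR" (split frame rendering) and "AA" (SLI antialiasing)
    Option     "SLI" "Auto"
EndSection
```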
A few weeks back I posted about this issue on the NVnews forums, asking about the future plans for Linux SLI and for the opinion of an official dev (they hang around there all the time), but not one of them would take a minute to answer my post. In general they are very helpful, but I can't help thinking there are some subjects they'd rather not answer. Don't ask me why...
Originally posted by Ulukai: "A few weeks back I posted about this issue on the NVnews forums, asking about the future plans for Linux SLI and for the opinion of an official dev, but not one of them would take a minute to answer my post. In general they are very helpful, but I can't help thinking there are some subjects they'd rather not answer. Don't ask me why..."