Nouveau Driver Remains Much Slower Than NVIDIA's Official Driver
-
Originally posted by liam: Perhaps I missed something, but why doesn't Michael run the Nouveau drivers at their highest speeds when possible?
Originally posted by liam: I've a Quadro FX 570M that has three clock levels. The highest level is core 475, shader 950, memory 700 (MHz), while the stock level is 275, 550, and 300, respectively. A rather massive difference.
If I used this for gaming, I'd always put it at the highest clock levels. I'd imagine most gamers would do the same.
I note that your Quadro FX 570M has an older NV84 (G84) chipset, while the 9800 GT and 9800 GTX have the NV92 (G92) chipset, and the GT 220 has the NVA5 (GT216) chipset. If you have been successful in reclocking the Quadro FX 570M, please let us know which kernel version you are using; that would be good news and might also be useful to someone else.
If Michael had tested with a GT 240 (I can't remember if he owns one), which has the NVA3 (GT215) chipset, things might have been different, as improved NVA3 memory reclocking was introduced in the 3.5 kernel. I'm not sure how much difference that would make without also reclocking the core, though.
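For reference, on kernels of roughly this era Nouveau exposed manual reclocking through a sysfs attribute. A rough sketch of how one would switch levels (the exact path, the level numbers, and whether reclocking works at all vary by kernel version and chipset, so treat this as an illustration rather than a guaranteed interface):

```shell
# Show the performance levels Nouveau reports for the card
# (on kernels without reclocking support this file may not exist)
cat /sys/class/drm/card0/device/performance_level

# Switch to a higher level, e.g. level 3, as root
# (reclocking on unsupported chipsets can hang the GPU)
echo 3 > /sys/class/drm/card0/device/performance_level
```

This is hardware configuration rather than a portable script; on cards where memory reclocking is incomplete, only the core/shader clocks may actually change.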
-
Originally posted by asdx: Nouveau will have Optimus support in distros *first* and by *default* real soon, and Nvidia won't.
Ha ha, take that Nvidia.
And Nvidia has the resources to implement a DMA-BUF-style solution in the blob if it wants to. And it might just do that, which would make DMA-BUF irrelevant.
And Nvidia has supported Optimus on Windows for... what, 3 years? While Linux still hasn't even got it? Oh dear...
Last edited by Sonadow; 05 January 2013, 05:08 AM.
-
Originally posted by asdx: Nouveau will have Optimus support in distros *first* and by *default* real soon, and Nvidia won't.
Ha ha, take that Nvidia.
Honestly speaking, IMO, Optimus is useless with Nouveau, because the performance gain of using the Nouveau driver + Nvidia GPU instead of the Intel GPU is likely too low to consider gaming with it.
-
Originally posted by christian_frank: I wouldn't bet my ass on that. Nvidia has been working hard for some months already to get Optimus working in their blob.
Honestly speaking, IMO, Optimus is useless with Nouveau, because the performance gain of using the Nouveau driver + Nvidia GPU instead of the Intel GPU is likely too low to consider gaming with it.
If memory serves, the whole point of Optimus was to have the figurative best of both worlds: wiring the display to the Intel GPU, which is just good enough for everything that is not games- or CAD-related, and swapping over to the dedicated chip when more crunching power is needed.
With the Nouveau driver not coming close to delivering the expected performance from the Nvidia chip, it makes little sense to even bother with switching if the performance gains do not match the increase in power consumption.
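The DMA-BUF/PRIME work being discussed here is what drives GPU offloading with the open drivers. A sketch of how render offload is set up on an Optimus laptop, assuming a kernel, Mesa, and X stack with PRIME support (the provider names "nouveau" and "Intel" are examples and differ between setups):

```shell
# List the render/display providers X knows about
xrandr --listproviders

# Make the Nouveau-driven GPU a render offload source for the
# Intel display GPU (provider names come from the listing above)
xrandr --setprovideroffloadsink nouveau Intel

# Run a single application on the discrete GPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
```

The last command reports which GPU actually rendered, which is a quick way to check whether offload is working, and incidentally whether the performance trade-off described above is worth it on a given machine.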
-
Originally posted by gens: 8800 was a revolution in GPU design, at least as far as Nvidia is concerned.
Originally posted by gens: The next step was in the 400 series.
The next step is in the 600 series, and only the high-end part,
but nothing revolutionary in them.
Originally posted by gens: Also, the low-end cards in the newer series are utter crap.
The point is that doing a "benchmark" of 5+ year old GPUs is a pretty daft idea in the first place. Nobody cares if you happen to think it's the best thing evarrr.
-
Originally posted by netrage: The steam engine was a revolution in engineering, but I don't think many use them anymore...
-
Originally posted by thofke: As far as I know, almost all power plants use steam engines, from coal to atomic energy. There are of course some exceptions, for instance the natural-gas-driven Otto engines which drive the dynamos directly.
-
Originally posted by GT220: Nouveau developers are retarded, trying to reinvent the wheel, wasting valuable resource and time instead of working on other parts of Linux that needs improvement.
Time to take this garbage out and throw it into the dustbin of history where it belongs.
You HAVE to have control over your hardware. It wasn't just once that people have found hair-raising covert security holes in binary blobs, and nVidia has been mentioned more than once in that context.
While I understand some of the woes with the open-source effort, I have to acknowledge the difficulty of working with such complex stuff under conditions of chronic and critical lack of documentation all the time.
Just look at AMD's documentation (since they publicly claim to support open source and open documentation) and try to make heads or tails of that.
nVidia and AMD get plenty of cash from hardware. Compared to that, the open-source developers' effort lives almost on public mercy.
And still (while I don't know about Nouveau), the open-source radeon driver is not half bad. Eyefinity works superbly for me (3 x 1600x1200). 3D is not that bad. 2D is great. It works without a problem on all kernels I cared to try, even the freshest ones, compiled a few hours after release, or git copies. The only thing still missing for me on radeon is a good GPGPU interface.
Not so with the closed-source fglrx. It's full of bugs, some of which persist through many versions over a multi-year timespan and are quite annoying. It is not always easy (or even possible) to nail the right combination of kernel, driver, libraries and xorg server to make it work.
I switched recently from radeon to fglrx, only to choke on bugs and instabilities. For the moment I am persisting, as I want to play with GPGPU stuff, but the very first day radeon includes support for it, I am switching back.
Last edited by Brane215; 05 January 2013, 08:39 AM.