Linux 2.6.38 To Linux 3.2 Nouveau DRM Benchmarks
Phoronix: Linux 2.6.38 To Linux 3.2 Nouveau DRM Benchmarks
Earlier this month I showed that Intel graphics performance hasn't improved much in the Linux 3.2 kernel (though there may be a boost once RC6 is flipped on), but how is this new kernel shaping up for NVIDIA hardware owners wishing to use the open-source, reverse-engineered Nouveau driver? In this article are some benchmarks of the Nouveau DRM driver from recent Linux kernel releases.
Why is the graphics memory clock speed different between Linux 2.6 and Linux 3?
It's possible that we fixed the code that reads out the current clock speed somewhere along the way; what it reports is the boot clock speed. And we don't yet change these clocks automatically, on any card.
Originally Posted by Shining Arcanine
Or the clock speeds are just some random values, because in that huge Nouveau vs. NVIDIA comparison I only see a single set of clock speeds, which certainly doesn't reflect the speeds Nouveau is actually running the cards at and is hence totally misleading.
No idea why Nexuiz wouldn't work; it's fine on my nv96 and nvc0, so it must be your bad karma from completely neglecting clock speeds in your comparisons. Though it would be fairer if all apps refused to run this way ...
I think it speaks volumes about the intelligence, determination, and tenacity of the Nouveau developers that they've brought the code this far without the help or blessing of nVidia. Prometheus would be proud, if he had been real.
I still mention to people that they should use Nouveau if they already own Nvidia hardware, but that they shouldn't buy new Nvidia hardware.
Thanks to Nvidia being absolutely no help at all to open source (in fact, they try to set the progress of Nouveau back as much as they can), new hardware often performs badly, or not at all, for many months or even a year or so after release.
Ideally, people would use this the way an MP3 decoder should be used: to play back files that are unfortunately only available in that format, but not to create new ones.
AMD only has a few people working on the open source driver stuff and puts out documentation, but Nvidia tries to hide every single operation of their cards beyond the basic VESA stuff they more or less have to support. It's pathetic. Their excuse for this behavior is something that makes my blood boil even more: their precious imaginary property.
I dunno. Nvidia seems pretty competent on linux in my experience. Not as good as Windows of course, but you could say that about just about everything in linux.
That's proper trolling I guess.
Originally Posted by johnc
Nvidia doesn't support Linux, they support Nvidia. They happen to have a Linux driver, for which they use the incorrect term "UNIX", which should tell you where they're coming from. As with any company's proprietary blob, theirs taints your kernel and can do anything from crashing your system, to containing backdoors, to inadvertently causing security problems that only they can fix. They may also drop all of their Linux drivers, or any hardware supported by them, at any point in time, and if it weren't for Nouveau, which gets no help from Nvidia, those cards would never work again.
Nouveau is also interested in improving the performance of older cards that Nvidia stopped caring about a long time ago and put on life support.
If you buy Nvidia, or from any company that has no official open source drivers for the product, and want to claim they support open source operating systems, I'll sell you the Brooklyn Bridge for two dollars.
Overall I'm pretty satisfied with my nvidia linux experience. They probably have the best video drivers out there for the platform.
Originally Posted by DaemonFC
I look at it this way, if I wanted to depend on nonfree software for basic system functionality, I'd just be a surrender monkey and use Windows. But the thought of using Windows is less attractive to me than throwing my entire machine into a dumpster and going back to books and typewriters. At least those can't simply refuse to work for no other reason than you haven't paid them lately.
Originally Posted by johnc
Nvidia's blob is more or less forced obsolescence even though it is technically true that they haven't dropped their old hardware. It never gets improved in their official driver either even when it could do more than it does now. They don't want you to have neat new features that their hardware is capable of when they can just point to the latest thing they have on the shelf.
I don't like being so easily manipulated. If you like being a surrender monkey who will eventually be bitten in the ass by this horrid company, then it's your money and your system, but I refuse to turn a blind eye to what they do.
I recently had the pleasure of "working" with a rather high-end laptop with Nvidia graphics. The task was just to display something on a rotated external screen. First, nvidia-settings didn't recognize the external screen (HDMI), but after disconnecting and reconnecting it and restarting nvidia-settings several times it got detected. Then it turns out nvidia-settings doesn't support rotation, so you need two different tools to configure your screen: first nvidia-settings to enable TwinView, then xrandr to rotate. And in TwinView mode you can only rotate both screens at once. Well, I didn't need the internal screen, but it still sucks. Then, randomly, after rotating, the GNOME output (it was Ubuntu 10.10, I think) had rendering artifacts; I would rotate it back to normal and rotate it again and they were gone. But one time I rotated the screen and simply typing and pressing Enter crashed X.
Originally Posted by johnc
I'm not even speaking of the application I was supposed to run, which renders fine the first time I start it, but whose rendering is horribly broken the second time (just some OpenGL and GLSL)...
That was a really fun experience. At least fglrx has xrandr support that works well enough. And with the open-source radeon driver there is of course no problem whatsoever.
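For comparison, with a driver that exposes RandR properly, the rotation step in the anecdote above is a one-liner. Here's a minimal sketch; the output name `HDMI-1` is an assumption (run `xrandr --query` to find the real name on your machine), and the script only prints the commands rather than running them, since that requires a live X session:

```shell
#!/bin/sh
# Sketch: rotating an external screen via xrandr (RandR-capable drivers such
# as nouveau/radeon/intel; the NVIDIA blob of that era needed TwinView set up
# in nvidia-settings first, and then rotated both screens together).
# HDMI-1 is a hypothetical output name -- check `xrandr --query` for yours.
EXTERNAL=${1:-HDMI-1}

# Quarter turn clockwise:
echo "xrandr --output $EXTERNAL --rotate right"
# Back to normal:
echo "xrandr --output $EXTERNAL --rotate normal"
```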