Radeons are bad. Every generation except perhaps the 7000 series (which we don't know about for sure yet) has poor OpenGL support (so many errors), poor CPU offload (they demand more CPU performance), and weak code morphing and bit manipulation (instructions missing or heavily emulated). Personal experience: RIFT under Wine, Core 2 Duo (SSSE3) at 3 GHz, Radeon 4670 at ~500 GFLOPS: 15 fps with the low-quality renderer, 18 fps with the high-quality renderer on low settings. With an Nvidia card at ~500 GFLOPS it's 25+ fps. The Nvidia card carries more of the graphics load, while the CPU is maxed out in both cases.
It is nobody's fault but your own if YOU are too STUPID to use the proper RADEON driver, not the blob shit, whether AMD or nvidia.
Originally Posted by crazycheese
Actually, somebody is at fault; it should "just work". Since it doesn't, and the user managed to use the "wrong" driver, somebody in the chain that provided him with the driver is at fault.
First, your ATI experiences are very old; they don't reflect what the AMD blob is like today, as somebody already posted...
Originally Posted by boast
But my main point is that this assumption can only be true if you really don't care at all whether you have a free or a non-free driver. And it's not only a question of philosophy. For example, I have installed the latest Ubuntu alpha, but even beyond that there are betas, and other Linux distributions ship unstable versions too. With the free driver I can be 99.999% sure the graphics driver will work with the kernel, even an unstable one; with a blob, on the other hand, there is a big chance it will not work because of X or the kernel. Even with a stable Ubuntu release you sometimes don't get a good experience with the blobs; then you may have to wait a few months on a single API change. I think Nvidia and AMD choked for nearly a year on one before they fixed it. That can happen again in the future...
And then you can only use your card for a few years before buying a new one, because at some point they stop supporting it, and if you then update your Linux it will no longer work. They could even say tomorrow, "Linus insulted us so much, we're dropping Linux support entirely," and after the next API change in X you couldn't use their drivers or newer distributions at all...
So you can't just compare free and non-free drivers feature by feature; you have to see that a driver being free is a feature in itself...
Maybe you will now say that this is less important than not being able to play games. That may be true for you, but acting as if a free driver is no feature at all is stupid...
Originally Posted by artivision
This can easily be fixed... You cite a random Nvidia card but fail to provide any information about it. Anyway, the real reason the graphics card you mention underperforms on Linux is that there are not enough end users providing usable information to the developers.
Yet again, the issue is not the binary driver. It's neither AMD nor Nvidia... It's all about how many users can file reasonable bug reports that enable the developers (AMD, Nvidia, and whoever else writes drivers) to fix the bugs.
Originally Posted by droidhacker
The people best described as at fault are the beta testers and end users who are unable to write good bug reports. A developer is powerless to fix a bug unless given the information needed to fix it, and it definitely helps if you can reproduce the bug on two or more machines.
Originally Posted by Vadi
I agree, but the FOSS driver performance is horrible.
Originally Posted by asdx
I have tested this just now, running Gnome Player with VDPAU-accelerated "Solaris (1975)" under XFCE/Compiz. It's tear-free.
Originally Posted by RussianNeuroMancer
The "proper RADEON" driver was doing 5 fps in QuakeArena back then.
Originally Posted by droidhacker
And right now, the "proper RADEON" driver's performance makes me not want to purchase any mid-to-high-end AMD card.
Partly because, of the 98% of people who run their drivers, none of them use OpenGL. Even on Windows, the OpenGL stability of both AMD and NVIDIA is _terrible_. We shipped a game four months ago using OpenGL that literally crashed on every single NVIDIA driver except the very, very latest, and we had to spend tons of time ripping out and rearranging bits of the graphics architecture until we found out what was causing the crash and how to get things rendering without triggering it. (Hence the interest of Valve and others in FOSS drivers.)
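Shipping around a crash like the one described above often also forces ISVs into driver-version gating: detect the buggy driver at runtime and take the slower, non-triggering render path. A minimal sketch in Python of that kind of check, assuming the application has already queried the GL_VERSION string; the "NVIDIA &lt;version&gt;" token layout and the 280.13 "fixed-in" threshold are illustrative assumptions, not details from the post:

```python
def nvidia_driver_version(gl_version: str):
    """Extract the proprietary NVIDIA driver version from a GL_VERSION
    string such as '4.1.0 NVIDIA 280.13' (this token layout is an
    assumption based on common NVIDIA blob output).
    Returns None for other vendors or unparseable strings."""
    parts = gl_version.split()
    if "NVIDIA" not in parts:
        return None
    idx = parts.index("NVIDIA")
    if idx + 1 >= len(parts):
        return None
    try:
        return tuple(int(x) for x in parts[idx + 1].split("."))
    except ValueError:
        return None

def needs_crash_workaround(gl_version: str, fixed_in=(280, 13)) -> bool:
    """True if the app should take the slower render path that avoids
    the crash. The 280.13 'fixed-in' version is a hypothetical
    placeholder, not the actual driver the post refers to."""
    ver = nvidia_driver_version(gl_version)
    if ver is None:
        return False  # assume non-NVIDIA drivers are unaffected
    return ver < fixed_in  # tuple comparison: (275, 9) < (280, 13)
```

The ugly part, which the post hints at, is that this kind of gating multiplies across vendors and driver branches, which is exactly the maintenance burden a conformance-tested driver model avoids.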
Originally Posted by curaga
The D3D drivers are far more stable. In this case, the stability has a bit less to do with D3D being a better API and much more to do with D3D actually being used. Recall that Linux users still frequently run non-GL-accelerated desktops, use DDX drivers and EXA/RENDER to get their basic apps on screen. OS X uses an entirely Apple-written driver architecture. Very few Windows apps use OpenGL, and even most of the ones people think use OpenGL actually use D3D: all Windows implementations of WebGL run over ANGLE, many of the "big 3D content apps" have switched over to D3D on Windows, and the games that use OpenGL are almost entirely just simple little 2D indie games that don't do anything even remotely interesting with the GPU.
Granted, Microsoft also has tests for D3D and properly designed the driver model such that every vendor didn't have to reimplement all of D3D internally, while Khronos still doesn't even offer a test suite, much less a core OpenGL framework for the ISVs to build (and even if they did, at this point the ISVs have too much invested in their internal implementations, so a switch is unlikely to happen without some strong-arming).