Phoronix: The Open-Source Linux Graphics Card Showdown
Earlier this week I provided Intel Core i7 3770K Linux benchmarks for the Ivy Bridge launch day, followed by initial Ivy Bridge HD 4000 graphics benchmarks under Linux, comparing against the Intel HD 2000/3000 Sandy Bridge graphics and against AMD Fusion on Catalyst and Gallium3D. This article contains more benchmarks of the HD 4000 Ivy Bridge graphics under Linux with Intel's open-source driver, but as part of a much larger comparison: a full showdown of the Core i7 3770K graphics against several discrete NVIDIA GeForce and AMD Radeon graphics cards, each using its respective open-source Gallium3D driver. What graphics hardware is best if you want to use an open-source GPU driver? Find out now.
Overall a pretty poor showing by the Radeon drivers.
Hate to say it, but there's no way Ivy Bridge should be putting up comparable numbers.
I wish there were 20 more developers working on optimizing it.
I also wish there were a few more tests here with heavier shader usage - WINE, Unigine, etc. Ivy Bridge is limited by the shader power it has, so simple Quake 3-era games don't tell us much.
Nice test, but I think it's near impossible to tell which curve corresponds to which graphics card in those graphs. The colors are way too similar.
I didn't read all the text, so maybe you mentioned it, but why only low- and mid-range cards? It would've been interesting to see an ATI HD 7870 or even a 79x0 (or something similar from the previous generation), since this is games benchmarking.
radeonsi is not ready, so the 78X0 and 79X0 don't have a free, open-source driver to test.
Originally Posted by johanar
Thanks for the test; just yesterday I was searching for those numbers: Intel vs. radeon vs. nouveau.
It's nice to see the radeon driver win in some of the tests (even though power consumption is no good).
To me this looks like what's described in this comic:
at least I got the hardware already...
Finally some Doom 3 results and line graphs that don't require a ruler and a calculator to read! So why leave out the HD6550D?
Will we ever see any closed source games being tested? How about Prey? IIRC it does have a demo mode.
When you finally get an A10-5800K you'll need to test it against the i7-3770K, A8-3870K, and i5-2500K, else it's not really a fair test of the new generation against the generation it's replacing.
Also, invest in some good RAM; overclocking tests on these CPU-graphics systems would be very welcome.
Thanks, these are some good tests with the tweaked settings.
I'll echo the request for a re-test of the Llano with its high-power mode, too.
Some AMD fans don't care about actual test results. If Michael tests with default settings, they cry, "Turn on all the unstable beta features!" If Michael tweaks xorg.conf, AMD fans cry, "Test better AMD cards!" If Michael tests better cards, fans cry, "There's a buggy git branch that might boost performance. Test with it! It won't be enabled by default for the next 5 years, but who cares?" It seems like all they need is an excuse to see AMD winning.
Maybe Michael should just make up results where radeon wins by 10x? ;D
Anyway, good job, Intel guys! You've made the best Linux graphics driver: it's open-source, fast, and takes full advantage of the hardware!
My next system will be based on Ivy Bridge. I don't play heavy 3D games; all I need is a system with decent 2D performance and good single-threaded capabilities that would be useful for the next 10 years.
Actually, since it was a general OSS GPU driver test, including higher-end cards from AMD and NVIDIA makes sense: are they CPU-limited? How much difference does PCIe 2.0 make? Etc.
Originally Posted by rhier
On the other hand, the HD6550D is the GPU side of the desktop Llano A8-series APUs; it's in the same class as the other APUs and CPUs with attached GPUs. I specifically picked the fastest GPU-equipped models: the Sandy Bridge i5-2500K/Intel HD Graphics 3000 and Llano A8-3870/Radeon HD6550D of the previous generation, and the new Ivy Bridge i7-3770K/Intel HD Graphics 4000 and Trinity A10-5800K/Radeon HD7660D of the current generation.
Do you not think that testing the best of the previous generation against the best of the current generation directly would make for much more interesting and relevant results?
The vast majority of users will only ever use the GPU built into their system and will never buy a dedicated GPU card.