Phoronix: AMD Llano Graphics / Radeon HD 6620G On Linux
AMD's next-generation "Llano" Fusion APUs are launching today. Llano is a very nice upgrade over the current-generation 40nm Brazos hardware, as covered in another Phoronix article to be published in the next couple of hours, but this article looks at the graphics in Llano. Here's the first Linux look at the Llano graphics support and performance for the Radeon HD 6620G as found in the AMD A8-3500M Fusion APU.
Since the description says that Llano won one more test, the labels/colors must be swapped.
It would be much more interesting if there were more systems tested, namely some from Intel, to see how it compares. More tests wouldn't hurt either.
I believe the article does say more benchmarks with a head-to-head comparison are in the works, but you really have to wait until the FOSS support is there for Llano if you want an apples-to-apples comparison.
Again... put the line at 30fps, not 20, not 40... As a reviewer you know full well that it's widely accepted that 30fps is the bare minimum for smooth gameplay, below which things get jumpy and choppy.
When can we expect to have these available on the market?
Yeah, the A8 desktop models are going to be HTPC monsters! So need something like the GA-A75-UD4H running coreboot http://www.youtube.com/watch?v=L37q3OyZj8Q Everything but Unigine should run quite well on it. I just wonder how much more can be squeezed out of the GPU by OCing the system RAM from the suggested 1866MHz up to, say, 2.4GHz?
Yeah, on a normal system RAM increases net you maybe only 2% in overall system performance, but this GPU looks like it'll be memory-bandwidth starved, going by the performance of similar AMD GPUs with 400 shaders but different amounts of VRAM bandwidth available. Namely the HD5570, which normally comes with 128-bit GDDR3, though there are crippled versions floating around with 64-bit DDR3 or 128-bit DDR2. Let's not forget, though, that the HD5670 uses the same 400-shader core, just running a little faster and paired with MUCH faster 128-bit GDDR5, for roughly the effective performance of 256-bit GDDR3. https://secure.wikimedia.org/wikiped..._.28HD_5xxx.29
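The bandwidth gap being discussed here can be sketched with a quick back-of-the-envelope calculation; peak theoretical bandwidth is just the effective transfer rate times the bus width in bytes. The specific configurations below (dual-channel DDR3 for Llano's IGP, GDDR5 on a discrete HD 5670) are illustrative assumptions, not measured figures:

```python
# Theoretical peak memory bandwidth: effective transfers/s * bus width in bytes.
def bandwidth_gbs(mt_per_s, bus_bits):
    """Peak bandwidth in GB/s for a given effective transfer rate (MT/s)
    and bus width (bits)."""
    return mt_per_s * (bus_bits / 8) / 1000

# Llano's IGP shares dual-channel DDR3 with the CPU (128-bit combined bus):
print(bandwidth_gbs(1866, 128))  # DDR3-1866 -> ~29.9 GB/s
print(bandwidth_gbs(2400, 128))  # overclocked DDR3-2400 -> ~38.4 GB/s
# A discrete card with 128-bit GDDR5 at 4000 MT/s effective:
print(bandwidth_gbs(4000, 128))  # -> 64.0 GB/s
```

Even a hefty RAM overclock leaves the shared DDR3 bus well short of discrete GDDR5, which is why the GPU scales with memory speed far more than the CPU does.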
I've never understood why AMD/ATI and Nvidia ever allowed companies to release VRAM-crippled versions of their cards, but then again they REALLY need to clean up the mobile GPU naming, that crap is 10 different kinds of ridiculous...