NVIDIA GeForce GT 220
Phoronix: NVIDIA GeForce GT 220
Days prior to AMD's release of the ATI Radeon HD 5750 and Radeon HD 5770 graphics cards, NVIDIA released their GeForce G 210 and GeForce GT 220 graphics cards. Both of these NVIDIA graphics cards are for low-end desktop systems, but part of what makes them interesting is that they are the first NVIDIA GPUs built upon a TSMC 40nm process. To Linux users these graphics cards are also interesting in that they fully support all of the current features of VDPAU for Linux video decoding, including MPEG-4 support. We picked up an XFX GT220XZNF2 GeForce GT 220 1GB graphics card for this round of benchmarking on Ubuntu Linux.
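The VDPAU MPEG-4 decode support mentioned above can be exercised from the command line. A minimal sketch, assuming a VDPAU-capable NVIDIA proprietary driver is installed along with the `vdpauinfo` utility and an MPlayer build with VDPAU support (the filename `video.avi` is a placeholder):

```shell
# List the decoder profiles the driver exposes; on a GT 220 the output
# should include MPEG4 profiles alongside MPEG1/2, H264, and VC1.
vdpauinfo | grep -i mpeg4

# Play a file with VDPAU video output and hardware decoders.
# ffodivxvdpau is MPlayer's MPEG-4 ASP VDPAU decoder; the trailing comma
# in the -vc list lets MPlayer fall back to software decoding for
# formats the GPU cannot handle.
mplayer -vo vdpau \
        -vc ffmpeg12vdpau,ffh264vdpau,ffwmv3vdpau,ffvc1vdpau,ffodivxvdpau, \
        video.avi
```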
Crap performance + no open driver: business as usual for NVidia.
What do the graphs on page five show? To me they show nothing; the scale on the Y axis is wrong.
Crap performance? This is just a bottom-end card. Look at the other NVIDIA solutions it is compared against.
Originally Posted by remm
These are only how old? I'll give you a hand: the GeForce 8 launched on November 8, 2006.
Business as usual for NVIDIA? More like business from three years ago.
For comparison, the R700 launched on June 25, 2008. That's a two-year gap between the NVIDIA offerings and the ATI ones. Two years doesn't seem like much when you're buying an oven, but in this game it certainly does. This is just a crap offering by NVIDIA that seems targeted at HTPCs. I think it deals more of a blow to S3 than to anything else.
Finally, an ATI vs. NVIDIA benchmark. I've been waiting for such a benchmark on Phoronix for as long as I've known about Phoronix.
Sorry, but you need to rescale them to 0-20% to show the difference; the current 0-100% scaling shows absolutely nothing. Didn't you notice that before posting?
Low-end, the performance of cards two generations behind, a 40nm process, and it STILL needs a fan? What market segment could that possibly fill, the deaf HTPC user?
In a way, this is correct. The difference between 5% and 10% CPU load is as insignificant as the chart shows.
Originally Posted by vermaden
The benefit of this I don't quite get, and it has also left reviewers at other hardware sites scratching their heads. MPEG-4 ASP is not a very computationally intensive codec; today's low-end CPUs can decode HD MPEG-4 ASP video without problems.
Originally Posted by http://www.phoronix.com/scan.php?page=article&item=nvidia_gt_220&num=9
You are aware that most people buying this card will have poorly ventilated cases, right? Not to mention that the card makers choose which cooler to put on these low-end models; I'm sure you'll be able to find a fanless one on any site with a decent selection. There are also always much better aftermarket coolers out there; I've seen some that can cool even an 8800 GTX without a fan.
Originally Posted by Ant P.
Most of the buyers of these will be the low-end desktop market: those who want light gaming capability for WoWcrack in their underpowered bargain-basement OEM box, a.k.a. Dell, HP/Compaq, Gateway, Acer/eMachines, anything that will accept a GPU that won't require them to buy a new PSU to replace the anaemic one their junker came with.
Agreed, do we really need more rebadged NVIDIA cards? I don't know if any of you have been following the NVIDIA vs. ATI internet fight, kicked off by NVIDIA complaining that ATI beat them to the punch with D3D11 hardware, while their own GT300/Fermi hardware is currently so nonexistent that they had to show a fake card at a recent expo and claim that a pre-rendered video was their new chip at work.
Originally Posted by L33F3R
Sources for both companies' idiocy: http://www.xbitlabs.com/news/video/d...ics_Cards.html
And the Twitter fight: http://www.hardocp.com/news/2009/10/...ng_on_twitter/
I guess this lends credence to the rumours that their GT300 chip yields are only 1.7%. A shame, since they've been pushed out of the chipset business; they may go belly up if they can't pull their act together. If that happens, hopefully it won't be Intel that snaps them up. Maybe a merger between VIA and NVIDIA would work; it would at least give us a decent three-way fight in the x86 market.
I agree. These kinds of tests would be especially interesting to run with a crappy CPU, to demonstrate the advantage of this approach, not with a Core i7 monster that most people don't have.
Originally Posted by vermaden