NVIDIA GeForce GT 220


  • NVIDIA GeForce GT 220

    Phoronix: NVIDIA GeForce GT 220

    Days prior to AMD's release of the ATI Radeon HD 5750 and Radeon HD 5770 graphics cards, NVIDIA released their GeForce G 210 and GeForce GT 220 graphics cards. Both of these NVIDIA graphics cards are for low-end desktop systems, but part of what makes them interesting is that they are the first NVIDIA GPUs built upon a TSMC 40nm process. To Linux users these graphics cards are also interesting in that they fully support all of the current features of VDPAU for Linux video decoding, including MPEG-4 support. We picked up an XFX GT220XZNF2 GeForce GT 220 1GB graphics card for this round of benchmarking on Ubuntu Linux.


  • #2
    Crap performance + no open driver: business as usual for NVidia.

    Comment


    • #3
      What do the graphs on the fifth page show? To me they show nothing; the scale on the Y axis is wrong.

      Comment


      • #4
        Originally posted by remm
        Crap performance + no open driver: business as usual for NVidia.
        Crap performance? This is just a bottom-end card. Look at the other Nvidia solutions that are compared to it.

        8400
        8500
        9500
        8600

        These are only how old? I'll give you a hand: the GeForce 8 series launched on November 8, 2006.

        Business as usual for Nvidia? More like business three years old.

        To compare, the R700 launched on June 25, 2008, so you are looking at roughly a two-year gap between these Nvidia offerings and the ATI ones. That doesn't seem like much when you are purchasing an oven, but in this game it certainly does. This is just a crap offering by Nvidia that seems to be targeted towards HTPCs. I think this deals more of a blow to S3 than anything else.

        Comment


        • #5
          Finally an ATI vs. NVIDIA benchmark. I've been waiting for a benchmark like this on Phoronix for as long as I've known about Phoronix.

          Comment


          • #6

            Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing... haven't you seen that before posting?
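The scaling complaint is easy to see in practice. A minimal matplotlib sketch (hypothetical CPU-load numbers, not the article's data) plots the same results twice, with only the y-axis range differing between panels:

```python
# Hypothetical CPU-load numbers to illustrate how the y-axis range
# decides whether small differences between cards are visible.
import matplotlib
matplotlib.use("Agg")  # off-screen rendering, no display needed
import matplotlib.pyplot as plt

cards = ["GT 220", "9500 GT", "8600 GTS"]  # hypothetical labels
cpu_load = [5.0, 7.5, 10.0]                # hypothetical CPU load in %

fig, (ax_full, ax_zoom) = plt.subplots(1, 2, figsize=(8, 3))
for ax, top in ((ax_full, 100), (ax_zoom, 20)):
    ax.bar(cards, cpu_load)
    ax.set_ylim(0, top)            # the only difference between panels
    ax.set_ylabel("CPU load (%)")
    ax.set_title(f"y-axis 0-{top}%")
fig.tight_layout()
fig.savefig("cpu_load_scaling.png")
```

On the 0-100% panel the three bars look nearly identical; on the 0-20% panel the same data spreads across half the axis.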

            Comment


            • #7
              Low end, performance of cards 2 generations behind, 40nm process, and they STILL need a fan on it? What market segment is that possibly going to fill, the deaf HTPC user?

              Comment


              • #8
                Originally posted by vermaden
                Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing... haven't you seen that before posting?
                In a way, this is correct. The difference between 5% and 10% CPU load is as insignificant as the chart shows.
                Originally posted by http://www.phoronix.com/scan.php?page=article&item=nvidia_gt_220&num=9
                Additionally, one of the advantages of this budget graphics card is its VDPAU capabilities, which includes MPEG-4 ASP support along with the features of earlier PureVideo generations.
                I don't quite see the benefit of this, and it has also left reviewers at other hardware sites scratching their heads. MPEG-4 ASP is not a very computationally intensive codec; today's low-end CPUs can decode HD video in MPEG-4 ASP without problems.
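For context, exercising the card's MPEG-4 ASP decoding through VDPAU at the time typically meant an MPlayer invocation along these lines (codec names as in contemporary MPlayer builds; the file name is hypothetical):

```shell
# Use the VDPAU video output and prefer the VDPAU-accelerated decoders.
# ffodivxvdpau handles MPEG-4 ASP (DivX/Xvid), ffh264vdpau handles H.264;
# the trailing comma lets MPlayer fall back to software decoding if the
# driver or GPU lacks the feature.
mplayer -vo vdpau -vc ffodivxvdpau,ffh264vdpau, video.avi
```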

                Comment


                • #9
                  Originally posted by Ant P.
                  Low end, performance of cards 2 generations behind, 40nm process, and they STILL need a fan on it? What market segment is that possibly going to fill, the deaf HTPC user?
                  You are aware that most people buying this card will have poorly ventilated cases, right? Not to mention that card makers choose what cooler to put on these low-end models; I'm sure you'll be able to find a fanless one on any site with a decent selection. There are also always much better aftermarket coolers out there; I've seen some that can cool even an 8800 GTX without a fan.

                  Most of the buyers will be the low-end desktop market: those who want light gaming capability for WoWcrack in their underpowered bargain-basement OEM box (a.k.a. Dell, HP/Compaq, Gateway, Acer/eMachines), anything that will accept a GPU without requiring them to buy a new PSU to replace the anaemic one their junker came with.

                  Originally posted by L33F3R
                  Crap performance? This is just a bottom-end card. Look at the other Nvidia solutions that are compared to it.

                  8400
                  8500
                  9500
                  8600

                  These are only how old? I'll give you a hand: the GeForce 8 series launched on November 8, 2006.

                  Business as usual for Nvidia? More like business three years old.

                  To compare, the R700 launched on June 25, 2008, so you are looking at roughly a two-year gap between these Nvidia offerings and the ATI ones. That doesn't seem like much when you are purchasing an oven, but in this game it certainly does. This is just a crap offering by Nvidia that seems to be targeted towards HTPCs. I think this deals more of a blow to S3 than anything else.
                  Agreed, do we really need more Nvidia rebadged cards? I don't know if any of you have been following the Nvidia vs. ATI internet fight, kicked off by Nvidia complaining that ATI beat them to the punch with D3D11 hardware while their own GT300/Fermi hardware is currently so nonexistent that they had to show fake cards at a recent expo and claim that a pre-rendered video was their new chip's work.

                  Sources for both companies' idiocy: http://www.xbitlabs.com/news/video/d...ics_Cards.html



                  WHAT DO YOU DO when you have a major conference planned to introduce a card, but you don’t have a card? You fake it, and Nvidia did just that. Updated 3x


                  And the Twitter fight... http://www.hardocp.com/news/2009/10/...ng_on_twitter/



                  I guess it lends weight to the rumours that their GT300 chip yields are only 1.7%. A shame, since they've already been kicked out of the chipset business; they may go belly up if they can't pull their act together. If that happens, hopefully it won't be Intel that snags them up; maybe a merger between VIA and Nvidia would work, as it would at least give us a decent three-way fight in the x86 market.

                  Comment


                  • #10
                    Originally posted by vermaden
                    Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing... haven't you seen that before posting?
                    I agree. These kinds of tests would be especially interesting to run with a low-end CPU to demonstrate the advantage of this approach, not with a Core i7 monster that most people don't have.

                    Comment
