NVIDIA GeForce GT 220


  • NVIDIA GeForce GT 220

    Phoronix: NVIDIA GeForce GT 220

    Days prior to AMD's release of the ATI Radeon HD 5750 and Radeon HD 5770 graphics cards, NVIDIA released their GeForce G 210 and GeForce GT 220 graphics cards. Both of these NVIDIA graphics cards are for low-end desktop systems, but part of what makes them interesting is that they are the first NVIDIA GPUs built upon a TSMC 40nm process. To Linux users these graphics cards are also interesting in that they fully support all of the current features of VDPAU for Linux video decoding, including MPEG-4 support. We picked up an XFX GT220XZNF2 GeForce GT 220 1GB graphics card for this round of benchmarking on Ubuntu Linux.

    http://www.phoronix.com/vr.php?view=14273

  • #2
    Crap performance + no open driver: business as usual for NVidia.

    • #3
      What are the graphs on the fifth page supposed to show? To me they show nothing; the scale on the Y axis is wrong.

      • #4
        Originally posted by remm View Post
        Crap performance + no open driver: business as usual for NVidia.
        Crap performance? This is just a bottom-end card. Look at the other Nvidia solutions that are compared to it.

        8400
        8500
        9500
        8600

        These are only how old? I'll give you a hand: the GeForce 8 launched November 8, 2006.

        Business as usual for Nvidia? More like business three years old.

        To compare, the R700 launched June 25, 2008. You are looking at roughly a two-year gap between the Nvidia offerings and the ATI ones. That doesn't seem like much when you're buying an oven, but in this game it certainly does. This is just a crap offering by Nvidia, one that seems targeted at HTPCs. I think this deals more of a blow to S3 than anything else.

        • #5
          Finally an ATI vs. NVIDIA benchmark. I've been waiting for such a benchmark on Phoronix for as long as I've known about Phoronix.

          • #6

            Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing... haven't you seen that before posting?

            • #7
              Low end, performance of cards 2 generations behind, 40nm process, and they STILL need a fan on it? What market segment is that possibly going to fill, the deaf HTPC user?

              • #8
                Originally posted by vermaden View Post
                Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing... haven't you seen that before posting?
                In a way, this is correct. The difference between 5% and 10% CPU load is as insignificant as the chart shows.
                Originally posted by http://www.phoronix.com/scan.php?page=article&item=nvidia_gt_220&num=9
                Additionally, one of the advantages of this budget graphics card is its VDPAU capabilities, which includes MPEG-4 ASP support along with the features of earlier PureVideo generations.
                The benefit of this I don't quite get, and it has also left reviewers at other hardware sites scratching their heads. MPEG-4 ASP is not a very computationally intensive codec; today's low-end CPUs can decode HD MPEG-4 ASP video without problems.

                • #9
                  Originally posted by Ant P. View Post
                  Low end, performance of cards 2 generations behind, 40nm process, and they STILL need a fan on it? What market segment is that possibly going to fill, the deaf HTPC user?
                  You are aware that most people buying this card will have poorly ventilated cases, right? Not to mention that the card makers choose which cooler to put on these low-end models; I'm sure you'll be able to find a fanless one on any site with a decent selection. And there are always much better aftermarket coolers out there; I've seen some that can cool even an 8800GTX without a fan.

                  Most of the market for these is the low-end desktop crowd: people who want light gaming capability for WoWcrack in their underpowered bargain-basement OEM box, a.k.a. Dell, HP/Compaq, Gateway, Acer/eMachines, anything that will accept a GPU without requiring them to buy a new PSU to replace the anaemic one their junker came with.

                  Originally posted by L33F3R View Post
                  Crap performance? This is just a bottom-end card. Look at the other Nvidia solutions that are compared to it.

                  8400
                  8500
                  9500
                  8600

                  These are only how old? I'll give you a hand: the GeForce 8 launched November 8, 2006.

                  Business as usual for Nvidia? More like business three years old.

                  To compare, the R700 launched June 25, 2008. You are looking at roughly a two-year gap between the Nvidia offerings and the ATI ones. That doesn't seem like much when you're buying an oven, but in this game it certainly does. This is just a crap offering by Nvidia, one that seems targeted at HTPCs. I think this deals more of a blow to S3 than anything else.
                  Agreed, do we really need more Nvidia rebadged cards? I dunno if any of you have been following the Nvidia vs. ATI internet fight, kicked off by Nvidia complaining that ATI beat them to the punch with D3D11 hardware while their GT300/Fermi hardware is currently so nonexistent that they had to make fake cards for a recent expo and claim that a prerendered video was their new chip's work.

                  Sources for both companies' idiocy: http://www.xbitlabs.com/news/video/d...ics_Cards.html

                  http://www.tweaktown.com/news/13199/...ions/index.htm

                  http://www.semiaccurate.com/2009/10/...mi-boards-gtc/

                  And the Twitter fight... http://www.hardocp.com/news/2009/10/...ng_on_twitter/



                  I guess it lends credence to the rumours that their GT300 chip yields are only 1.7%. A shame, since they've been kicked out of the chipset business; they may go belly up if they can't pull their act together. If that happens, hopefully it won't be Intel that snags them up; maybe a merger between VIA and Nvidia would work, as it would at least give us a decent three-way fight in the x86 market.

                  • #10
                    Originally posted by vermaden View Post
                    Sorry, but you need to scale them down to 0-20% to show the difference; the current 0-100 scaling shows absolutely nothing... haven't you seen that before posting?
                    I agree. These kinds of tests would be especially interesting to run with a crappy CPU to demonstrate the advantage of this approach, not with a Core i7 monster that most people don't have.

                    • #11
                      Originally posted by [Knuckles] View Post
                      I agree. These kinds of tests would be especially interesting to run with a crappy CPU to demonstrate the advantage of this approach, not with a Core i7 monster that most people don't have.
                      +1. Maybe a P4 HT, Celeron D, or Athlon 64 box with a gig of DDR 400 and a 7200rpm HDD with only 8MB of cache. That and an Atom; might have to get some cards that are on plain old PCI, like the 9400GT PCI, to make it even more interesting.

                      Or are we waiting for XvBA to drop so we can test it on an HD2400XT AGP with a P3 600?

                      • #12
                        The Nvidia GeForce GT 220 is not a rebadged card whatsoever. It's based on the same architecture that powers the GT200 GPUs. It has double the register file space of the GeForce 8/9 GPUs and a brand-new VP4 decoder unit that can decode MPEG-4 ASP.

                        Device 0: "GeForce GT 220"
                        CUDA Driver Version: 2.30
                        CUDA Runtime Version: 2.30
                        CUDA Capability Major revision number: 1
                        CUDA Capability Minor revision number: 2
                        Total amount of global memory: 1073414144 bytes
                        Number of multiprocessors: 6
                        Number of cores: 48
                        Total amount of constant memory: 65536 bytes
                        Total amount of shared memory per block: 16384 bytes
                        Total number of registers available per block: 16384
                        Warp size: 32
                        Maximum number of threads per block: 512
                        Maximum sizes of each dimension of a block: 512 x 512 x 64
                        Maximum sizes of each dimension of a grid: 65535 x 65535 x 1
                        Maximum memory pitch: 262144 bytes
                        Texture alignment: 256 bytes
                        Clock rate: 1.36 GHz
                        Concurrent copy and execution: Yes
                        Run time limit on kernels: Yes
                        Integrated: No
                        Support host page-locked memory mapping: Yes
                        Compute mode: Default (multiple host threads can use this device simultaneously)
                        Here are a few things you can do on the GT 220 that you can't do on previous-generation GPUs.

                        Specifications for Compute Capability 1.2
                        Support for atomic functions operating in shared memory and atomic functions operating on 64-bit words in global memory (see Section B.10);
                        Support for warp vote functions (see Section B.11);
                        The number of registers per multiprocessor is 16384;
                        The maximum number of active warps per multiprocessor is 32;
                        The maximum number of active threads per multiprocessor is 1024.
                        Nvidia has also improved H.264 decoding performance: VP4 can do 64 FPS, while older hardware with VP2 (mostly GeForce 8/9 GPUs) can only do 45 FPS and VP3 (8200/8300 chipset, 9300/9400/Ion chipset, G98 GPU) only about 51 FPS.

                        NVIDIA(0): NVIDIA GPU GeForce 210 (GT218) at PCI:4:0:0 (GPU-0)
                        H264 DECODING (1920x1080): 64 frames/s

                        NVIDIA(0): NVIDIA GPU ION (C79) at PCI:3:0:0 (GPU-0)
                        H264 DECODING (1920x1080): 51 frames/s

                        NVIDIA(0): NVIDIA GPU GeForce 9600 GT (G94) at PCI:1:0:0 (GPU-0)
                        H264 DECODING (1920x1080): 45 frames/s
                        The reason this particular card has lower performance is that it only has DDR2 memory instead of GDDR3; with half the memory bandwidth, it's crippled. The GDDR3 versions of the GT 220 have much higher performance than the DDR2 cards. See the benchmarks on Tom's Hardware; it comes very close to the 4670 in performance.

                        http://www.tomshardware.com/reviews/...-220,2445.html
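The "half the bandwidth" point is straightforward arithmetic: peak memory bandwidth is the effective transfer rate times the bus width in bytes. A quick sketch (the 800 and 1600 MT/s figures below are illustrative placeholders, not the card's actual memory specs):

```python
def peak_bandwidth_gbs(effective_mtps, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers per second times
    bytes moved per transfer."""
    return effective_mtps * 1e6 * (bus_width_bits // 8) / 1e9

# Same 128-bit bus; GDDR3 running at twice the effective data rate
# of the DDR2 part (example rates only -- check your card's specs).
ddr2  = peak_bandwidth_gbs(800, 128)   # 12.8 GB/s
gddr3 = peak_bandwidth_gbs(1600, 128)  # 25.6 GB/s
print(ddr2, gddr3)
```

At an identical bus width, doubling the effective data rate doubles peak bandwidth, which is why the DDR2 boards are crippled relative to the GDDR3 ones.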

                        This review is also somewhat disappointing because it doesn't even use the latest Nvidia 190.42 drivers.

                        http://www.nvnews.net/vbulletin/show....php?p=2105790

                        The people in this forum seem pretty clueless. If you don't know anything, don't bother posting and spreading misinformation.
                        Last edited by GT220; 10-22-2009, 03:44 AM.

                        • #13
                          Originally posted by GT220 View Post
                          See the benchmarks on Tomshardware, it comes very close to the 4670 in performance.
                          And AnandTech shows the GT 220 getting decimated by the 4670, especially at resolutions above 1024x768, and they're both at about the same price. Also, there are passive 4670s available.

                          We may see the 56xx series released by year's end (?). AMD would have to bomb not to sweep the floor with the GT 220.

                          • #14
                            AnandTech reviews a card with half the memory, not to mention they don't even use the latest 191.07 drivers like Tom's Hardware does. And don't expect the 5600 either; it won't be out until next year. Besides, ATI's drivers, open or closed source, are just laughably bad on Linux, and there are no drivers at all for FreeBSD or Solaris. Nvidia just released 190.40 for 32-bit/64-bit Linux, FreeBSD, and Solaris.

                            Found another mistake in Phoronix's review.

                            The graphics card supports HDMI audio; however, there is no integrated audio processor, so an HDA or SPDIF header must be connected from an audio source.
                            Compare with Anandtech.

                            Moving on, the other new HTPC feature is that NVIDIA has finally stepped up their game with respect to HDMI audio on cards with discrete GPUs. Gone is the S/PDIF cable to connect a card to an audio codec, which means NVIDIA is no longer limited to 2-channel LPCM or 5.1 channel DD/DTS for audio. Now they are passing audio over the PCIe bus, which gives them the ability to support additional formats. 8 channel LPCM is in, as are the lossy formats DD+ and 6 channel AAC.
                            And the first two slides from PCPer, both mentioning a built-in audio processor.

                            http://www.pcper.com/article.php?aid=794

                            The GT 220 cards don't even have any HDA or SPDIF headers to connect to.
                            Last edited by GT220; 10-19-2009, 08:58 PM.

                            • #15
                              Originally posted by GT220 View Post
                              The Nvidia GeForce GT 220 is not a rebadged card whatsoever. It's based on the same architecture that powers the GT200 GPUs. It has double the register file space of the GeForce 8/9 GPUs and a brand-new VP4 decoder unit that can decode MPEG-4 ASP.



                              Here's a few things you can do on the GT 220 that you can't do on previous generation GPUs.



                              Nvidia has also improved H.264 decoding performance: VP4 can do 64 FPS, while older hardware with VP2 (mostly GeForce 8/9 GPUs) can only do 45 FPS and VP3 (8200/8300 chipset, 9300/9400/Ion chipset, G98 GPU) only about 51 FPS.



                              The reason this particular card has lower performance is that it only has DDR2 memory instead of GDDR3; with half the memory bandwidth, it's crippled. The GDDR3 versions of the GT 220 have much higher performance than the DDR2 cards. See the benchmarks on Tom's Hardware; it comes very close to the 4670 in performance.

                              http://www.tomshardware.com/reviews/...-220,2445.html

                              This review is also somewhat disappointing because it doesn't even use the latest Nvidia 190.40 drivers.

                              http://www.nvnews.net/vbulletin/show....php?p=2105790

                              The people in this forum seem pretty clueless. If you don't know anything, don't bother posting and spreading misinformation.
                              Whatever, skippy, gotta get on that damage control now, don't ya? Before you start: yes, I know full well about the memory bandwidth limitations. It's why you can't take any card seriously with less than 128-bit GDDR3, and haven't been able to for more than 5 years. But if protecting the performance of the card were an imperative, you'd mandate that it can't be sold as a GT220 without at least 128-bit GDDR3.

                              As for the new features, all I see is a new version of CUDA, and people around here care about CUDA about as much as they do PhysX, which is to say not much at all. Now, if you were to drop some non-NDA open-source 3D docs and maybe tout OpenCL a bit more, we might actually pay more attention.

                              But as it stands, if you already have an 8600GTS or 9600GT, you have no real compelling reason to upgrade, as there isn't enough of a difference to warrant the purchase.

                              Also, after years of Tom's inability to benchmark correctly, I wouldn't trust anything they post. You may remember when they were using heavily CPU-limited titles like Supreme Commander and MS Flight Sim for GPU reviews and wondered why they got little to no difference across several resolutions and performance tiers of cards; the best was MS Flight Sim, in which everything from the 8800 Ultra down to the 8400GS was getting 25FPS at every resolution.

                              Let's also not forget how they couldn't interpret their own numbers on SSD drives, or the blatant bias toward Intel and Nvidia even when their own numbers said otherwise.

                              And before you start: I'm posting this from a C2D E4300, nForce 680i LT, 8800GTS 320MB, 2x1GB Patriot DDR2 800 @ 4-4-4-12.
