NVIDIA GeForce GT 220


  • #11
    Originally posted by [Knuckles]
    I agree. These kinds of tests would be especially interesting to run with a crappy CPU to demonstrate the advantage of this approach, not with a Core i7 monster that most people don't have.
    +1. Maybe a P4 HT, Celeron D, or Athlon 64 box with a gig of DDR400 and a 7200rpm HDD with only 8MB of cache. That and an Atom; might have to get some cards that are on plain old PCI, like the 9400GT PCI, to make it even more interesting.

    Or are we waiting for XvBA to drop so we can test it on an HD2400XT AGP with a P3 600?



    • #12
      The Nvidia GeForce GT 220 is not a rebadged card whatsoever. It's based on the same architecture that powers the GT200 GPUs, it has double the register file space of the GeForce 8/9 GPUs, and it carries the brand-new VP4 decoder unit that can decode MPEG-4 ASP.

      Device 0: "GeForce GT 220"
      CUDA Driver Version: 2.30
      CUDA Runtime Version: 2.30
      CUDA Capability Major revision number: 1
      CUDA Capability Minor revision number: 2
      Total amount of global memory: 1073414144 bytes
      Number of multiprocessors: 6
      Number of cores: 48
      Total amount of constant memory: 65536 bytes
      Total amount of shared memory per block: 16384 bytes
      Total number of registers available per block: 16384
      Warp size: 32
      Maximum number of threads per block: 512
      Maximum sizes of each dimension of a block: 512 x 512 x 64
      Maximum sizes of each dimension of a grid: 65535 x 65535 x 1
      Maximum memory pitch: 262144 bytes
      Texture alignment: 256 bytes
      Clock rate: 1.36 GHz
      Concurrent copy and execution: Yes
      Run time limit on kernels: Yes
      Integrated: No
      Support host page-locked memory mapping: Yes
      Compute mode: Default (multiple host threads can use this device simultaneously)
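
      If you want to check this on your own card, here's a minimal sketch against the CUDA runtime API that prints the same compute-capability fields as the deviceQuery output above; it's just standard cudaGetDeviceProperties usage, nothing GT 220 specific.

      #include <cstdio>
      #include <cuda_runtime.h>

      int main()
      {
          int count = 0;
          cudaGetDeviceCount(&count);
          for (int dev = 0; dev < count; ++dev) {
              cudaDeviceProp p;
              cudaGetDeviceProperties(&p, dev);
              // Compute capability 1.2 is what separates the GT 220 from
              // the older GeForce 8/9 (1.0/1.1) parts.
              printf("Device %d: \"%s\", compute %d.%d, %d MPs, %zu B shared/block, %d regs/block\n",
                     dev, p.name, p.major, p.minor, p.multiProcessorCount,
                     p.sharedMemPerBlock, p.regsPerBlock);
          }
          return 0;
      }
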
      Here are a few things you can do on the GT 220 that you can't do on previous-generation GPUs.

      Specifications for Compute Capability 1.2
      Support for atomic functions operating in shared memory and atomic functions operating on 64-bit words in global memory (see Section B.10);
      Support for warp vote functions (see Section B.11);
      The number of registers per multiprocessor is 16384;
      The maximum number of active warps per multiprocessor is 32;
      The maximum number of active threads per multiprocessor is 1024.
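
      To make those bullet points concrete, here's a rough kernel sketch of my own (not from any of the linked reviews) that uses exactly the 1.2-only features quoted above: a shared-memory atomic, a warp vote, and a 64-bit global atomic. Host-side launch code is omitted; it should build with a period-appropriate toolkit as nvcc -arch=sm_12 count.cu.

      // Count how many of n floats exceed a threshold.
      __global__ void count_over(const float *in, int n, float thresh,
                                 unsigned long long *total)
      {
          __shared__ unsigned int s_count;   // per-block hit counter
          if (threadIdx.x == 0) s_count = 0;
          __syncthreads();

          int i = blockIdx.x * blockDim.x + threadIdx.x;
          int hit = (i < n) && (in[i] > thresh);

          // Warp vote (1.2+): warps with no hits skip the atomic entirely.
          if (__any(hit) && hit)
              atomicAdd(&s_count, 1u);       // shared-memory atomic (1.2+)

          __syncthreads();
          if (threadIdx.x == 0)              // 64-bit global atomic (1.2+)
              atomicAdd(total, (unsigned long long)s_count);
      }
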
      Nvidia has also improved H.264 decoding performance: VP4 can do 64 FPS, while the older-generation hardware with VP2 (mostly GeForce 8/9 GPUs) can only do 45 FPS and VP3 (8200/8300 chipset, 9300/9400/Ion chipset, G98 GPU) only about 51 FPS.

      NVIDIA(0): NVIDIA GPU GeForce 210 (GT218) at PCI:4:0:0 (GPU-0)
      H264 DECODING (1920x1080): 64 frames/s

      NVIDIA(0): NVIDIA GPU ION (C79) at PCI:3:0:0 (GPU-0)
      H264 DECODING (1920x1080): 51 frames/s

      NVIDIA(0): NVIDIA GPU GeForce 9600 GT (G94) at PCI:1:0:0 (GPU-0)
      H264 DECODING (1920x1080): 45 frames/s
      The reason this particular card has lower performance is that it only has DDR2 memory instead of GDDR3; with half the bandwidth, it's crippled. The GDDR3 versions of the GT 220 have much higher performance than the DDR2 cards. See the benchmarks on Tomshardware; it comes very close to the 4670 in performance.
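
      (Rough arithmetic, assuming the usual 128-bit bus on both variants and ballpark effective rates of 800 MT/s for DDR2 versus 1600 MT/s for GDDR3: bandwidth = bus width in bytes x data rate, so 16 B x 800 M/s is about 12.8 GB/s against 16 B x 1600 M/s, about 25.6 GB/s; that's the 2x gap in question.)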

      Nvidia is introducing its first 40nm GPUs with DirectX 10.1 support. But the GeForce 210 and GT 220 GPUs aren't the flagships you might expect the company to announce hot on the heels of Radeon HD 5870. Rather, these are entry-level offerings under $80.


      This review is also somewhat disappointing because it doesn't even use the latest Nvidia 190.42 drivers.



      The people in this forum seem pretty clueless; if you don't know anything, don't even bother to post and spread misinformation.
      Last edited by GT220; 22 October 2009, 03:44 AM.



      • #13
        Originally posted by GT220
        See the benchmarks on Tomshardware, it comes very close to the 4670 in performance.
        And Anandtech shows the GT 220 getting decimated by the 4670, especially at resolutions above 1024x768. And they're both at about the same price. Also, there are passive 4670s available.

        We may see the 56xx released by year's end (?). AMD would have to really bomb not to wipe the floor with the GT 220.



        • #14
          Anandtech reviews a card with half the memory, not to mention they don't even use the latest 191.07 drivers like Tomshardware does. And don't even expect the 5600; it won't be out until next year. Besides, ATI's drivers, open or closed source, are just laughably bad on Linux, and there are no drivers at all for FreeBSD or Solaris. Nvidia just released 190.40 for Linux 32-bit/64-bit, FreeBSD, and Solaris.

          Found another mistake in Phoronix's review.

          The graphics card supports HDMI audio, however, there is no integrated audio processor, but a HDA or SPDIF header must be connected from an audio source.
          Compare with Anandtech.

          Moving on, the other new HTPC feature is that NVIDIA has finally stepped up their game with respect to HDMI audio on cards with discrete GPUs. Gone is the S/PDIF cable to connect a card to an audio codec, which means NVIDIA is no longer limited to 2-channel LPCM or 5.1 channel DD/DTS for audio. Now they are passing audio over the PCIe bus, which gives them the ability to support additional formats. 8 channel LPCM is in, as are the lossy formats DD+ and 6 channel AAC.
          And the first two slides from PCPer both mention a built-in audio processor.



          The GT 220 cards don't even have any HDA or SPDIF headers to connect to.
          Last edited by GT220; 19 October 2009, 08:58 PM.



          • #15
            Originally posted by GT220
            The Nvidia GeForce GT 220 is not a rebadged card whatsoever. It's based on the same architecture that powers the GT200 GPUs, it has double the register file space of the GeForce 8/9 GPUs, and it carries the brand-new VP4 decoder unit that can decode MPEG-4 ASP.



            Here are a few things you can do on the GT 220 that you can't do on previous-generation GPUs.



            Nvidia has also improved H.264 decoding performance: VP4 can do 64 FPS, while the older-generation hardware with VP2 (mostly GeForce 8/9 GPUs) can only do 45 FPS and VP3 (8200/8300 chipset, 9300/9400/Ion chipset, G98 GPU) only about 51 FPS.



            The reason this particular card has lower performance is that it only has DDR2 memory instead of GDDR3; with half the bandwidth, it's crippled. The GDDR3 versions of the GT 220 have much higher performance than the DDR2 cards. See the benchmarks on Tomshardware; it comes very close to the 4670 in performance.

            Nvidia is introducing its first 40nm GPUs with DirectX 10.1 support. But the GeForce 210 and GT 220 GPUs aren't the flagships you might expect the company to announce hot on the heels of Radeon HD 5870. Rather, these are entry-level offerings under $80.


            This review is also somewhat disappointing because it doesn't even use the latest Nvidia 190.40 drivers.



            The people in this forum seem pretty clueless; if you don't know anything, don't even bother to post and spread misinformation.
            Whatever, skippy, gotta get on that damage control now, don't ya? Before you start: yes, I know full well about the memory bandwidth limitations; it's why you can't take any card seriously with less than 128-bit GDDR3, and haven't been able to for more than 5 years. But if protecting the performance of the card were an imperative, you'd mandate that the card can't be sold as a GT220 without at least 128-bit GDDR3.

            As for the new features, all I see is a new version of CUDA, and people around here care about as much about CUDA as they do about PhysX, which is to say not much at all. Now, if you were to drop some non-NDA open-source 3D docs and maybe tout OpenCL a bit more, we might actually pay more attention.

            But as it stands, if you already have an 8600GTS or 9600GT, you have no real compelling reason to upgrade, as there isn't enough of a difference to warrant the purchase.

            Also, after years of Tom's inability to benchmark correctly, I wouldn't trust anything they post. You may remember when they were using heavily CPU-limited titles like Supreme Commander and MS Flight Sim for GPU reviews and wondered why they got little to no difference across several resolutions and tiers of card; the best was MS Flight Sim, in which everything from the 8800 Ultra down to the 8400GS was getting 25FPS at every resolution.

            Let's also not forget how they couldn't interpret their own numbers on SSDs, or the blatant bias toward Intel and Nvidia even when their own numbers said otherwise.

            And before you start: I'm posting this from a C2D E4300, nForce 680i LT, 8800GTS 320MB, and 2x1GB Patriot DDR2 800 @ 4-4-4-12.



            • #16
              Nobody is doing damage control. Nvidia provides a much-needed improved GPU for the low-end market, with better performance, feature set, and power consumption. Obsolete previous-gen products like the 8600 and 9500 are as good as end-of-life and discontinued. Even Anandtech praised it for its low power consumption.

              Similarly we’re happy to see DirectX 10.1 support arrive on an NVIDIA part, and the 7W idle power usage on this card is amazing.
              And you're the clueless one: AMD has over 5 billion dollars in debt and has not had a profit since 2006 (Core 2 Duo, Quad, and Core i7 destroyed AMD, and Sandy Bridge looks to do more of the same), while Nvidia is fully profitable, has no debt, and has great products in the pipeline like the next-gen 40nm Tegra SoC. Not to mention the fact that Oak Ridge National Laboratory already gave Nvidia a huge design win by selecting Fermi. Wanna guess which company will go down first?



              When will AMD actually make money again? The question is becoming more important by the day since it carries over $5 billion in long-term debt.

              After losing almost $3 billion from 2007 – 2008, analysts expect the company to lose more money in 2009 and 2010.

              While the shares rallied from their February $2 low, they still appear stuck in a long-term down trend from $40 highs way back in 2006.
              Last edited by GT220; 19 October 2009, 09:22 PM.



              • #17
                Originally posted by GT220
                Nobody is doing damage control. Nvidia provides a much-needed improved GPU for the low-end market, with better performance, feature set, and power consumption. Obsolete previous-gen products like the 8600 and 9500 are as good as end-of-life and discontinued. Even Anandtech praised it for its low power consumption.



                And you're the clueless one: AMD has over 5 billion dollars in debt and has not had a profit since 2006 (Core 2 Duo, Quad, and Core i7 destroyed AMD, and Sandy Bridge looks to do more of the same), while Nvidia is fully profitable, has no debt, and has great products in the pipeline like the next-gen 40nm Tegra SoC. Not to mention the fact that Oak Ridge National Laboratory already gave Nvidia a huge design win by selecting Fermi. Wanna guess which company will go down first?

                http://finance.yahoo.com/tech-ticker...S,M,GT,MYL,HTZ
                And Nvidia has been pushed out of the chipset market for everything but the ION, which will be getting the axe as soon as the SoC Atoms are released; has a long-running problem of failing GPUs, the most recent example being Sony (http://www.semiaccurate.com/2009/08/...dia-notebooks/); and has generally been making an ass of itself all over the internet for the last month, for reasons I posted earlier.

                Originally posted by GT220
                Nobody is doing damage control

                Could have fooled me.



                • #18
                  So why do you still use Nvidia then? Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.

                  The failing GPUs are only the old 80nm parts like the 8600 (G84) and 8400 (G86), which are long discontinued, and you believe what the biggest ATI asskisser Charlie writes? LOLOL, dumber than dumb indeed you are. Most of Charlie's BS lies were debunked on the Beyond3D forums.

                  Loss of the chipset market is irrelevant; Nvidia is replacing that with revenue from the Tegra SoC, which will be far better. Intel is well known to be greedy, wanting to have the chipset-market cake and eat it too. VIA should know; they got driven out by Intel's greed.
                  Last edited by GT220; 19 October 2009, 09:50 PM.



                  • #19
                    When people start talking about a company's stock price and revenue numbers, it's a bad sign. It means they can't come up with anything to say about the hardware.

                    This card is actually a pretty decent buy on Linux because of the VDPAU support that its competition lacks. Unfortunately for NVIDIA, it's nowhere close to being competitive on Windows, where they make the majority of their sales.
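
                    For anyone curious what that VDPAU support looks like from the programming side, here's a rough sketch against libvdpau's public C API that asks the driver whether it can decode H.264 High profile, much like what vdpauinfo reports; build with -lvdpau -lX11. Treat it as an illustration, not a polished tool.

                    #include <cstdio>
                    #include <X11/Xlib.h>
                    #include <vdpau/vdpau.h>
                    #include <vdpau/vdpau_x11.h>

                    int main()
                    {
                        Display *dpy = XOpenDisplay(NULL);
                        if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

                        VdpDevice dev;
                        VdpGetProcAddress *get_proc;
                        if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc) != VDP_STATUS_OK) {
                            fprintf(stderr, "no VDPAU driver for this GPU\n");
                            return 1;
                        }

                        // All other entry points are fetched through get_proc_address.
                        VdpDecoderQueryCapabilities *query;
                        get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

                        VdpBool ok;
                        uint32_t max_level, max_mbs, max_w, max_h;
                        query(dev, VDP_DECODER_PROFILE_H264_HIGH,
                              &ok, &max_level, &max_mbs, &max_w, &max_h);

                        printf("H.264 High profile decode: %s (up to %ux%u)\n",
                               ok ? "yes" : "no", max_w, max_h);
                        return 0;
                    }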



                    • #20
                      The GT 220 is competitive enough. Even on Windows, ATI sucks badly in the driver department; Nvidia's DXVA decoding support and compatibility are far superior to ATI's. There are plenty of H.264 encodes that ATI cannot decode on their shitty UVD/UVD2 that Nvidia's first-generation VP2 can handle with the latest drivers.

