NVIDIA GeForce GT 220
-
Like I said, I think NVIDIA can make a good case for these cards on Linux. For the same price you've got 3 distinct options:
1. ATI with horrible/buggy 3D performance + OSS drivers
2. ATI with good (relatively) performance with binary drivers that are still incomplete/buggy
3. NV with mediocre performance but features like VDPAU in the solid binary drivers (see the quick check sketched below).
I suspect NVIDIA would win the majority of the market share based on the three options above. The problem is that on Windows the drivers are fairly even, and NV's hardware just can't match up.
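For anyone wanting to verify point 3 on their own box, here's a minimal sketch (my own, not anything official) that checks whether the loaded driver exposes an H.264 decoder via VDPAU by parsing the output of the real vdpauinfo utility; it assumes vdpauinfo is installed and an X session with the NVIDIA binary driver is running:
[CODE]
# Quick, hypothetical check for VDPAU H.264 decode support.
# Just parses the output of the real `vdpauinfo` tool, so it assumes
# vdpauinfo is installed and a VDPAU-capable driver is loaded.
import subprocess

def has_vdpau_h264():
    try:
        out = subprocess.check_output(["vdpauinfo"], text=True)
    except (OSError, subprocess.CalledProcessError):
        return False  # tool missing, or no VDPAU-capable driver
    # vdpauinfo lists decoder profiles such as H264_MAIN / H264_HIGH
    return "H264" in out

print("VDPAU H.264 decode:", "yes" if has_vdpau_h264() else "no")
[/CODE]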
Comment
-
Originally posted by GT220: So why do you still use Nvidia then? Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.
The failing GPUs are only the old 80nm parts like the 8600 (G84) and 8400 (G86), which are long discontinued, and you believe what the biggest ATI asskisser Charlie writes? LOLOL, dumber than dumb indeed you are. Most of Charlie's BS was debunked on the Beyond3D forums.
Loss of the chipset market is irrelevant; Nvidia is replacing that with revenue from the Tegra SoC, which will be far better. Intel is well known to be greedy, wanting to have the chipset-market cake and eat it too. VIA should know that; they got driven out by Intel's greed.
Loss of the chipset market is by no means irrelevant, especially since it leaves you with only a delayed GPU series and the hope that the next-gen Game Boy doesn't flop. Not everything Nintendo touches is gold: remember that the N64 did nowhere near what was expected, and that the Virtual Boy is the bastard stepchild they keep locked under the basement stairs. Let's also not forget the 800 lb gorilla in the room: Intel is looking to take the GPU market by releasing its own GPU for the low to mid range. I wouldn't expect Larrabee to take the high end until at least its third generation, but Intel having a GPU on the market that is anything better than an IGP means it can leverage itself further until it forces everyone else out.
VIA was driven out by failing at just about everything they did since they made S3TC. Though that's not to say that, with a third-party chipset and GPU, a scaled-up Nano couldn't get them back into the low-end desktop market.
Who says I'm not planning on going with ATI/AMD on my next build? I'm looking for a buyer for this box first.
Originally posted by GT220: Since you want to be an ATI fanboi so much, go and buy their DX11 card now. I'm sure you'll enjoy using their Linux drivers.
LOLOL, dumber than dumb indeed you are.
Comment
-
This card has 48 cores??? Shouldn't it be getting higher benchmarks?
Maybe the drivers or benchmarking software are not taking advantage of this? Not a bad price for a new low end card.
I have a GTX 285 now and I'm pretty sure I'm not using both cores in Jaunty Jackalope with the 185.18.36 driver.
Can't wait to see what the Nvidia open source drivers can do!
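In the meantime, if you want to see what your session is actually using, here's a rough sketch (mine, with the usual caveats) that reads the NVIDIA binary driver's /proc node and asks glxinfo for the active renderer; it assumes the proprietary driver and the real glxinfo utility (mesa-utils) are present:
[CODE]
# Sketch: report which driver/GPU the current X session is using.
# Assumes the NVIDIA proprietary driver (which exposes
# /proc/driver/nvidia/version) and the real `glxinfo` tool.
import subprocess

def nvidia_driver_version():
    try:
        with open("/proc/driver/nvidia/version") as f:
            return f.readline().strip()
    except OSError:
        return None  # binary driver not loaded

def opengl_renderer():
    try:
        out = subprocess.check_output(["glxinfo"], text=True)
    except (OSError, subprocess.CalledProcessError):
        return None
    for line in out.splitlines():
        if line.startswith("OpenGL renderer string:"):
            return line.split(":", 1)[1].strip()
    return None

print("Driver:  ", nvidia_driver_version())
print("Renderer:", opengl_renderer())
[/CODE]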
Comment
-
Nvidia is starting to look worryingly similar to 3dfx when they tried to release the Voodoo 5 6000 (albeit Nvidia is vastly larger and more efficient). I'm sure some people are rubbing their hands with glee now; Nvidia acquired a lot of bad karma and ill will when they bought 3dfx...
Originally posted by smitty3268:
Originally posted by GT220: GT 220 is competitive enough. Even on Windows ATI sucks badly in the driver dept, Nvidia's DXVA decoding support and compatibility is far superior to ATI's. There's plenty of H.264 encodes that ATI cannot decode on their shitty UVD/UVD2 that Nvidia's first-generation VP2 can handle with the latest drivers.
Take this "shitty drivers on Windows" argument: he forgot to mention that Nvidia's drivers were responsible for three times as many crashes as Ati's in 2007 (28.8% vs. 9.3%). (If you have any more recent statistics, I'd love to hear them!)
My personal experience is that NV drivers were beyond shitty on Vista for over a year after its release (Ati's drivers worked pretty much fine from Vista RC1 onward). Even now, multi-monitor support is a crapshoot, with Nvidia's drivers forgetting or setting the wrong resolution time after time (Ati's drivers work correctly in the same setup).
Originally posted by cliff: This card has 48 cores??? Shouldn't it be getting higher benchmarks?
This is a low-end card with low-end performance. For comparison, Ati's flagship (the 5870) has 1600 cores, each more capable than this card's paltry 48. Even that doesn't tell the whole story: the 5870 uses GDDR5 memory (vs DDR2 or GDDR3 for this card) and its architecture is vastly different, both more efficient and more capable (Shader Model 5.0 vs 4.1). For a rough sense of the raw gap, see the back-of-envelope numbers below.
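Here's the arithmetic; the shader clocks are the commonly quoted reference specs (my assumption, individual boards vary) and the formula counts one multiply-add, i.e. 2 FLOPs, per ALU per clock:
[CODE]
# Rough theoretical peak-FLOPS comparison. Assumed reference shader
# clocks: GT 220 ~1.36 GHz, HD 5870 850 MHz; 2 FLOPs/ALU/clock (MAD).
def peak_gflops(alus, shader_clock_ghz, flops_per_clock=2):
    return alus * shader_clock_ghz * flops_per_clock

print("GT 220 :", peak_gflops(48, 1.36), "GFLOPS")    # ~130
print("HD 5870:", peak_gflops(1600, 0.85), "GFLOPS")  # ~2720
[/CODE]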
As I said, you won't find any truth in a shill's words that hasn't been twisted beyond recognition.
The bottom line is that this card is a marginal improvement over Nvidia's previous low-end hardware. It's mostly an attempt to increase margins for Nvidia and tide them over to the GT3xx release. There's no point in upgrading if you are already using a 8600/9500/9600 and there's little point in putting this card into an HTPC as long as it has a fan.
I'd only recommend this card if (a) you can find it fanless and (b) you want to build a low-powered Linux box. In all other cases, there are better choices than the GT220:
- Nvidia's 9500 for a fanless, low-power Linux box.
- Ati's 4670 for a fanless, low-power Windows box that can also play some games.
- Ati's 5xx0 series for more serious gaming.
Comment
-
Originally posted by GT220: GT 220 is competitive enough. Even on Windows ATI sucks badly in the driver dept, Nvidia's DXVA decoding support and compatibility is far superior to ATI's. There's plenty of H.264 encodes that ATI cannot decode on their shitty UVD/UVD2 that Nvidia's first-generation VP2 can handle with the latest drivers.
Comment
-
With AMD's Windows drivers, H.264 decoding via DXVA is restricted to level 4.1, while NVidia's drivers can decode up to level 5.0 (like VDPAU). I guess this is just a software issue though.
It's clear that AMD has the superior hardware at the moment, but with the crappy driver situation it's simply a no-go.
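If you want to know in advance whether a given encode will hit that level-4.1 wall, here's a small sketch (my own) that reads a stream's H.264 level with the real ffprobe tool from FFmpeg; ffprobe reports the level as an integer, e.g. 41 for 4.1:
[CODE]
# Sketch: check an H.264 file's level to predict whether a
# level-4.1-capped hardware decoder (DXVA on UVD/UVD2 here) can
# handle it. Requires the real `ffprobe` utility from FFmpeg.
import subprocess, sys

def h264_level(path):
    out = subprocess.check_output([
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries", "stream=level",
        "-of", "default=noprint_wrappers=1:nokey=1", path,
    ], text=True).strip()
    return int(out)  # e.g. 41 for level 4.1, 50 for level 5.0

lvl = h264_level(sys.argv[1])
print("H.264 level %.1f: %s" % (lvl / 10.0,
      "within level 4.1" if lvl <= 41 else "exceeds level 4.1"))
[/CODE]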
Comment