NVIDIA GeForce GT 220
Originally posted by L33F3R
The NVIDIA fanboy has a point, though. There are a lot of ATI fanboys here. Some people are here for the open driver religiously, and that's good for 2D apps. But others worship ATI like some sort of god.
The worst that ever happens here is trolling on each other's driver threads. It usually goes like this:
- NV worshipper: hey, what's up, doc? Check out this awesome H.264 video! 5% CPU usage here; what's yours? What, you don't have VDPAU yet? Losers! Your blob sucks!
- AMD worshipper: yes, but watch our open drivers whoop the ass out of your binary blob. Awesome 2D performance, KMS *and* support for new kernels before they are even written!
- NV worshipper: ha! Our wine tastes better.
- AMD worshipper: not for long. Besides, our driver rises faster and lasts longer than yours.
- Intel worshipper: (timidly) err, may I say something?
- Both: SHUT UP!
Originally posted by Ant P.
*snip* The GT220 is not only uncompetitively slower than an equally priced Radeon; it is crippled with half the memory. *snip*
Originally posted by L33F3R
The NVIDIA fanboy has a point, though. There are a lot of ATI fanboys here. Some people are here for the open driver religiously, and that's good for 2D apps. But others worship ATI like some sort of god.
The latest open-source AMD driver supports all cards up to R7xx in ***3D***, and thanks to architectural similarities from R6xx through R8xx, support is expected to extend to the very latest cards very soon.
I don't know about you, but it is kinda nice when you don't need to worry about whether NVIDIA keeps up with the latest kernel and X.Org versions... or whether NVIDIA decides to drop support for your card into "legacy" (aka dead).
And how about this: when you are playing a video, WHO CARES how much CPU it uses? As long as it doesn't PEG your CPU, the video will continue to play back smoothly. Any recent multi-core CPU can handle HD playback entirely in software, but that is not where AMD is stopping.
You see, right now is a VERY exciting time for AMD drivers. R6xx/R7xx 3D support has just hit, and the R300g (Gallium3D) driver for R300-R500 hardware is coming along while the R6xx Mesa driver is being optimized and debugged. Soon work will begin on the R600g driver, and Gallium3D will be used for (at least partial) video acceleration, which will bring HD video decoding down to a light load even on an older single-core CPU. And does it really matter if your CPU sits at 10% versus NVIDIA's 5%? It doesn't make *any* difference in playback.
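The playback claims above are easy to check for yourself. A minimal sketch, assuming mplayer is installed, an NVIDIA binary driver with VDPAU support is loaded, and a local H.264 file named sample.mkv (a hypothetical filename):

```shell
# Hardware-accelerated H.264 playback via VDPAU (NVIDIA blob only);
# the trailing comma lets mplayer fall back to software decode if
# ffh264vdpau cannot handle the stream:
mplayer -vo vdpau -vc ffh264vdpau, sample.mkv

# Pure software decode of the same file, for comparison:
mplayer -vo xv sample.mkv

# In another terminal, snapshot mplayer's CPU share while each runs:
top -b -n 1 | grep mplayer
```

If neither invocation pegs a core, playback is smooth either way, which is exactly the poster's point about 5% versus 10% CPU usage.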
Tell me this: how will NVIDIA compete with that? When AMD hardware *just works* for *everything*, NVIDIA will still be stuck in its binary-blob world, and its customers will still face the "what if NVIDIA doesn't keep supporting my card [fast enough]?" problem.
Don't get me wrong... I used to use NVIDIA hardware and would avoid ATI at all costs. But things have changed and continue to change. The last two NVIDIA devices I bought were a 7800GTX (when it was top-end new stuff) and a 6100 (laptop). They worked better than anything else available at the time, and they still work and are in use. BUT the drivers have been a nightmare (as an example, there was a stretch of about six months when the NVIDIA drivers would entirely crap out when switching VTs or going into DPMS on the 6100), and since AMD promises long-term support (through open-source drivers) ***AND HAS UNDENIABLY DELIVERED***, that is where I am now putting my money.
Point is this:
GIVEN equivalent hardware (I'm not interested in getting into quality or performance arguments by any means), I will select the device that promises the better overall long-term experience, and right now that happens to be AMD. So to hell with fanboys; at the moment, AMD happens to be the best choice. Who knows? Maybe next year VIA will pull a rabbit out of its a$$ and make a discrete graphics card that blows everything else away, complete with clean, efficient, and complete open-source Gallium3D drivers.
While I appreciate the benchmarking that Phoronix does, I have to question why. It may be interesting to compare one Linux distro's performance against another, or against Windows or even OS X. However, testing video cards seems a bit of a waste, as Linux in general, and Linux gaming in particular, is still stuck at the DX9 level, to use a Windows reference.
The only real differences between modern cards are the DX level they understand and the overall power of the GPU, and both of these factors are irrelevant to Linux. I still have an old EVGA 7600GT, and just purchased a new Zotac 7600GS for my spare machine. Both are more than adequate for anything Linux is capable of.
Originally posted by tlmck
While I appreciate the benchmarking that Phoronix does, I have to question why. It may be interesting to compare one Linux distro's performance against another, or against Windows or even OS X. However, testing video cards seems a bit of a waste, as Linux in general, and Linux gaming in particular, is still stuck at the DX9 level, to use a Windows reference.
The only real differences between modern cards are the DX level they understand and the overall power of the GPU, and both of these factors are irrelevant to Linux. I still have an old EVGA 7600GT, and just purchased a new Zotac 7600GS for my spare machine. Both are more than adequate for anything Linux is capable of.
You are using your own experience and needs to extrapolate for other users. Needless to say, the results of this process aren't always applicable.