When AMD unveiled the ATI Radeon HD 4850 and ATI Radeon HD 4870 last month, NVIDIA was left in an awkward position. The Radeon HD 4850 sharply outperformed the (at the time, more expensive) GeForce 9800GTX, which led NVIDIA to immediately begin slashing prices and to introduce the slightly faster GeForce 9800GTX+ with ramped-up clock frequencies. ATI's flagship Radeon HD 4870 also had no problem competing with the more expensive GeForce GTX 260 / 280. As a result, many of NVIDIA's partners have slashed prices on their earlier GeForce 8 and 9 series products.

One NVIDIA product previously considered a good budget graphics card is the GeForce 8800GT, but how does it stand up against the latest from ATI and NVIDIA? In this article we are looking at the ECS GeForce 8800GT. What is particularly special about this card, and some of the other newer models shipping with the GeForce 8800GT GPU, is a BIOS revision that should yield a performance increase.
- Fully Supports Microsoft DirectX 10.0, OpenGL 2.0
- PCI Express 2.0 Interface
- 256MB GDDR3 RAM with 256-bit Data Bus
- Dual DVI-I With HDCP Support
- HDTV Output Support
- NVIDIA SLI Multi-GPU Support
- NVIDIA PureVideo HD Support
- 600MHz GPU Clock
- 1400MHz (700MHz x 2) Memory Clock
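From the figures above, the card's theoretical peak memory bandwidth follows directly from the 256-bit bus and the 1400MHz effective memory clock. A minimal sketch of that arithmetic (the formula is the standard bus-width-times-transfer-rate calculation, not anything vendor-specific):

```python
# Theoretical memory bandwidth from the listed specs:
# 256-bit data bus, 700MHz GDDR3 clock (1400MT/s effective).
bus_width_bits = 256
effective_rate_mt_s = 1400  # double data rate: 700MHz x 2

# Bandwidth (GB/s) = bus width in bytes * effective transfer rate (MT/s) / 1000
bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_mt_s / 1000
print(f"Theoretical memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # 44.8 GB/s
```

At roughly 44.8 GB/s, this 256MB model sits at the same bandwidth as the 512MB 8800GT variants, since both share the 256-bit bus and memory clock.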
The box for the NVIDIA GeForce 8800GT 256MB from Elitegroup Computer Systems is bigger than what we would normally see for a graphics card of this size. In addition, the box is designed to stand up vertically, unlike most retail graphics card packages. On the front of the ECS packaging is a mythical character atop a flying dragon, while the top and bottom sides carry various emblems representing the features of this budget graphics card. Included with the ECS N8800GT-256MX were a power adapter that converts two 4-pin Molex connectors to a single 6-pin PCI-E connector, an HDTV output adapter, a DVI-to-VGA adapter, a VGA driver CD, and the ECS user's manual. No DVI-to-HDMI adapter was included, nor any of the other accessories that are common with some of the more expensive graphics cards.