NVIDIA GeForce Power Efficiency: From The 6600GT "NV43" To The GTX 750 Ti "Maxwell"
When NVIDIA was doing their press briefings for the new Maxwell architecture, they frequently talked up its power efficiency and claimed it to be four times greater than where it was four years ago with Fermi... But how does Maxwell's power efficiency compare to NVIDIA hardware from ten years ago? In this article we have done fresh benchmarks -- with power consumption, thermal, and performance-per-Watt measurements -- of NVIDIA's mid-range graphics cards, from the week-old GeForce GTX 750 Ti as far back as the GeForce 6600GT (NV43) from 2004.
First of all, if you didn't already read our GM107 Maxwell review, be sure to read NVIDIA GeForce GTX 750 Ti "Maxwell": A Great Mid-Range GPU For Linux Users. That article has benchmarks from 21 different NVIDIA GeForce and AMD Radeon graphics cards on Linux, so even if you're not interested in the GM107 specifically, you can at least see some new Linux GPU numbers at large. Prior to the weekend I also published many more NVIDIA Maxwell Linux benchmarks from Ubuntu with NVIDIA's Linux graphics driver.
Curious to see how the power efficiency of NVIDIA's hardware has evolved over the years, I ran benchmarks of all the "mid-range" NVIDIA GeForce GPUs in my possession going back to the GeForce 6600GT (loosely mid-range; there were also some higher-end and lower-end parts depending upon the GPUs I had from each series).
The NVIDIA PCI Express graphics cards I was able to test with this mid-range focus were:
- GeForce 6600GT
For the GeForce 8 series through the GeForce GTX 750 Ti, the NVIDIA 334.16 Beta driver was used for its Maxwell support. When testing the GeForce 6600GT, the NVIDIA 304.119 Linux driver was used instead, since GeForce 6 series support was dropped from the mainline Linux driver after the 304 driver series.
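For readers unfamiliar with the performance-per-Watt metric used throughout these tests, it is simply the benchmark result (e.g. average frame rate) divided by the average power draw during the run. Below is a minimal illustrative sketch of that calculation; the card names and numbers are made-up placeholders, not results from this article:

```python
def performance_per_watt(avg_fps, avg_power_watts):
    """Performance-per-Watt: average frame rate divided by average power draw."""
    return avg_fps / avg_power_watts

# Placeholder (fps, Watts) pairs purely for illustration:
cards = {
    "Card A": (30.0, 60.0),
    "Card B": (120.0, 55.0),
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {performance_per_watt(fps, watts):.2f} FPS/Watt")
```

A card that delivers the same frame rate at half the power draw thus scores twice the performance-per-Watt, which is how an efficiency-focused part can beat a faster but hungrier one on this metric.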