With the OpenGL performance benchmarks out of the way, the Phoronix Test Suite was used again to re-run a selection of the OpenGL tests while automatically monitoring the GPU temperature and system power consumption (via a USB-based WattsUp power meter) using our open-source software.
Running these tests is simply a matter of running MONITOR=gpu.temp,sys.power phoronix-test-suite benchmark 1110138-AR-GTX550TIS65.
When the system was idling, the GeForce GTX 550 Ti had an average core temperature of 41°C and a peak of 44°C, compared to an average of just 29°C for the Radeon HD 6770 and 36°C for the GeForce GTX 460. The factory-overclocked EVGA GeForce GTX 550 Ti 1GB was the warmest card at idle, running hotter even than the passively-cooled GeForce GT 520. The Phoronix Test Suite reads the NVIDIA GPU core temperature through the NV_CONTROL extension exposed by nvidia-settings, while the Radeon core temperature is read through the driver's AMD OverDrive interfaces.
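The same sensors can be polled by hand with the vendor driver tools. The nvidia-settings query below follows the NV_CONTROL path described above; the aticonfig output format is an assumption based on fglrx's OverDrive tooling, so treat the small helper as a sketch rather than the Test Suite's actual sensor code.

```shell
# Manual equivalents of the sensors the Phoronix Test Suite polls. The
# nvidia-settings query is the NV_CONTROL path; the aticonfig line and its
# output format are assumptions based on the fglrx OverDrive tools:
#
#   nvidia-settings -q GPUCoreTemp -t    # NVIDIA core temperature in C
#   aticonfig --odgt                     # Radeon OverDrive temperature readout
#
# Hypothetical helper: pull the number out of an aticonfig-style line
# (assumed format: "Sensor 0: Temperature - 41.00 C").
radeon_temp() {
  echo "$1" | sed -n 's/.*Temperature - \([0-9.]*\) C.*/\1/p'
}

radeon_temp 'Sensor 0: Temperature - 41.00 C'   # prints 41.00
```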
While idling, the Intel Core i5 2500K system with the GeForce GTX 550 Ti consumed 79 Watts on average. With the Radeon HD 6770 installed the system drew 85 Watts, 78 Watts with the Radeon HD 6570, and 85 Watts with the GeForce GTX 460. Fortunately, both the AMD and NVIDIA binary drivers support dynamic power management under Linux. The Phoronix Test Suite interfaces with the WattsUp USB power meter to read the system's AC power draw.
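For readers curious how those power numbers are collected, the sketch below shows one way a reading from a WattsUp-style serial device could be parsed. The "#d" packet layout, with watts reported in tenths of a Watt in the fourth field, is an assumption about the meter's protocol rather than something documented here; the Phoronix Test Suite's own sensor code is the authoritative implementation.

```shell
# Hypothetical sketch of parsing a WattsUp data packet. The meter streams
# comma-separated "#d" packets over its USB serial port (e.g. /dev/ttyUSB0);
# the field layout -- watts in tenths of a Watt in field 4 -- is assumed.
wattsup_watts() {
  echo "$1" | awk -F, '$1 == "#d" { printf "%.1f\n", $4 / 10 }'
}

# Example packet carrying a 790 tenth-watt (79.0 W) reading:
wattsup_watts '#d,-,18,790,1214,657'   # prints 79.0
```

In practice one would read the packets continuously from the serial device and feed each line through a parser like this, averaging over the benchmark run.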