1080p NVIDIA Linux Comparison From GeForce 8 To GeForce 900 Series
Written by Michael Larabel in Graphics Cards on 23 January 2016.

Earlier this week I carried out an OpenGL performance comparison of NVIDIA GPUs going back 10 years, covering 27 different graphics cards from the GeForce 8 series through the latest-generation GeForce 900 "Maxwell" graphics cards. This weekend article offers complementary results from that comparison, with the OpenGL benchmarks re-run at 1920 x 1080.

The OpenGL benchmarks done earlier this week for the 27-way GeForce graphics card comparison were run at 2560 x 1600, since that was the highest resolution on the Samsung SyncMaster 30 display that every card going back to the GeForce 8 series could mode-set to. I didn't use the more common 1920 x 1080, since at that resolution many of the newer NVIDIA GPUs become more CPU bound, making it impossible to fully stress the Kepler and Maxwell graphics cards that can now often handle even 4K gaming with ease. However, 2560 x 1600 is rather stressful for the older GeForce 8/9 series cards, where 256~512MB of video RAM was common. So, honoring the feedback from readers curious about 1920 x 1080 results where the older GPUs could cope better, I ran these follow-up tests.

With a subset of the graphics cards used for the major comparison earlier this week, I re-ran the benchmarks at 1920 x 1080 in BioShock Infinite, Metro Last Light Redux, OpenArena, Unigine Tropics, Unigine Valley, and Xonotic. Thanks to the test automation of the Phoronix Test Suite, carrying out a second round of this NVIDIA GPU benchmarking was very easy.
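For readers wanting to script a similar run themselves, here's a minimal sketch using the Phoronix Test Suite's non-interactive batch mode. The `pts/` test profile names below are assumptions on my part; confirm the current names with `phoronix-test-suite list-available-tests` before running, and run `phoronix-test-suite batch-setup` once beforehand to configure the batch-mode defaults.

```shell
#!/bin/sh
# Sketch: queue up the 1080p OpenGL benchmarks with the Phoronix Test Suite.
# NOTE: the pts/ profile names are assumptions -- verify them first with
# `phoronix-test-suite list-available-tests`.
TESTS="pts/bioshock-infinite pts/openarena pts/unigine-tropics pts/unigine-valley pts/xonotic"

# Build the command list first; DRY_RUN=1 (the default) just prints the
# commands, since a real run needs the games installed and takes hours
# per graphics card.
DRY_RUN=${DRY_RUN:-1}
CMDS=""
for t in $TESTS; do
    CMDS="$CMDS
phoronix-test-suite batch-benchmark $t"
done

if [ "$DRY_RUN" = 1 ]; then
    echo "$CMDS"        # show what would be run
else
    eval "$CMDS"        # run each benchmark in sequence
fi
```

Swapping cards between runs then only means a shutdown, a hardware swap, and re-launching the same script, with the results accumulating under the same result file name given during batch setup.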

The smaller set of graphics cards I used for this comparison included the GeForce 8500 GT, 8600 GT, 8600 GTS, 9600 GSO, 9800 GTX, GT 220, GT 240, GTX 460, GT 520, GTX 550 Ti, GT 610, GTX 650, GTX 680, GTX 750, GTX 750 Ti, GTX 760, GTX 950, GTX 960, GTX 970, and GTX 980. Some of the higher-end graphics cards were left out since they're simply more CPU bound at 1080p.

With this second round of testing I also didn't do any performance-per-Watt tests. So if you are interested in the full 27-way comparison and the power efficiency results, see the NVIDIA 10-year Linux OpenGL comparison from earlier this week for a full understanding of how NVIDIA GPU performance has evolved.

On the Ubuntu 15.10 64-bit system, the NVIDIA 358.16 driver was used for testing the cards of the GeForce GTX 400 "Fermi" series and newer, while the legacy NVIDIA 340.96 driver was used for the pre-Fermi graphics cards. If you like all of the Linux hardware benchmarking we do daily at Phoronix, consider joining Phoronix Premium by taking part in this week's special deal.