NVIDIA GeForce GTX 680 To RTX 2080 Ti Graphics/Compute Performance

Written by Michael Larabel in Graphics Cards on 20 September 2018 at 03:28 PM EDT.

Yesterday brought the initial NVIDIA GeForce RTX 2080 Ti Linux benchmarks based upon my early testing of this high-end Turing graphics card paired with NVIDIA's new 410 Linux graphics driver. For your viewing pleasure today is a look at how the RTX 2080 Ti compares to the top-end cards going back to Kepler. Or, simply put, it's a GeForce GTX 680 vs. GTX 780 Ti vs. GTX 980 Ti vs. GTX 1080 Ti vs. RTX 2080 Ti comparison with OpenGL and Vulkan graphics tests as well as some initial OpenCL / CUDA tests, with more Turing GPU compute tests currently being conducted. Making this historical comparison more interesting are the power consumption and performance-per-Watt metrics as well.

With the Linux support on the GeForce RTX 2080 Ti faring well, one of the curiosity-driven tests was this comparison featuring the "[x]x80" series cards of Kepler, Maxwell, Pascal, and Turing for an interesting benchmarking look at NVIDIA graphics/compute speed going back to the GTX 680 debut in 2012. The GTX 680, GTX 780 Ti, GTX 980 Ti, GTX 1080 Ti, and RTX 2080 Ti were all tested using the newest Linux driver release, the 410.57 beta, while running on an Ubuntu 18.04 LTS box with the Linux 4.18 kernel.

NVIDIA GeForce GTX 680 To RTX 2080 Ti

The Phoronix Test Suite was used to facilitate all of these gaming/graphics benchmarks as well as the GPU compute tests. The selection of tests was made with care to pick workloads that would still run on the GTX 680 without being severely handicapped (such as by being bottlenecked on vRAM) yet would still scale up through the GeForce RTX 2080 Ti; there end up being some interesting workloads in this mix. The Phoronix Test Suite also monitored the real-time AC system power consumption via a WattsUp Pro meter on a per-test basis in order to generate accurate performance-per-Watt data, and it additionally tracked the GPU core temperature for those curious about that aspect.
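For readers unfamiliar with how a performance-per-Watt figure is derived, the basic idea is simply the benchmark result divided by the average AC power draw sampled over the test run. Below is a minimal Python sketch of that calculation; the function name and sample values are purely illustrative placeholders, not the Phoronix Test Suite implementation, which handles this automatically when paired with a supported power meter.

```python
# Minimal sketch of deriving a performance-per-Watt figure:
# divide the benchmark result by the mean power draw sampled during the run.
# All numbers below are placeholders, not results from this article.

def performance_per_watt(score: float, power_samples_w: list[float]) -> float:
    """Return the score per Watt given power readings (in Watts) taken during the run."""
    if not power_samples_w:
        raise ValueError("need at least one power sample")
    avg_power = sum(power_samples_w) / len(power_samples_w)
    return score / avg_power

if __name__ == "__main__":
    # e.g. a test that averaged roughly 250 W at the wall while delivering 120 FPS
    samples = [245.0, 252.3, 248.7, 254.1, 250.9]   # hypothetical AC power readings
    fps = 120.0                                     # hypothetical frame-rate result
    print(f"{performance_per_watt(fps, samples):.2f} FPS per Watt")
```

Note that this assumes a higher-is-better result such as frames per second; for lower-is-better results (e.g. render time) the comparison has to be inverted accordingly.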

