Initial NVIDIA GeForce RTX 2080 Ti Linux Benchmarks



    Phoronix: Initial NVIDIA GeForce RTX 2080 Ti Linux Benchmarks

    Here are the first of many benchmarks of the GeForce RTX 2080 Ti "Turing" graphics card under Linux with this initial piece exploring the OpenGL/Vulkan gaming performance.


  • #2
    Exciting! If you've got time, some benchmarks of the tensor cores would be great; I bet lots of people are interested in seeing deep-learning performance.



    • #3

      Originally posted by phoronix View Post
      As shown by the performance-per-dollar metrics,


      • #4
        That level of performance makes every AMD aficionado weep.
        That level of pricing makes everybody else weep.


        • #5
          Originally posted by bug77 View Post
          That level of performance makes every AMD aficionado weep.
          Not really. I couldn't use the card with a reasonable driver stack anyway.


          • #6
            Most if not all reviewers aren't impressed with this product launch. The 2080 is pretty much a 1080 Ti in terms of performance, whereas the 2080 Ti does offer some performance gains, but at a much higher price. The best-case scenario is a used 1080 Ti (or less), or an AMD card if FreeSync is needed.


            • #7
              Man I hope the RTX cores turn out to be as useful as Nvidia is promoting them to be.


              • #8
                Pretty underwhelming card. This is another GeForce 3 or GeForce 5 launch, for those who remember the situation 17 years ago... Good cards with features that will eventually become mainstream, but really expensive at launch, with few titles taking advantage of the new features. By the time ray-tracing effects become common, better architectures at better prices will have reached the market, so it makes no sense to pay through the nose for a 2xxx-series card, unless you are rich and don't care about money, or you are a developer who wants to experiment with the new technology.

                In any case, I am a Linux user, so Nvidia is not an option for me. I don't support companies hostile to open source. I don't have a need for 4K AAA gaming either; if I did, I would be using Windows 10 anyway...


                • #9
                  Performance is pretty underwhelming, with only a ~30% improvement on both Windows and GNU/Linux. Price-to-performance is pretty meh. Most reviewers seem to share this feeling and are telling their audience to shy away from upgrading to these new cards for now.

                  Looking forward to seeing some compute and rendering benchmarks, though. I want to see how well those RT cores help in rendering scenes in Blender, and how well the tensor cores do in AI/deep-learning applications. Then the price might be justifiable for those who want a cheaper Titan V alternative (like me).
                  Last edited by OMTDesign; 19 September 2018, 07:07 PM. Reason: Looked at some more RTX reviews.


                  • #10
                    The prices are high, but they do make sense. I think the RTX 2080 Ti should be considered the replacement for the Titan Xp, not the GTX 1080 Ti. I also think NVIDIA will reclaim the Titan brand for its original purpose, namely top-echelon Tesla-series silicon in prosumer guise. The difference between the Quadro and Titan series will be that Quadro gets targeted at workstations for CAD etc., and Titan at developers of applications intended to run on Tesla accelerators in servers.