GeForce RTX 2080 Ti Linux Benchmarks Coming Today, NVIDIA Driver Bringing Vulkan RTX

  • GeForce RTX 2080 Ti Linux Benchmarks Coming Today, NVIDIA Driver Bringing Vulkan RTX

    Phoronix: GeForce RTX 2080 Ti Linux Benchmarks Coming Today, NVIDIA Driver Bringing Vulkan RTX

NVIDIA's review/performance embargo has now lifted on the GeForce RTX 2080 series ahead of the cards shipping tomorrow. I should have initial Linux benchmarks out later today, assuming Linux driver availability...

    http://www.phoronix.com/scan.php?pag...-2080-Ti-Today

  • #2
    nouveau's reverse engineering toolkit envytools has started receiving patches for preliminary Turing support.



    • #3
I'm looking at a Windows review on YouTube; it is about 20-30% (depending on the game) faster than the previous generation. BUT, thanks to the large amount of money NVIDIA is asking for the cards, they are far worse on the FPS/dollar metric. In fact, they are worse than the worst AMD GPU in that regard. Yikes.
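The FPS/dollar comparison above is simple arithmetic, sketched below. The prices and frame rates are placeholder values for illustration only, not real benchmark numbers or actual MSRPs.

```python
# Illustrative sketch of the FPS/dollar metric discussed above.
# All prices and FPS figures are placeholders, NOT real benchmark data.
cards = {
    "RTX 2080 Ti": {"price": 1199, "fps": 120},  # hypothetical numbers
    "GTX 1080 Ti": {"price": 699, "fps": 95},    # hypothetical numbers
}

def fps_per_dollar(card):
    """Frames per second delivered per dollar of purchase price."""
    return card["fps"] / card["price"]

for name, card in cards.items():
    print(f"{name}: {fps_per_dollar(card):.3f} FPS/$")
```

With numbers like these, the newer card wins on raw FPS but still loses on FPS/dollar, which is the point being made.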



      • #4
So the GDDR5 vs. GDDR6 benchmarks begin.



        • #5
Wait, what? The Vulkan RTX extensions are coming with the initial Turing drivers? I thought those weren't coming until some point weeks/months down the line.

I do hope that this was just me making some incorrect assumptions from incomplete information, rather than those extensions being just early drafts and/or NVIDIA's driver devs being subject to a brutal death march* to deliver on time.


*A so-called "death march", also known as crunch in the game industry, is when engineers are forced to work really long hours (think 80+ hour work weeks) for weeks to months on end in order to finish something. Death marches are a rather common and infamous part of the games industry, and at least used to be at Apple when Steve Jobs still ran the company.
          "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."



          • #6
What about DLSS? How can we test that under Linux?

Also, the 2080 Ti has GDRR5x; is this a bad thing? Why not GDDR6?



            • #7
              Originally posted by theriddick View Post
What about DLSS? How can we test that under Linux?

Also, the 2080 Ti has GDRR5x; is this a bad thing? Why not GDDR6?
              https://www.nvidia.com/de-de/geforce...2080-ti/#specs ... for me that reads GDDR6



              • #8
OK, well, some reviewers were saying the Ti had GDDR5X...



                • #9
                  Originally posted by theriddick View Post
What about DLSS? How can we test that under Linux?

Also, the 2080 Ti has GDRR5x; is this a bad thing? Why not GDDR6?
DLSS for now requires GeForce Experience on Windows, and requires specific application support. Not sure how, or if, that will make it to Linux. Perhaps it could be set up with NVIDIA's FSAA environment variables by pointing the driver at a trained model data file, but I'm doubtful about this for now.



                  • #10
Michael,

A few requests: can you do some benchmarks with John the Ripper, hashcat, and other OpenCL workloads? More specifically, under Linux with the same CPU/motherboard, comparing the 2080 to the 1080.

Unlikely to happen, but if you can test multiple cards at once, it would be worth noting whether per-card efficiency drops when there are multiples.

I would also like to see not just hash rate, but power consumption and heat dissipation.
