18-Way NVIDIA GPU Performance With Blender 2.90 Using OptiX + CUDA


  • #1

    Phoronix: 18-Way NVIDIA GPU Performance With Blender 2.90 Using OptiX + CUDA

    A few days ago I published a deep dive into CPU and GPU performance with Blender 2.90, a major update to this open-source 3D modeling software. Following that I kept testing more (and older) NVIDIA GPUs with the CUDA and OptiX back-end targets, to now have an 18-way comparison from Maxwell through Turing with the new Blender 2.90...

    http://www.phoronix.com/scan.php?pag...18-NVIDIA-GPUs

  • #2
    It would appear the 2070 Super is that nice tipping point into high performance. At $500 MSRP, it’ll be interesting to see how it fares in comparison to the 3070 at the same price point.

    Cheers,
    Mike



    • #3
      I'm slightly shocked that my (venerable? It's taken a heck of a beating over the last three years...) GTX 1070 does so poorly.

      The 2070 Super does seem to be well positioned if you don't need >8GB VRAM.



      • #4
        The report isn't quite complete: the "PERF-PER-DOLLAR" column in the very first "Multi-Way Comparison" table is left unfilled. That is arguably the most important part, since we generally assume that the more dollars spent, the better the expected performance.
        The standard journalistic background to the report might read: common to all 18 NVIDIA cards tested here are a standard 4K display, the same AMD Ryzen 9 CPU on an ASUS ROG motherboard, the NVIDIA 450.66 display driver, and an Ubuntu 20.04 GNOME operating system. It is assumed that the latest official drivers were applied by Ubuntu and ASUS. Those chasing the best benchmark results generally use the official "lowlatency" Linux kernels released by Ubuntu, which does not appear to be the case here.
        It is generally assumed that the open-source alternatives to the official NVIDIA drivers do not perform very well; this is not tested. It is also assumed that a Wayland session is not worth testing.
        We can also assume these comparison results will hold across other operating systems: the more dollars spent on NVIDIA hardware, the better the GPU performance, and deviating from the official NVIDIA drivers will lower it.
        Intel maintains its own heavily optimized Linux distribution (Clear Linux), which should produce similar results; Phoronix tests have shown that it also performs well on AMD CPUs. Whether Ubuntu running its official lowlatency kernel would match Intel's distribution remains unknown.



        • #5
          I can agree on perf-per-dollar, but what is the point of testing open-source drivers when the test explicitly targets CUDA and OptiX?
          AFAIK no open-source driver supports either of these with production quality, if at all.
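          For reference, the two back-ends being discussed are selectable from Blender's headless command line, which is how this kind of benchmarking is typically scripted. A minimal sketch, assuming a Blender 2.9x install on the proprietary NVIDIA driver ("scene.blend" is a placeholder file name, not one from the article):

          ```shell
          # Render frame 1 of a scene with Cycles, once per GPU back-end.
          # "scene.blend" is a placeholder; an RTX card is required for OPTIX.
          blender -b scene.blend -E CYCLES -f 1 -- --cycles-device CUDA
          blender -b scene.blend -E CYCLES -f 1 -- --cycles-device OPTIX
          ```
          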



          • #6
            gregzeng
            Why only Ubuntu? I'm running Debian testing/bullseye with linux-image-5.7.0-3-amd64 (version 5.7.17-1) and the NVIDIA 450.66 drivers. DE: MATE.



            • #7
              Originally posted by gregzeng View Post
              The report isn't quite complete: the "PERF-PER-DOLLAR" column in the very first "Multi-Way Comparison" table is left unfilled. That is arguably the most important part...
              You can fill in whatever values you want for perf-per-dollar in that table and see how the cards compare based on the prices you can actually get for those GPUs....
              Michael Larabel
              http://www.michaellarabel.com/
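              The perf-per-dollar math behind that table column is simple to reproduce offline. A minimal sketch; the throughput and price figures below are placeholders for illustration, not Phoronix's measured results (only the 2070 Super's $500 MSRP is mentioned in this thread):

              ```python
              def perf_per_dollar(samples_per_min: float, price_usd: float) -> float:
                  """Higher is better: benchmark throughput divided by purchase price."""
                  if price_usd <= 0:
                      raise ValueError("price must be positive")
                  return samples_per_min / price_usd

              # Placeholder figures, NOT the article's data; swap in real
              # OpenBenchmarking.org results and street prices as desired.
              cards = {
                  "RTX 2070 Super": (120.0, 500.0),   # $500 MSRP per post #2
                  "Hypothetical GPU": (150.0, 700.0),
              }
              ranked = sorted(cards.items(),
                              key=lambda kv: perf_per_dollar(*kv[1]),
                              reverse=True)
              for name, (perf, price) in ranked:
                  print(f"{name}: {perf_per_dollar(perf, price):.4f} samples/min per USD")
              ```

              A pricier card can easily rank lower here: with these placeholder numbers the hypothetical $700 card delivers more raw throughput but less throughput per dollar than the 2070 Super.
              
              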



              • #8
                Originally posted by blacknova View Post
                I can agree on perf-per-dollar, but what is the point of testing open-source drivers when the test explicitly targets CUDA and OptiX?
                AFAIK no open-source driver supports either of these with production quality, if at all.
                When you click through to the OpenBenchmarking.org result page, there is an area where you can fill in your own pricing values and see the resulting perf-per-dollar impact.
                Michael Larabel
                http://www.michaellarabel.com/
