NVIDIA Announces Turing-Based Quadro RTX GPUs As The "World's First Ray-Tracing GPU"

  • NVIDIA Announces Turing-Based Quadro RTX GPUs As The "World's First Ray-Tracing GPU"

    Phoronix: NVIDIA Announces Turing-Based Quadro RTX GPUs As The "World's First Ray-Tracing GPU"

    This morning AMD announced the Vega-based Radeon Pro WX 8200 graphics card as the "best workstation GPU under $1,000 USD" while tonight NVIDIA is trying to steal the thunder by announcing the Quadro RTX series as the "world's first ray-tracing GPU" that is also based on their new Turing architecture...


  • #2
    32MB of memory
    32 whole megabytes



    • #3
      Seems like NVLink is a major disadvantage after all...

      Originally posted by phoronix View Post
      The NVIDIA Quadro RTX 5000 packs 16GB of GDDR6 video memory or 32MB of memory with NVLink.
      Last edited by tildearrow; 13 August 2018, 09:06 PM.

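      Presumably the article's figure should read 32 GB rather than 32 MB: each Quadro RTX 5000 carries 16 GB of GDDR6, and NVLink lets a pair of boards pool their frame buffers, i.e.

          2 boards × 16 GB = 32 GB addressable over NVLink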


      • #4
        Exciting new stuff from nvidia?
        Linux users, RUN! Far, far away!



        • #5
          Well, NVIDIA seems to conveniently forget PowerVR, whose ray-tracing GPUs have been around for quite a while.



          • #6
            Originally posted by blacknova View Post
            Well, NVIDIA seems to conveniently forget PowerVR, whose ray-tracing GPUs have been around for quite a while.
            True, but that never gained much traction, did it? With Nvidia, a major player is pushing this, so it might have more success. Unfortunately, I suspect this will be another vendor-centric technology. Some developers will use it, but as long as consoles don't have it, wide adoption might be tricky. It's no fun to put lots of effort into developing something cool that only a handful of people can use.



            • #7
              Originally posted by GruenSein View Post
              True, but that never gained much traction, did it?
              Imagination did sell PCIe-based raytracing accelerator cards for pro users. I have no idea how well they sold.

              Before that, there was at least one prior raytracing hardware vendor, with products on the market, but I don't know if that's the same company bought by Imagination.

              Originally posted by GruenSein View Post
              I suspect this will be another vendor-centric technology.
              MS has already added support for it in DirectX:

              [Linked article excerpt: "If you just want to see what DirectX Raytracing can do for gaming, check out the videos from Epic, Futuremark and EA SEED. To learn about the magic behind the curtain, keep reading."]
              Last edited by coder; 14 August 2018, 03:10 AM.
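
              As a rough sketch of what that DXR support looks like from the application side (assuming a D3D12 device has already been created, using the option/tier names from the public d3d12.h headers, and omitting error handling), an engine can first ask the driver whether raytracing is exposed at all:

              ```cpp
              #include <windows.h>
              #include <d3d12.h>

              // Query whether this device/driver combination exposes DirectX Raytracing (DXR).
              // `device` is assumed to be a valid ID3D12Device* created elsewhere.
              bool SupportsRaytracing(ID3D12Device* device)
              {
                  D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
                  if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                         &options5, sizeof(options5))))
                      return false;

                  // TIER_NOT_SUPPORTED means no DXR; tier 1.0 and above expose the raytracing API.
                  return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
              }
              ```

              On GPUs or drivers that predate DXR, the query simply reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED and the application keeps its rasterization path.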



              • #8
                Any idea why they went for GDDR6 rather than HBM2?



                • #9
                  Originally posted by FireBurn View Post
                  Any idea why they went for GDDR6 rather than HBM2?
                  I don't know.
                  HBM is much more expensive than GDDR, but on the other hand HBM is also faster.
                  But Quadros are expensive, so I would have expected them to use HBM.
                  HBM is usually used for expensive cards, while GDDR is used for cheaper ones.



                  • #10
                    Originally posted by FireBurn View Post
                    Any idea why they went for GDDR6 rather than HBM2?
                    Cost. NVIDIA would have to sacrifice margins if they went the HBM route. Most game engines aren't especially sensitive to memory bandwidth anyway, so in the majority of cases little performance is lost by NVIDIA sticking with GDDR.
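
                    For a rough sense of the bandwidth side of that trade-off (typical configurations, not figures from the article), peak memory bandwidth is roughly the bus width in bytes multiplied by the per-pin data rate:

                        256-bit GDDR6 @ 14 Gb/s:   (256 / 8) × 14 ≈ 448 GB/s
                        4096-bit HBM2 @ 2 Gb/s:    (4096 / 8) × 2 ≈ 1024 GB/s

                    HBM2 can roughly double the bandwidth, but the stacked DRAM and silicon interposer it needs are exactly where the extra cost comes from.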

