NVIDIA GeForce RTX 3000 Series Launches With Impressive Specs, Competitive Pricing

  • #41
    I have not tried this, nor do I know anything about it, but I found it while searching around for GPU NVMe direct access.



    Looks like an interesting project.

    • #42
      AMD still can't fix simple reset bugs in their GPUs, and as a VFIO GPU passthrough user, Nvidia is the only option for me to use in my guests.

      • #43
        Comparing specs on Wikipedia, it looks like "normal" GPU performance from the 3070 is pretty similar to the AMD 5700XT, but overall compute power is up about double. I guess the main focus of the 30xx line really is ray tracing. It will be interesting to see real benchmarks.

        base/boost:
        276/318 GTexels - 96/111 GPixels - 17.66/20.37 TFLOPS (RTX3070)
        257/305 GTexels - 103/122 GPixels - 8.22/9.75 TFLOPS (5700XT)
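
        For reference, a quick sanity check of those ratios from the boost figures above (a throwaway Python snippet of my own, not from any actual benchmark):

        # Ratios of the quoted boost figures: RTX 3070 vs RX 5700 XT.
        rtx3070  = {"GTexels/s": 318, "GPixels/s": 111, "TFLOPS": 20.37}
        rx5700xt = {"GTexels/s": 305, "GPixels/s": 122, "TFLOPS": 9.75}
        for metric in rtx3070:
            print(f"{metric}: {rtx3070[metric] / rx5700xt[metric]:.2f}x")
        # GTexels/s: 1.04x, GPixels/s: 0.91x, TFLOPS: 2.09x
        # -> texture/pixel rates are roughly even, quoted FP32 compute is ~2x up.
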
        Last edited by HenryM; 02 September 2020, 12:37 AM.

        • #44
          Originally posted by mdedetrich View Post
          I wonder what AMD's response is going to be; at least when it comes to both gaming and HPC use cases, NVidia is still far ahead. Unlike AMD, NVidia is also releasing new technology, i.e. their IO system, which is basically the PC equivalent of what the PS5/Xbox Series X do in regards to streaming directly from high-performance SSDs to the GPU.
          If the Big Navi rumors are true, they should compete just fine with the 3080 on raster performance, while using less power. But I agree things like DLSS 3 and raytracing performance are likely to be big points in NVidia's favor, and they won't be competitive with the 3090 for people with money to burn.
          Last edited by smitty3268; 01 September 2020, 11:59 PM.

          • #45
            For my part, I don't really need AI in my GPU or vendor-locked ray tracing; I just need good open source Vulkan drivers, low TDP, and high performance for 1080p/60 fps gaming. AMD still looks better on these criteria.

            • #46
              Originally posted by dispat0r View Post

              We don't know what AMD has in store. Maybe that's why Nvidia's pricing isn't going higher. We will see.
              That's unfortunately half the problem. It always seems like they're months or years behind Nvidia. If I had waited on AMD instead of buying my GTX 1080, I would literally have been waiting years for the RX 5000 series to come out. Unless you're building top-tier, I would have recommended the 5700XT even over an RTX 2080. Until you realize the RX 5700XT came out barely a year ago and is about to be completely demolished by the next-gen GPUs.

              • #47
                Originally posted by piotrj3 View Post

                I think it might be the case. With CPU coolers it often happens that Noctua's top-tier coolers beat even great water coolers. Here we have a very special, complex design that takes three slots.
                Yup, Noctua CPU air coolers already compete with high-end water coolers and can in some cases beat them, depending on how bursty the CPU load is. Gamers Nexus did a very in-depth, scientific review of this, and for sudden temperature spikes air coolers are actually more efficient than water coolers.

                Originally posted by stormcrow View Post
                Yeah, we've been here before, and people will still buy them. Some gamers think having a kW PSU to run their GPU is a badge of honor.

                There was a UserFriendly strip from 2006, when NWN2 was released: to get the best graphics out of it you needed the highest-end, power-hungry GPUs on the market from nVidia.
                It would be helpful not to spread FUD. The 3xxx series has a completely redesigned cooling and thermal solution, and initial reviews by DigitalFoundry actually found that the card was very quiet. The only correct bit is that it does indeed use a lot of power.

                Originally posted by xcom View Post
                For my part, I don't really need AI in my GPU or vendor-locked ray tracing; I just need good open source Vulkan drivers, low TDP, and high performance for 1080p/60 fps gaming. AMD still looks better on these criteria.
                Ray tracing isn't vendor locked-in; it's also part of the Vulkan API, and AMD uses it as well for Radeon (see the consoles).

                Also, the AI on the GPU is used to make games look better, i.e. DLSS applies a deep learning neural network to improve image quality when upscaling to resolutions such as 4K. In fact, that's the main method of running games well at 4K: render the game at a lower resolution (such as 1080p) with expensive techniques such as ray tracing, then use DLSS to upscale.

                In some cases (Death Stranding) the DLSS upscaling looks better than rendering the game natively at 4K.
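
                To make that pipeline concrete, here is a toy sketch of the "render low, then upscale" idea (my own illustration, not NVidia's code; nearest-neighbour resizing stands in for DLSS's learned upscaler, which in reality also uses motion vectors and frame history):

                import numpy as np

                def render_frame(width, height):
                    # Pretend renderer: returns an RGB gradient at the requested resolution.
                    y, x = np.mgrid[0:height, 0:width]
                    return np.stack([(x * 255 // max(width - 1, 1)).astype(np.uint8),
                                     (y * 255 // max(height - 1, 1)).astype(np.uint8),
                                     np.full((height, width), 128, dtype=np.uint8)], axis=-1)

                def naive_upscale(frame, out_w, out_h):
                    # Crude spatial upscale: pick the nearest source pixel for each output pixel.
                    in_h, in_w, _ = frame.shape
                    rows = np.arange(out_h) * in_h // out_h
                    cols = np.arange(out_w) * in_w // out_w
                    return frame[rows][:, cols]

                low_res = render_frame(1920, 1080)            # expensive effects rendered at 1080p
                display = naive_upscale(low_res, 3840, 2160)  # cheap upscale to the 4K target
                print(low_res.shape, "->", display.shape)     # (1080, 1920, 3) -> (2160, 3840, 3)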

                Originally posted by smitty3268 View Post

                If the Big Navi rumors are true, they should compete just fine with the 3080 on raster performance, while using less power. But I agree things like DLSS 3 and raytracing performance are likely to be big points in NVidia's favor, and they won't be competitive with the 3090 for people with money to burn.
                Well yeah, that's the problem: even *if* they beat NVidia on raster performance (honestly it's highly likely that they will match raster performance with a bit less power draw, if anything), they are still far behind when it comes to ray tracing/DLSS 3 (and now the new IO/latency solutions).

                Even if the rasterization performance is greater, it's not going to beat the graphics of a game that uses rasterization + ray tracing + DLSS 3. For example, without DLSS 3, even if your rasterization is better, trying to render games at 4K with ray tracing will leave your actual frame rate a lot lower.
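
                Rough back-of-the-envelope math on that (my own simplification: it assumes frame time scales purely with pixel count and ignores ray tracing's extra cost):

                pixels_1080p = 1920 * 1080
                pixels_4k = 3840 * 2160
                fps_1080p = 60.0                               # hypothetical starting frame rate
                fps_4k = fps_1080p * pixels_1080p / pixels_4k  # ~15 fps under this naive scaling
                print(f"{pixels_4k / pixels_1080p:.0f}x pixels -> about {fps_4k:.0f} fps at native 4K")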

                The way I see it, the only disadvantage of these cards is that they draw more power, but without the typically associated downsides (noise/thermals). Given history, there is also a chance that AMD's cards could end up even louder/noisier despite using a better node; we will see.
                Last edited by mdedetrich; 02 September 2020, 03:52 AM.

                • #48
                  Originally posted by computerquip View Post

                  That's unfortunately half the problem. It always seems like they're months or years behind Nvidia. If I had waited on AMD instead of buying my GTX 1080, I would literally have been waiting years for the RX 5000 series to come out. Unless you're building top-tier, I would have recommended the 5700XT even over an RTX 2080. Until you realize the RX 5700XT came out barely a year ago and is about to be completely demolished by the next-gen GPUs.
                  I agree AMD's graphics division has been a little lackluster for the high-end crowd. The question is why Nvidia didn't increase their prices again, but that could also be related to the worldwide recession at the moment. I will be waiting for September to see if AMD announces something. I've also been using Nvidia for the last 10 years.

                  • #49
                    Originally posted by dispat0r View Post

                    I agree AMD's graphics division has been a little lackluster for the high-end crowd. The question is why Nvidia didn't increase their prices again, but that could also be related to the worldwide recession at the moment. I will be waiting for September to see if AMD announces something. I've also been using Nvidia for the last 10 years.
                    This is traditionally why AMD always attacked the mid-range GPUs. Unfortunately, that gap is now getting a lot smaller thanks to NVidia releasing techniques like DLSS, which lets mid-tier (or slightly above mid-tier, depending on your definition) graphics cards compete with what was traditionally reserved for Titan-level cards.
                    Last edited by mdedetrich; 02 September 2020, 03:07 AM.

                    • #50
                      Originally posted by rmfx View Post
                      The new gen is more than 2x better in price/performance than the previous one; what's ridiculous is your statement. It's more expensive than in 2015, but the dies are also bigger.
                      I don't give a crap about current-gen GPUs; I just want old-gen GPU prices to not stay constant for 5 years.
