NVIDIA Announces The GeForce RTX 50 "Blackwell" Series

  • ssokolow
    Senior Member
    • Nov 2013
    • 5132

    #11
    No thanks. I bought an RTX 3060 during Cyber Monday 2022 to replace my GTX 750, and it's already enough work managing the heat output from a 170W card that I tend to bandwidth-bottleneck at 125W according to nvidia-smi.
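
    For the curious, that wattage figure comes from polling nvidia-smi; a minimal sketch of such a loop (assuming nvidia-smi is on PATH and supports the standard power.draw query field):

    ```python
    # Minimal sketch: poll GPU board power draw via nvidia-smi once per second.
    import subprocess
    import time

    def gpu_power_watts() -> float:
        """Return the current board power draw in watts, as reported by nvidia-smi."""
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return float(out.stdout.strip().splitlines()[0])

    if __name__ == "__main__":
        while True:
            print(f"{gpu_power_watts():.1f} W")
            time.sleep(1)
    ```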

    Hell, aside from the odd emulator like PCSX2 for my PS2 with a dead optical drive, my repurposed 2012 HP prebuilt with a Radeon HD 5870 from 2009 as a "game console except not a console" does everything I want.

    Literally the only thing they could tempt me with to upgrade in less than another 10 years is a comparably performant, comparably sub-$400 CAD option with sufficiently more than the 12GiB of VRAM this card has. (Maybe 30GiB. I think that's the requirement for that experimental ML model for inferring 3D models from 2D images.)

    Comment

    • avis
      Senior Member
      • Dec 2022
      • 2277

      #12
      NVIDIA's Blackwell is mental:

      [embedded YouTube videos]

      And then there's Project Digits:

      Comment

      • Gabbb
        Phoronix Member
        • Aug 2023
        • 96

        #13
        If my current GPU dies, maybe thanks to Blackwell I'll get better stuff for less $ on the used market.

        Comment

        • Volta
          Senior Member
          • Apr 2019
          • 2314

          #14
          Trash from Nvidia ruined the gaming industry with crap like DLSS and RTX. Oh, and there's path tracing to make them look even more stupid. Thankfully there's AMD and Intel.

          Comment

          • Inopia
            Senior Member
            • Aug 2015
            • 127

            #15
            Cool how they mention how much VRAM these cards will have...

            Comment

            • WannaBeOCer
              Senior Member
              • Jun 2020
              • 309

              #16
              Originally posted by sophisticles View Post

              To put these prices into perspective, the GeForce2 Ultra launched in 2000 at a price of $500, which is about $916 today.

              The GeForce3 launched in 2001 at a price of $500, which is about $891 today.

              Both of these cards were the top-of-the-line offering at the time.

              The RTX 4090 launched at $1600 two years ago.

              The RTX 5070's price of $549 for a card that can match an RTX 4090 is not bad at all.

              Of course I would never spend this much money on a video card unless I was using it to make money, so...
              The GeForce 8800 Ultra launched at $830 in 2007, which is $1,259.25 today. This isn't new; if Nvidia has the door wide open, they'll price their halo product at whatever they can. The Titan RTX at $2,500 is an example. The RTX 4090 was pretty much a Titan card with its 24GB of VRAM; you'll see them in plenty of AI workstations, and it won't be a surprise to see the RTX 5090 in them either.
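
              The inflation-adjusted figures above are just CPI scaling; a rough sketch of the arithmetic, with the CPI index values being approximations rather than official data:

              ```python
              # Rough sketch of the CPI scaling behind the quoted launch-price comparisons.
              # Index values are approximate assumptions, not official BLS figures.
              CPI = {2000: 172.2, 2001: 177.1, 2007: 207.3, 2024: 313.7}  # US CPI-U annual averages, approx.

              def adjust(price: float, from_year: int, to_year: int = 2024) -> float:
                  """Scale a launch price by the ratio of CPI index values."""
                  return price * CPI[to_year] / CPI[from_year]

              print(round(adjust(500, 2000)))  # GeForce2 Ultra: ~911
              print(round(adjust(500, 2001)))  # GeForce3: ~886
              print(round(adjust(830, 2007)))  # 8800 Ultra: ~1256
              ```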

              Also, remember that Jensen Huang said the RTX 5070 wouldn't be able to match the performance of the RTX 4090 without AI, meaning DLSS 4 Multi Frame Generation is the reason it can compete with the RTX 4090: https://www.nvidia.com/en-us/geforce...i-innovations/
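
              To put rough numbers on what "with AI" means there: with Multi Frame Generation the card still only renders a fraction of the frames it displays. A toy calculation, with the figures purely illustrative:

              ```python
              # Toy arithmetic: how multi frame generation inflates the displayed frame rate.
              # The numbers below are illustrative assumptions, not benchmark results.
              rendered_fps = 30            # frames the GPU actually renders each second
              generated_per_rendered = 3   # DLSS 4 MFG inserts up to 3 AI-generated frames per rendered frame

              displayed_fps = rendered_fps * (1 + generated_per_rendered)
              print(displayed_fps)         # 120 frames shown per second, while input is still sampled at the rendered rate
              ```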

              Comment

              • mrg666
                Senior Member
                • Mar 2023
                • 1105

                #17
                I would buy a new GPU at these prices for AI or parallel computing only. A $300 GPU (RX7600XT or A770) is enough to play all games otherwise.

                Nvidia's prices are ridiculous, and skimping on RAM even at those inflated prices shows their greed, like they did with the 4060. The way they infuriatingly ignore open source is the other problem. I am glad I can ignore them too. Nothing from Nvidia!

                Comment

                • Daktyl198
                  Senior Member
                  • Jul 2013
                  • 1593

                  #18
                  I'm going to have to wait for third-party benchmarks. Performance per watt is going to be a big one for me; I'm thinking they're pulling the same trick they pulled a few generations ago, where they pump more power into an existing generation. And using price comparisons with older cards to say "Nvidia has priced this way forever" is just not how it works. Technology is getting cheaper all the time. Intel is putting out a card that performs as well as the 4070 for almost half the cost. There's no way in hell that the 4090 and 5090 cost anywhere near that much to manufacture.

                  Nvidia needs actual competition, not a puppy like AMD chasing at their heels in both performance and price.

                  Comment

                  • ms178
                    Senior Member
                    • Sep 2018
                    • 1713

                    #19
                    People are getting way too hyped up over inflated numbers that were achieved with the help of frame generation, in specific circumstances, in selected titles. Let's wait for third-party reviews, see how many games actually support the new cool tech, and see what street prices the cards launch at; then we will have all the relevant data to evaluate these cards. The same is true for RDNA4. AMD's near silence about it stunned me. I hope they will finally be more aggressive on pricing to make a splash, or people will simply buy the 5070 instead.
                    Last edited by ms178; 07 January 2025, 06:52 AM.

                    Comment

                    • ElderSnake
                      Senior Member
                      • Apr 2012
                      • 305

                      #20
                      How do they generate so many extra frames without hugely affecting input latency?

                      Comment
