NVIDIA Announces The GeForce RTX 50 "Blackwell" Series

  • sobrus
    Senior Member
    • Apr 2021
    • 191

    #31
    Originally posted by kurkosdr View Post
    Can the GTX 880M GPU in my old laptop also become an RTX 4090 with enough DLSS? Happy to render at 640x360 resolution and 10 frames per second, since DLSS will fix it all and make it look like 4K 120fps.
    Exactly my thoughts. But last time they advertised the 40 series as being 3x faster when it was 3% faster. So there IS an improvement...

    Moreover, the 5080 (a $1,000 card) will reportedly have just 16GB of memory - the same amount as the $600 Radeon RX 6800 back in 2020.
    That is far more concerning, but on the other hand it means my old 6800 XT will probably serve me a few more years.
    Last edited by sobrus; 07 January 2025, 09:46 AM.

    Comment

    • pWe00Iri3e7Z9lHOX2Qx
      Senior Member
      • Jul 2020
      • 1607

      #32
      Originally posted by Paradigm Shifter View Post
      It's all about the power draw for me, both at home and at work. I've got a limited power budget to work with for both. I don't need 24GB of VRAM for games so a 5070 or 5070Ti might be appealing (depending on markup here), but for work I'm stuck firmly in the "need 48GB+ GPUs" niche. And dual-slot is non-negotiable: When I have systems I need to stuff 10 GPUs in, quad-slot monsters are not an option. Let's see what the prices are for the Quadros.
      The 5090 Founders Edition goes back to a dual-slot design, which is pretty impressive given the performance.

      Comment

      • mdedetrich
        Senior Member
        • Nov 2019
        • 2550

        #33
        Originally posted by kurkosdr View Post

        How long until NVIDIA stops mentioning DLSS-free numbers (we are almost there already)? And how long until you can't turn off DLSS at all, so that independent reviewers can't measure DLSS-free numbers?
        At least if you want to keep price and power usage within reasonable limits, we are hitting diminishing returns in raw raster performance. That's down to a combination of physical limits on chip density (feature sizes on newer nodes are approaching the scale of individual atoms) and the price of the wafers themselves, which, as pointed out earlier, are orders of magnitude more expensive even when adjusted for inflation.

        Jensen is not wrong here. We have to accept that it's not like the early 2000s, when we got massive jumps in raster performance generation to generation at the same cost and power draw; those days are over.
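
        To put rough numbers on the wafer point, here is a back-of-envelope sketch; the die size and wafer prices are illustrative assumptions (in the ballpark of commonly cited figures, not TSMC quotes), and the dies-per-wafer formula is the usual edge-loss approximation:

        Code:
        import math

        def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
            # usable wafer area minus an edge-loss correction
            r = wafer_diameter_mm / 2.0
            return int(math.pi * r * r / die_area_mm2
                       - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

        # ~600 mm^2 flagship-class die; wafer costs are assumptions, not quotes
        for node, wafer_cost in [("28nm-class (~2012)", 3000), ("5nm-class (~2022)", 16000)]:
            n = dies_per_wafer(600.0)
            print(f"{node}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} per die before yield")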

        Comment

        • bnolsen
          Senior Member
          • Mar 2008
          • 276

          #34
          Originally posted by ssokolow View Post
          No thanks. I bought an RTX 3060 during Cyber Monday 2022 to replace my GTX 750, and it's already enough work managing the heat output from a 170W card that tends to bottleneck on memory bandwidth at around 125W, according to nvidia-smi.
          Compared to the RX 5700 XT cards I have, the 12GB 3060 runs fairly cool. I really dislike NVIDIA, but their lower-end cards seem to offer much better perf/watt than what AMD is offering.

          The biggest problem I see is that game developers will use frame generation to cover for their laziness and further alienate those of us unwilling to be robbed by NVIDIA (and AMD, etc.).
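
          If you want to watch or cap the draw the way ssokolow describes, here is a minimal sketch using the nvidia-ml-py bindings (the same NVML interface nvidia-smi reads); the 125W figure is just the number from the quote above, and whether a card accepts a given cap depends on the range it reports:

          Code:
          import pynvml  # pip install nvidia-ml-py

          pynvml.nvmlInit()
          gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

          draw = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0           # mW -> W
          limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # mW -> W
          print(f"drawing {draw:.0f} W of a {limit:.0f} W cap")

          # Hold a 170W card to 125W (needs root; argument is in milliwatts):
          # pynvml.nvmlDeviceSetPowerManagementLimit(gpu, 125_000)

          pynvml.nvmlShutdown()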
          Last edited by bnolsen; 07 January 2025, 10:27 AM.

          Comment

          • creative
            Senior Member
            • Mar 2017
            • 870

            #35
            Originally posted by ms178 View Post
            People are getting way too hyped up over inflated numbers achieved with the help of frame generation, in specific circumstances, in selected titles. Let's wait for third-party reviews, see how many games actually support the new tech, and see what street prices the cards launch at; then we will have all the relevant data to evaluate these cards. The same is true for RDNA4. AMD's near silence about it stunned me. I hope they will finally be more aggressive on pricing to make a splash. Otherwise people will simply buy the 5070 instead.
            A quick look at the Moore's Law Is Dead YouTube channel, Hardware Unboxed, etc. will tell you a different story. I think Blackwell at the moment is relying too much on AI for its performance. Only time will reveal what's on the table for potential buyers. Let's not forget the inflated pricing from AIBs, the scalper market, and AI hardware buy-ups.

            I imagine it will be close to two years before people can get AIB models at anything near MSRP - and that's non-Founders MSRP, if they're lucky. Most people won't be able to get Founders Editions, that's my guess. And that 4090-level performance will only apply to games that implement DLSS 4 on the RTX 5070. Has NVIDIA ironed out the input lag from frame generation? Time will tell, but I doubt it. Take CES with a full canister of salt.
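
            On the input lag point, here is a rough sketch of why frame generation inflates the fps counter without making a game feel faster; the latency model is my own simplification (interpolation has to hold back one real frame), not anything NVIDIA has published:

            Code:
            def frame_gen(rendered_fps, generated_per_real):
                """Return (displayed_fps, rough_input_latency_ms)."""
                frame_time_ms = 1000.0 / rendered_fps
                displayed_fps = rendered_fps * (1 + generated_per_real)
                # interpolation buffers one real frame, so responsiveness tracks
                # the rendered rate, not the displayed rate (ignoring Reflex etc.)
                latency_ms = 2 * frame_time_ms
                return displayed_fps, latency_ms

            # 30 fps rendered, 3 generated frames per real one (DLSS 4 style):
            print(frame_gen(30, 3))  # (120.0, ~66.7): "120 fps" that feels like 30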

            You're already on the same page about being realistic; I just thought I'd throw in my own take.
            Last edited by creative; 07 January 2025, 10:47 AM.

            Comment

            • middy
              Senior Member
              • Sep 2013
              • 232

              #36
              Originally posted by sophisticles View Post

              To put these prices into perspective, the GeForce2 Ultra launched in 2000 at a price of $500, which is about $916 today.

              The GeForce3 launched in 2001 at a price of $500, which is about $891 today.

              Both of these cards were the top-of-the-line offering at the time.

              The RTX 4090 launched at $1600 two years ago.

              The RTX 5070 price of $549, for a card that can match an RTX 4090, is not bad at all.

              Of course I would never spend this much money on a video card unless I was using it to make money, so...
              that "5070 matching a 5090" claim is only with dlss 4 (and ONLY works on 5000 series cards so at most, only a few games will support it at release, good luck expecting wide spread adoption anytime soon) activated. up-scaling to match native performance isn't anything to brag about, nor be happy about.

              These prices are pure garbage. I don't understand the mental gymnastics of coping with NVIDIA's greed here. A top-of-the-line product used to be $800-$900; round it up to $1,000, versus $2,000 today. And we don't even need to go back that far to begin with: just 10 years ago the GTX 970 was a whopping $270-$350, which is $364-$472 today, versus $549-$749 for the same tier (70 series) of card today. Absolute, pure garbage.
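
              For the record, here is the inflation arithmetic, with the multipliers back-derived from the figures quoted in this thread rather than taken from official CPI data:

              Code:
              def adjusted(price_then, multiplier):
                  """Inflation-adjust a launch price into today's dollars."""
                  return price_then * multiplier

              print(adjusted(500, 1.83))  # GeForce2 Ultra, 2000 -> ~$916
              print(adjusted(500, 1.78))  # GeForce3, 2001 -> ~$891
              print(adjusted(270, 1.35))  # GTX 970 low end, 2014 -> ~$364
              print(adjusted(350, 1.35))  # GTX 970 high end, 2014 -> ~$472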

              It doesn't matter what the "performance" is. What matters is how much the new generation costs at the same tier, and it's freakin' ridiculous. If "AI" is the cause of this, then nuke the data centers. Not worth it.
              Last edited by middy; 07 January 2025, 11:50 AM.

              Comment

              • MorrisS.
                Senior Member
                • Feb 2022
                • 656

                #37
                No word on entry-level NVIDIA cards so far.

                Comment

                • sophisticles
                  Senior Member
                  • Dec 2015
                  • 2617

                  #38
                  Originally posted by Paradigm Shifter View Post
                  And as I've no idea about salary inflation in the US, sophisticles; have salaries risen commensurate with the figures you quote? I know in the UK, EU and Japan that salaries have nowhere near kept in sync with the general rate of inflation, so while the "relative" prices might not seem so extreme, actual affordability has (relatively) dropped.
                  Same in the U.S.

                  Back in the '90s, before the tech bubble burst, Java programmers were commanding 90 grand a year, a Porsche 944 Turbo cost 45 grand, a Porsche 911 Turbo cost 60 grand, and a Ferrari Mondial cost 70 grand.

                  The days when a computer programmer could buy two brand new Porsches a year in cash are long gone.

                  Comment

                  • bachchain
                    Senior Member
                    • Jun 2016
                    • 403

                    #39
                    Don't worry. Before you know it, game devs will find new and innovative ways to invalidate any and all hardware improvements this two-thousand-dollar space heater brings, all while delivering worse graphics than ten years ago.

                    Comment

                    • cjcox
                      Senior Member
                      • Nov 2007
                      • 506

                      #40
                      Code:
                      If you're unhappy and you know it, send him cash.
                      If you're unhappy and you know it, send him cash.
                      If you're unhappy and you know it, your crappy GPU will show it.
                      If you're unhappy and you know it, send him cash.

                      Comment
