NVIDIA Announces The GeForce RTX 50 "Blackwell" Series

  • kurkosdr
    Senior Member
    • Jul 2013
    • 163

    #61
    Originally posted by qarium View Post

    Do we even need this raster performance in the age of machine-learning-based 3D graphics engines?

    You could use a 640x480 raster input for the AI graphics engine and the result would be 4K with full raytraced lighting and high-quality AI-generated textures...
    You can't read information that isn't there. You can fill it in with arbitrary stuff (which is what AI does), but should graphics go down that route? What fidelity do you have in this scenario?
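
    Just to put rough numbers on that fidelity question (a back-of-the-envelope sketch using the resolutions from the quoted post; nothing here is specific to any actual upscaler):

    ```python
    # Rough pixel budget: how much of a 4K frame can come from a 640x480 source?
    src_w, src_h = 640, 480      # low-resolution raster input (from the quoted post)
    dst_w, dst_h = 3840, 2160    # 4K UHD output

    src_pixels = src_w * src_h   # 307,200
    dst_pixels = dst_w * dst_h   # 8,294,400

    print(f"Upscale factor: {dst_pixels / src_pixels:.1f}x")                            # 27.0x
    print(f"Share of output pixels to be invented: {1 - src_pixels / dst_pixels:.1%}")  # ~96.3%
    ```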

    Comment

    • creative
      Senior Member
      • Mar 2017
      • 870

      #62
      Originally posted by qarium View Post

      First of all, I agree with you: DLSS and RTX are overrated if, in the end, you play without them to avoid their downsides.

      Now I have a question: what do you think about ML-based FSR4? Upscaling, temporal anti-aliasing,
      and even frame generation. And in FSR4 it is no longer a filter algorithm running on shaders; it is machine learning running on AI units. Isn't that just copying Nvidia's DLSS 3/4?

      Isn't AMD Radeon Anti-Lag 2 a copy of Nvidia's anti-lag tech?

      And with RDNA4 it looks like AMD did not improve rasterization much; instead AMD spent the transistors on improving raytracing performance...

      Isn't it foolish to praise AMD and Intel if AMD and Intel copy Nvidia in nearly every aspect?

      Yes, the price is lower, meaning better performance per dollar, but they ride the DLSS and raytracing hype just like Nvidia.

      And keep in mind I have an AMD PRO W7900; I paid double the price rather than go out and buy an Nvidia RTX 4090.

      And I will buy the AMD Radeon 9070 XT as well, for the neighbor's children; I manage the IT hardware for them.

      So I will not buy Nvidia, but it really sounds foolish to blame DLSS and RTX at a time when AMD and Intel are copying DLSS and RTX...
      It's sort of like the cellphone market: Apple is the golden child and every Android manufacturer follows in its pitiful wake, so now all Android phones are missing 3.5mm audio jacks and use USB-C. I think it's stupid that every company follows like that; it's an admission of self-perceived inferiority. Oh well, Samsung sucks anyway, now Motorola is 'the brand' again.

      Having owned at least two generations of RTX, I can honestly say the RTX technologies are not killer must-haves; instead I'd say they are features we shouldn't have to have. Let me reiterate: we shouldn't have to have RTX technologies. NVIDIA has become an enabler of sloppy game development and consistently bad game design.
      Last edited by creative; 09 January 2025, 09:41 AM.

      Comment

      • ssokolow
        Senior Member
        • Nov 2013
        • 5096

        #63
        Originally posted by creative View Post
        It's sort of like the cellphone market: Apple is the golden child and every Android manufacturer follows in its pitiful wake, so now all Android phones are missing 3.5mm audio jacks and use USB-C. I think it's stupid that every company follows like that; it's an admission of self-perceived inferiority. Oh well, Samsung sucks anyway, now Motorola is 'the brand' again.
        Apple isn't at fault for USB-C. Blame the EU's law on reducing e-waste by ensuring interoperable chargers and charging cables for that.

        Same reason Apple switched to USB-C for iPhone 15 and up. The exception they managed to wrangle ran out.

        Comment

        • creative
          Senior Member
          • Mar 2017
          • 870

          #64
          ssokolow Oh yeah you're right, I forgot about that.

          Comment

          • rmfx
            Senior Member
            • Jan 2019
            • 757

            #65
            Soon to be in the hands of scalpers...
            When I bought my 4090, I had to go through a crazy process to get a Founders Edition at the official price.
            That was ridiculous.

            Comment

            • qarium
              Senior Member
              • Nov 2008
              • 3438

              #66
              Originally posted by ssokolow View Post
              What you wrote, as phrased, only makes sense if you missed the "far away" in "What we need is more cutting edge Fabs, preferably far away from China." and interpreted it as "preferably sourced from China" instead of the intended "preferably NOT sourced from China".
              ...well, unless you felt such a strong need to rant about China that you spat out an irrelevant ramble when all that was needed was "Don't worry. China is behind and shows no signs of catching up for a long time."
              The fear that a product comes from China and should not come from China, or should not even come from a country like Taiwan, which is near China, is nonsense.

              The China fear scam is completely stupid nonsense.
              The only reason we do not get cheap 2nm, 3nm, 4nm, or 5nm chips is that there is no competition.

              If China had such technology, they would flood the market with dirt-cheap chips.

              But the best they have is a crude 5nm DUV quadruple-patterning process that can only produce very small chips, with yield rates below 40%, which makes it total garbage: the costs are very high, and quadruple patterning is also very slow to produce with...
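
              (For what it's worth, the reason low yield hurts cost so much is the standard cost-per-good-die relation; here is a minimal sketch with entirely made-up wafer numbers, only to show the shape of the effect:)

              ```python
              # Toy model: cost per good die = wafer cost / (dies per wafer * yield)
              def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
                  return wafer_cost / (dies_per_wafer * yield_rate)

              # Hypothetical, illustrative numbers only.
              wafer_cost = 10_000      # USD per wafer (made up)
              dies_per_wafer = 200     # small dies (made up)

              print(cost_per_good_die(wafer_cost, dies_per_wafer, 0.90))  # ~55.6 USD per good die
              print(cost_per_good_die(wafer_cost, dies_per_wafer, 0.40))  # 125.0 USD per good die
              ```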

              "Don't worry. China is behind and shows no signs of catching up for a long time."

              Exactly. China is so far behind in that tech sector that they need something like 10 years to catch up to the current state of fabrication nodes.

              But you fear that evil China will invade Taiwan, while Taiwan itself passes laws that forbid building any TSMC fab with the newest-generation node outside of Taiwan, meaning it is against the law for TSMC to build 2nm/3nm factories outside of Taiwan...

              This law alone shows they fear competition from the USA more than they fear China.
              Phantom circuit Sequence Reducer Dyslexia

              Comment

              • qarium
                Senior Member
                • Nov 2008
                • 3438

                #67
                Originally posted by theriddick View Post
                Many don't realize the performance matching is done with Frame Gen 4x, etc...
                Imagine having to be connected to a massive data farm in order to play a video game in the future, because that's how it processes your frames...
                Some people will actually believe this "Frame Gen 4x"...

                The real news is that AMD said they will backport FSR 4.0 to selected RDNA3 Radeon 7000 models...

                They say some weaker cards like the Radeon 7600 may not be able to accelerate FSR 4.0 because of the lack of compute power in their AI cores... these GPUs may have AI cores, but not enough for that tech.

                But cards like the 7900 XTX, 7900 XT, 7900 GRE, 7800 XT and so on will get FSR 4.0 support.

                And the image quality of FSR 4.0 compared to FSR 3.1 is a really impressive improvement.
                Phantom circuit Sequence Reducer Dyslexia

                Comment

                • qarium
                  Senior Member
                  • Nov 2008
                  • 3438

                  #68
                  Originally posted by kurkosdr View Post
                  You can't read information that isn't there. You can fill it in with arbitrary stuff (which is what AI does), but should graphics go down that route? What fidelity do you have in this scenario?
                  As I already said, these machine-learning graphics engines do exactly what you say: they "fill it in with arbitrary stuff",
                  but they are clearly tuned to look good...

                  "Should graphics go down that route?"

                  People honestly do not care, as long as the result looks very good...

                  This means the answer is yes... in the future we will have fake-raster game engines that output 1360x768,
                  and the machine-learning graphics engines will blow it up to 4K with raytracing-like lighting and high-quality AI-generated textures and effects...
                  Phantom circuit Sequence Reducer Dyslexia

                  Comment

                  • qarium
                    Senior Member
                    • Nov 2008
                    • 3438

                    #69
                    Originally posted by creative View Post
                    It's sort of like the cellphone market: Apple is the golden child and every Android manufacturer follows in its pitiful wake, so now all Android phones are missing 3.5mm audio jacks and use USB-C. I think it's stupid that every company follows like that; it's an admission of self-perceived inferiority. Oh well, Samsung sucks anyway, now Motorola is 'the brand' again.
                    Having owned at least two generations of RTX, I can honestly say the RTX technologies are not killer must-haves; instead I'd say they are features we shouldn't have to have. Let me reiterate: we shouldn't have to have RTX technologies. NVIDIA has become an enabler of sloppy game development and consistently bad game design.
                    Well... you are right... but see, I never bought any Nvidia RTX product, and it's not that I don't spend money on GPU hardware: my AMD PRO W7900 is much more expensive than a 4090 or even a 5090...

                    To be honest, I see people like you, who buy multiple RTX cards and then say it was not worth it, as victims of Nvidia's mind-control advertising.

                    See, AMD will backport FSR 4.0 to RDNA3 cards like my AMD PRO W7900, and in the long run the 48GB of VRAM will make this card more future-proof than a 4090. This is my opinion, because larger AI models only fit on 48GB VRAM cards and do not fit on 24GB VRAM cards.
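
                    (A rough sanity check of that VRAM point, using the usual weights-only estimate of parameters × bytes per parameter; the model sizes below are purely illustrative, and real inference adds KV-cache and framework overhead on top:)

                    ```python
                    # Weights-only VRAM estimate: params * bytes per param (ignores KV cache and overhead).
                    def weights_gib(params_billions, bytes_per_param):
                        return params_billions * 1e9 * bytes_per_param / 1024**3

                    # Illustrative model sizes, not specific products.
                    for params in (34, 70):
                        for label, bpp in (("fp16", 2), ("int4", 0.5)):
                            print(f"{params}B @ {label}: {weights_gib(params, bpp):.0f} GiB")
                    # A ~70B model at 4-bit lands around 33 GiB: inside 48 GB of VRAM, but well over 24 GB.
                    ```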
                    Phantom circuit Sequence Reducer Dyslexia

                    Comment

                    • creative
                      Senior Member
                      • Mar 2017
                      • 870

                      #70
                      Originally posted by qarium View Post
                      To be honest, I see people like you, who buy multiple RTX cards and then say it was not worth it, as victims of Nvidia's mind-control advertising.
                      "victims of Nvidia's mind-control advertising"

                      I usually don't gripe like that, not in that way. I gripe about how pricey the stuff is; I only just started complaining about their marketing this year, mainly because of CES 2025, which was hilarious.

                      Oh, it was worth it! It's just expensive as hell for what you get: really good performance and great support for legacy games as well. When I see a 7900 XTX running Dishonored 2 at 50 fewer fps under a 120 fps cap, that does say something; at least the ocean doesn't look buggy on a good number of those cards now. I've just not seen an RTX title that I thought was jaw-dropping. I'm not the hugest fan of DLSS when I think games could simply be designed better, though at least it's better than native TAA in a number of places. One of the reasons I have stuck with NVIDIA is that there are fewer graphics bugs in older games. I had a 6800 XT that ran terribly in a number of older titles compared to a GTX 1070, which is way, way slower.

                      Where the feeling of being a victim comes in is how they short people on VRAM. I've only had two RTX cards. The latest one has at least what I think is a decent enough amount of it. I had a 3070 for a bit over three years, so that wasn't actually that bad. Before that I had my 1070 for a little over four years, which was even better; that card was usually on par with or beat Maxwell's Titan X. Stuff has changed. I have been using NVIDIA cards since the GeForce 2 GTS, and before that I used 3dfx. NVIDIA was also the first GPU I had when starting out in GNU/Linux.

                      My thing about them is that they have shifted, but that is what technology does. Nothing stays the same; sometimes it's for the better, sometimes for the worse. AMD is looking better and better for sure, though.

                      You are also using a professional card, so that's a different story; when your job demands a certain architectural profile and set of workloads, it makes sense to buy such an expensive GPU for your work. Those actually seem to be the only GPUs Apple uses in their workstations now. I don't know much about the W line of AMD GPUs other than that they are for workstations and are pretty expensive.

                      I'm just a geek gamer. I will say this: it is nice to have dual encoders on a GPU now. Encoding is about as much work as my GPU does outside of gaming. Another reason for getting a GPU with more VRAM is that over the years I've slowly been doing more and more video editing, and while 8GB isn't bad, 16GB is a lot better.

                      Generally when I buy a GPU I have a set of criteria it needs to meet: my current resolution, possible future resolution, and how it will handle that resolution in a number of titles. I don't really buy a GPU to play the newest games for the most part, though that is a pleasant side effect. I mainly got an RTX 4070 Ti Super for its VRAM and its ability to handle 4K should I choose a 32-inch display in the future, for playing mostly older games plus a few newer ones here and there. I would not be surprised if this one lasts me notably longer than my 1070 did. Why, you might ask? I bought a tier above what I usually buy. I'm not impressed with newer games at all. It was cool seeing this thing chew through the new Indiana Jones game; I haven't played much of it, and aspects of it look nice, but I simply find it boring!

                      My biggest gripe is that the RTX 4070 Ti Super shouldn't be an $800 card; it should MSRP at around $600, which I feel is pretty realistic given how big NVIDIA's market share is.
                      Last edited by creative; 11 January 2025, 12:30 AM.

                      Comment
