Originally posted by sobrus View Post
NVIDIA Announces The GeForce RTX 50 "Blackwell" Series
While we haven't seen benchmarks for a large sample of games with Blackwell, so far I'm feeling it's kind of meh. I don't know whether NVIDIA is going to keep neutering their architecture's raw rasterized performance going forward. RT and RTGI can be appreciated in certain respects, but in my limited experience the whole frame generation thing doesn't seem all that great. I think special texture compression techniques are a good idea for GPUs with limited VRAM, but they won't really solve anything unless they're automatic within the architecture, so that legacy games with massive texture packs also benefit on limited-VRAM GPUs.
New GPU technologies aren't going to correct bad game design and development, that's for sure. Stuff like UE5 isn't helping matters with its widespread adoption; even consoles suffer traversal stutter and horrible TAA implementations. Some DLSS implementations look better than native in a number of games with bad TAA, but I've seen bad implementations of DLSS as well; the Dead Space remake is a good example.
When I revisit an old console game ported to PC from 2009, F.E.A.R. 2: Project Origin, with its highly stylized graphics that look very smooth, well balanced, and detailed, the weapon feedback and overall game design just feel refreshing compared to modern titles, and it shows how much games have regressed. F.E.A.R. 2: Project Origin gets harped on within the series, yet playing it just feels excellent: a very cohesive sneak shooter that is both very responsive and well designed compared to a lot of what's out there now. I'm very impressed with how well the game has aged while still looking modern.
Last edited by creative; 07 January 2025, 04:46 PM.
Comment
Originally posted by bnolsen View Post
Compared to the RX 5700 XT cards I have, the 12GB 3060 runs fairly cold. I really dislike NVIDIA, but their lower-end cards seem to have much better perf/watt than what AMD is offering.
Comment
Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
The 5090 Founders Edition goes back to a dual-slot design, which is pretty impressive given the performance.
Comment
Originally posted by sophisticles View Post
Same in the U.S.
Back in the '90s, before the tech bubble burst, Java programmers were commanding 90 grand a year; a Porsche 944 Turbo cost 45 grand, a Porsche 911 Turbo cost 60 grand, and a Ferrari Mondial cost 70 grand.
The days when a computer programmer could buy two brand new Porsches a year in cash are long gone.
Comment
As someone here said, don't fall for NVIDIA marketing. The 5070 will have about half the raw performance of the 4090 and is only "on par" with it thanks to three inserted fake DLSS 4 frames instead of the one for the 4090, at the cost of increased input latency (see the rough arithmetic sketch below).
Also, some of the comments on the latest Digital Foundry NVIDIA video piece are hilarious:
"It's amazing how all it takes for DF to finally recognize the cons of DLSS is for another new shiny DLSS tech to release. Yesterday DLSS and TAA was perfect, ghosting was created today, obviously"
Comment
Originally posted by reavertm View Post
As someone here said, don't fall for NVIDIA marketing. The 5070 will have about half the raw performance of the 4090 and is only "on par" with it thanks to three inserted fake DLSS 4 frames instead of the one for the 4090, at the cost of increased input latency.
Also, some of the comments on the latest Digital Foundry NVIDIA video piece are hilarious:
"It's amazing how all it takes for DF to finally recognize the cons of DLSS is for another new shiny DLSS tech to release. Yesterday DLSS and TAA was perfect, ghosting was created today, obviously"
Going to be honest as an RTX 4070 Ti Super owner: I've actually grown more impressed with what AMD is doing. The 7900 XT is impressive in raw raster and in its improvements at 4K. The card I have is still a good GPU, but if AMD keeps this up, NVIDIA is going to have a lot to contend with. I personally feel NVIDIA is starting to neuter their GPUs' raw performance and focus too much on AI. AI has merit for gaming GPUs, but I feel NVIDIA is leveraging it as a catch-all for bad game design and lazy development. I'd rather see modern games criticized more for poor art direction and mismanagement of overall game design, including politics that should be kept out of games.
What's kept me from AMD GPUs is a personal bad experience with one; luck of the draw. For some people there are still driver issues, enough to be cautious about buying from AMD. That said, even with NVIDIA I found myself returning a Gigabyte RTX 4070 Ti Super over very strange issues interacting with not one but two high-end PSUs, first a Corsair RM850x and then a Corsair RM1000x. That GPU also developed screeching coil whine, the type that had me thinking it was actually coming through my speakers. Needless to say, that GPU got exchanged for an ASUS one when all was said and done.
At this rate it's going to become very difficult to pass up AMD, because they are offering more for less and have improved generation over generation. If they can fix the driver timeouts, and if an nvidia-settings-like application gets developed for AMD GPUs on GNU/Linux, AMD will be a home run, given its already better support across multiple desktop environments with multi-head display setups. NVIDIA seems more limited there, mostly to GNOME from what I've gathered, though I may be wrong. I've only used one display at a time, but I've heard plenty of people complain about NVIDIA's bad multi-monitor support on GNU/Linux; that could be overblown and not the real situation for a good number of users, though.
AMD seems to have idle power consumption issues that need end-user intervention; this should have been resolved by now. I've seen at least two generations of complaints about that particular issue, but if it really is an easy end-user fix, I guess it's not too terrible.
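For what it's worth, the end-user intervention people describe usually means poking at amdgpu's sysfs power interface. Below is a minimal sketch of that, assuming a single GPU exposed as card0; the power_dpm_force_performance_level node is real amdgpu sysfs, but whether forcing it helps any particular idle-power case, and at what cost to performance, is an assumption rather than a recommendation.
Code:
# Minimal sketch: inspect (and optionally force) the amdgpu DPM level via sysfs.
# Assumes the GPU is card0; writing requires root and pins clocks until reset to "auto".
from pathlib import Path

dpm_node = Path("/sys/class/drm/card0/device/power_dpm_force_performance_level")

# Read the current performance level ("auto", "low", "high", "manual", ...).
print("current DPM level:", dpm_node.read_text().strip())

# Uncomment to force the lowest clocks (a blunt idle-power workaround that hurts 3D perf):
# dpm_node.write_text("low")
# ...and to hand control back to the driver afterwards:
# dpm_node.write_text("auto")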
Both AMD and NVIDIA GPUs have their issues, but CES was some outright deceptive marketing, even with the "thanks to DLSS 4" caveat. Why? There is no way in hell a 5070 could even remotely touch a 4090 in raw raster performance. At best it's closing in on a mere RTX 4070 Ti Super, and even then the 5070 has a shit 12GB of VRAM. My ask of modern GPUs: if the card is $400 or more, give the damn thing at least 16GB of VRAM. I don't care how fast GDDR7 is, it's not going to make up for only having 12GB. Remember when the GTX 1070 matched or beat the raw performance of the previous generation's Titan X? That's no longer the case for 70-class cards.
Had my RTX 3070 had more VRAM I would have kept it longer; it was a very decent card in terms of performance that got hamstrung by 8GB.
Last edited by creative; 09 January 2025, 09:02 AM.
Comment
Originally posted by mdedetrich View Post
At least if you want to put reasonable limits on price and power usage, we are starting to hit diminishing returns for raw raster performance. This is due to a combination of hitting physical limits on chip density (newer nodes are approaching the size of an atom) and the price of these wafers, which as pointed out earlier are magnitudes more expensive even when inflation adjusted.
Jensen is not wrong here; we have to accept that it's not like the early 2000s, when we were getting massive jumps in raster performance generation to generation at the same cost/power draw. Those days are over.
Comment