AMD Announces Ryzen 7000 Series "Zen 4" Desktop CPUs - Linux Benchmarks To Come
Originally posted by Dukenukemx: Also, I don't care for FSR or DLSS as these technologies are faking graphics.
I just think it's funny how sanctimonious people get about DLSS. If it looks good enough, it is good enough; IMO, it's as simple as that. If you find it has bothersome artifacts, then it doesn't look good enough, so don't use it. But for those who think it does look good enough, its efficiency benefits are undeniable.
Originally posted by Dukenukemx: The most I would consider buying is a 6800 XT and an RTX 3080.
Originally posted by coder: Uh, not that long, right? Only since RDNA, because it's architecturally more efficient than GCN and because AMD was on 7 nm while Nvidia was still on 12 nm. Then Nvidia made a questionable decision to move their mainstream onto Samsung 8 nm, although that's probably because they couldn't get the necessary volume on TSMC 7 nm.
Originally posted by qarium:
Man, there was a typo: 2090... and there is no 2090, so it is clear that it means 3090...
So what is the point, if the 6900XT is only more efficient because of its lower clock speed and lower TDP power limit?
And the 6950XT clearly loses all efficiency advantages compared to the 3090... because of higher clocks and a higher power limit.
And userbenchmark claims the 3090 is +27% faster than the 6950XT...
https://gpu.userbenchmark.com/Compar...4081vsm1843533
Also, just remember: I have a Vega 64 and I would buy the 6950XT because I like opensource drivers...
But I would not claim the 6950XT on 7nm beats an Nvidia 3090 on 8nm... as soon as there is FSR2.0/DLSS2.x and raytracing in the game...
Looks like on the list of games that support this natively, Nvidia is the clear winner:
https://www.pcgamingwiki.com/wiki/Li...lity_upscaling
You know what the joke about the word "earlier" in your sentence is? The joke is: earlier, AMD had no solution at all.
FSR1.0 came years later and FSR2.0 came even another year later.
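For context on what FSR actually does: FSR 1.0 is a spatial upscaler that renders the frame at a reduced resolution and scales it up to the display target. A minimal sketch of the render resolutions implied by its published quality-mode scale factors, assuming a 4K output target (the little program is purely illustrative):
[CODE]
// Rough sketch: render resolutions implied by FSR 1.0's published
// per-axis scale factors, assuming a 3840x2160 output target.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;
    struct Mode { const char* name; double scale; };
    const Mode modes[] = {
        {"Ultra Quality", 1.3}, {"Quality", 1.5},
        {"Balanced", 1.7}, {"Performance", 2.0},
    };
    for (const Mode& m : modes) {
        // Each axis is divided by the scale factor, so the rendered
        // pixel count drops by roughly scale^2.
        printf("%-13s renders at %4.0f x %4.0f (%2.0f%% of the output pixels)\n",
               m.name, outW / m.scale, outH / m.scale,
               100.0 / (m.scale * m.scale));
    }
    return 0;
}
[/CODE]
FSR 2.0 adds temporal data on top of the upscaling, which is why it needs deeper game integration and arrived later.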
Originally posted by Dukenukemx: If AMD wanted to, they could release a graphics card with GDDR6 on a 384-bit bus or wider and probably end up with the faster GPU.
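For context on the bandwidth side of that claim: at a given memory speed, bandwidth scales linearly with bus width. A rough sketch of the arithmetic, with an 18 Gbps GDDR6 data rate assumed purely for illustration:
[CODE]
// Rough sketch of how bus width drives memory bandwidth.
// The 18 Gbps per-pin GDDR6 rate is an assumed example value.
#include <cstdio>

double bandwidthGBs(int busBits, double gbpsPerPin) {
    // bytes/s = (bus width in bits / 8) * per-pin data rate
    return busBits / 8.0 * gbpsPerPin;
}

int main() {
    const double rate = 18.0; // Gbps per pin (assumed)
    printf("256-bit bus: %.0f GB/s\n", bandwidthGBs(256, rate)); // 576 GB/s
    printf("384-bit bus: %.0f GB/s\n", bandwidthGBs(384, rate)); // 864 GB/s
    return 0;
}
[/CODE]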
Originally posted by Dukenukemx: Nvidia lost the efficiency crown to AMD long ago.
Originally posted by Anux: Something doesn't add up there: Linux market share is 1%, AMD's is 20%. I'm not sure why Nvidia is still that dominant, because their most-sold cards are ones that either don't support RT or are much too slow for RT, which rules RT out as its selling point. I guess it's just like with Intel: consumers need time to realize AMD is competitive again.
It's even much more efficient than the 3090, and yes, all 6x50 models are overclocked and have faster RAM, making them less efficient, but the 6950XT is still more efficient than the 3090 Ti. https://www.techpowerup.com/review/m...x-trio/37.html
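Efficiency comparisons of this kind boil down to performance per watt, i.e. average FPS divided by board power. A minimal sketch of the calculation; the numbers are placeholders, not measured results for any of the cards discussed here:
[CODE]
// Minimal sketch of a performance-per-watt comparison.
// The FPS and power figures are placeholders, not measurements.
#include <cstdio>

int main() {
    struct Card { const char* name; double avgFps; double boardWatts; };
    const Card cards[] = {
        {"Card A", 100.0, 300.0}, // hypothetical
        {"Card B", 110.0, 450.0}, // hypothetical
    };
    for (const Card& c : cards)
        printf("%s: %.3f FPS per watt\n", c.name, c.avgFps / c.boardWatts);
    return 0;
}
[/CODE]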
Originally posted by NM64: This lends further credence to desktop Ryzen being more of a "trickle-down" from server than server being a "trickle-up" from desktop.
This makes it very interesting to see how high Zen 4 is reportedly clocking. I wonder how much potential IPC they sacrificed in order to achieve that.
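The trade-off being wondered about is roughly that single-thread performance scales with IPC × clock, so a big enough frequency gain can outweigh a small IPC sacrifice. A tiny sketch with hypothetical numbers:
[CODE]
// Rough sketch: single-thread throughput ~ IPC * clock, so a clock
// gain can offset an IPC loss. All figures are hypothetical.
#include <cstdio>

int main() {
    const double baseIpc = 1.00, baseClkGHz = 4.5; // hypothetical baseline
    const double newIpc  = 0.95, newClkGHz  = 5.5; // hypothetical: -5% IPC, +1 GHz
    const double speedup = (newIpc * newClkGHz) / (baseIpc * baseClkGHz);
    printf("Relative performance: %.2fx\n", speedup); // ~1.16x despite lower IPC
    return 0;
}
[/CODE]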
Originally posted by NM64: Heck, there is even AM4 server hardware now, as seen by the following being from Asrock Rack rather than plain-old Asrock:
https://www.asrockrack.com/general/p...?Model=X470D4U
BTW, I'm still waiting for availability and prices to improve on this board's successor, the X570D4U (preferably the 10 Gig version). I've had my eye on it pretty much since it launched, but I foolishly assumed its price would drop since then, yet it has only gone up. At least they've recently been back in stock.
Originally posted by Anux: What does that have to do with process node?
If you factor in the fact that comparing the 6950XT instead of the 6900XT leaves AMD with no real efficiency advantage...
The 6900XT only has lower power consumption because of its low clocks and the low TDP power limit it is set to.
This means this is still true:
Nvidia on 8nm > AMD on 7nm > Intel on 6nm
This means even if Intel gets 3nm or 4nm or 5nm wafers from TSMC, they can still lose.
Originally posted by Anux: Something doesn't add up there: Linux market share is 1%, AMD's is 20%. I'm not sure why Nvidia is still that dominant, because their most-sold cards are ones that either don't support RT or are much too slow for RT, which rules RT out as its selling point. I guess it's just like with Intel: consumers need time to realize AMD is competitive again.
Normal people just look at sites like these:
Product comparison for Zotac Gaming GeForce RTX 3090 Trinity OC, 24GB GDDR6X, HDMI, 3x DP (ZT-A30900J-10P) vs. Sapphire Nitro+ Radeon RX 6950 XT, 16GB GDDR6, HDMI, 3x DP, lite retail (11317-02-20G)
As soon as someone wants 4K/5K resolution plus FSR/DLSS and raytracing, they will buy the 3090.
Even if you say raytracing in gaming is just an impossibility, people will still claim they can use the raytracing units for OptiX and will still go with Nvidia.
Also, ROCm/HIP is just a replacement for CUDA, OptiX is still faster, and there is 1000 times more CUDA-ready software than ROCm/HIP-ready software.
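For readers who haven't used it: HIP is AMD's CUDA-style kernel language and runtime, and its API deliberately mirrors CUDA's (hipMalloc vs cudaMalloc, hipMemcpy vs cudaMemcpy, and so on), which is why much CUDA code can be ported fairly mechanically. A minimal sketch, with an arbitrary toy kernel purely for illustration:
[CODE]
// Minimal HIP sketch: the runtime API mirrors CUDA almost 1:1,
// which is what makes it a CUDA replacement. Toy kernel for illustration.
#include <hip/hip_runtime.h>
#include <cstdio>

__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    float* dev = nullptr;
    hipMalloc((void**)&dev, n * sizeof(float));                    // ~ cudaMalloc
    hipMemcpy(dev, host, n * sizeof(float), hipMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);                 // same launch syntax as CUDA

    hipMemcpy(host, dev, n * sizeof(float), hipMemcpyDeviceToHost);
    hipFree(dev);
    printf("host[0] = %.1f\n", host[0]); // expect 2.0
    return 0;
}
[/CODE]
Built with hipcc, this runs on Radeon cards; HIP can also target the CUDA toolchain, which is the sense in which it is a portability layer rather than a separate ecosystem.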
"consumers need time to realize AMD is competitive again"
Outside of Linux/opensource, AMD is only competitive in the FPS-per-dollar game. But I did send you a comparison: the difference between the 6950XT and the 3090 is 228€.
On Linux, the opensource drivers are the strongest selling point.
Originally posted by Anux: that could very well be the shitty drivers
Originally posted by Anux: It's even much more efficient than the 3090, and yes, all 6x50 models are overclocked and have faster RAM, making them less efficient, but the 6950XT is still more efficient than the 3090 Ti. https://www.techpowerup.com/review/m...x-trio/37.html
And I am sure that the 3090 beats the 6950XT in efficiency in 4K/5K + FSR/DLSS + raytracing benchmarks.
As soon as you do this, Nvidia is 27% faster.
If the 6900XT beats the 3090 in efficiency without raytracing and without DLSS, some people like you care; most people don't.
Originally posted by Anux:
Yes, and those that don't have DLSS can still use RSR, so it's better to compare native and not skew the results with upscalers.
Originally posted by Dukenukemx: Anux said 3090, not 2080. There's a big difference.
So what is the point, if the 6900XT is only more efficient because of its lower clock speed and lower TDP power limit? (A rough power-scaling sketch follows at the end of this reply.)
And the 6950XT clearly loses all efficiency advantages compared to the 3090... because of higher clocks and a higher power limit.
And userbenchmark claims the 3090 is +27% faster than the 6950XT...
https://gpu.userbenchmark.com/Compar...4081vsm1843533
Also, just remember: I have a Vega 64 and I would buy the 6950XT because I like opensource drivers...
But I would not claim the 6950XT on 7nm beats an Nvidia 3090 on 8nm... as soon as there is FSR2.0/DLSS2.x and raytracing in the game...
So for me it still stands: Nvidia on 8nm > AMD on 7nm > Intel on 6nm...
This means Intel can still lose even if they have tons of 3nm/4nm/5nm TSMC wafers...
Intel is very good at producing bullshit.
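On the clock-versus-efficiency point above: dynamic power scales roughly with frequency × voltage², while performance scales closer to linearly with frequency, so chasing higher clocks (and the extra voltage they need) costs efficiency. A rough sketch; the clocks and voltages are hypothetical operating points, not real 6900XT/6950XT figures:
[CODE]
// Rough sketch of why higher clocks hurt efficiency:
// dynamic power ~ f * V^2, performance ~ f, so perf/W ~ 1 / V^2.
// The operating points below are hypothetical.
#include <cstdio>

double perfPerWatt(double clkGHz, double volts) {
    const double perf  = clkGHz;                  // roughly linear in frequency
    const double power = clkGHz * volts * volts;  // roughly f * V^2
    return perf / power;
}

int main() {
    const double low  = perfPerWatt(2.0, 1.00); // hypothetical stock point
    const double high = perfPerWatt(2.3, 1.15); // higher clock needs more voltage
    printf("Efficiency at the higher clock: %.0f%% of the lower one\n",
           100.0 * high / low);                  // ~76%
    return 0;
}
[/CODE]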
Originally posted by Dukenukemx: Better than nothing. World of Warcraft surprisingly supports FSR.
https://www.pcgamingwiki.com/wiki/Li...lity_upscaling
But again, I would buy the 6950XT and I don't care what Nvidia does or doesn't do.
But I know that as soon as you benchmark 4K or 5K and you put FSR/DLSS and raytracing into the benchmark, Nvidia wins.
So technically, for me this is true: Nvidia on 8nm > AMD on 7nm > Intel on 6nm.
This means Intel can still do a lot of bullshit even if they have 3nm/4nm/5nm nodes.
Originally posted by Dukenukemx:
FSR1.0 came years later and FSR2.0 came even another year later.
Last edited by qarium; 06 September 2022, 10:41 PM.