AMD Announces Ryzen 7000 Series "Zen 4" Desktop CPUs - Linux Benchmarks To Come


  • qarium
    replied
    Originally posted by Dukenukemx View Post
    This has been a problem for a while as Nvidia's "Mind Share" is greater than AMD's. Basically, most people believe Nvidia has the superior product due to drivers and features.
    It should also be noted that AMD's RDNA2 cards are not only more power efficient than Ampere, but also use only GDDR6 memory, not the GDDR6X that the 3080, 3080 Ti, and 3090 all use. The 6950XT also uses a 256-bit memory bus, compared to the 384-bit bus on Nvidia's 3080/3080 Ti/3090. If AMD wanted to, they could release a graphics card with GDDR6 on a 384-bit or wider bus and probably have the faster GPU. Nvidia long ago lost the efficiency crown to AMD.
    If you do not exclude any feature and you put all the features in the benchmark, meaning DLSS2 and also ray tracing, Nvidia still has the efficiency crown. AMD only wins if you turn off ray tracing and do not benchmark with FSR2/DLSS2.



  • coder
    replied
    Originally posted by Dukenukemx View Post
    Also, I don't care for FSR or DLSS, as these technologies are faking graphics.
    At some level, all computer graphics is faking, right? If you really care about "authenticity" in your algorithms, then you should be using only Path Tracing, and nothing else. And insist on no postprocessing effects.

    I just think it's funny how sanctimonious people get about DLSS. If it looks good enough, it is good enough; IMO, it's as simple as that. If you find it has bothersome artifacts, then it doesn't look good enough, so don't use it. However, for those who think it does look good enough, the efficiency benefits are undeniable.
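    To put rough numbers on that efficiency benefit, here's a back-of-the-envelope sketch in Python. The 0.67 per-axis render scale is the commonly cited "Quality" preset value and is an assumption here, not a measured spec; exact factors vary by game and preset.

    ```python
    # Rough estimate of how much shading work an upscaler saves at 4K.
    # ASSUMPTION: 0.67 per-axis render scale ("Quality" preset); exact
    # factors vary by title and preset.
    native_w, native_h = 3840, 2160   # native 4K output
    scale = 0.67                      # assumed per-axis render scale
    render_w, render_h = int(native_w * scale), int(native_h * scale)

    native_px = native_w * native_h   # 8,294,400 pixels
    render_px = render_w * render_h   # ~3,700,000 pixels
    print(f"rendered pixels: {render_px / native_px:.0%} of native")  # ~45%
    ```

    Since shading cost scales roughly with pixel count, more than half the per-frame work disappears before the upscaler even runs, which is where the FPS (and efficiency) gain comes from.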

    Originally posted by Dukenukemx View Post
    The most I would consider buying is a 6800 XT or an RTX 3080.
    I'm going to start looking a lot more seriously at an RTX 3080, if prices keep going down. I would consider a 12 GB version, if I can get one at a discount relative to the comparably performing RTX 4000 series card. That's how I ended up with my current GPU, which I got a little over 50% below MSRP once the GTX 1000 series launched.



  • Dukenukemx
    replied
    Originally posted by coder View Post
    Uh, not that long, right? Only since RDNA, because it's architecturally more efficient than GCN and because AMD was on 7 nm, while Nvidia was still on 12 nm. Then, Nvidia made a questionable decision to move their mainstream onto Samsung 8 nm, although that's probably because they couldn't get the necessary volume on TSMC 7 nm.
    RDNA is about 3 years old, so to me that's a while ago. Also, from what I understand, the move to Samsung's 8nm was about cost, as Samsung was offering it at a much lower price than TSMC.



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    Man, that was a typo: 2090... there is no 2090, so it is clear that it means 3090...
    You tend to move the goalposts so often that it's hard to tell.
    So what is the point? The 6900XT is just more efficient because of its low clock speed and low TDP power limit.

    And the 6950XT clearly loses all efficiency advantages compared to the 3090... because of higher clocks and a higher power limit.

    And UserBenchmark claims the 3090 is +27% faster than the 6950XT...
    https://gpu.userbenchmark.com/Compar...4081vsm1843533
    If the 6950XT uses less power, then it uses less power. Also, the amount of power needed to feed the 3090 is scary. We can all joke about how demanding an R9 390 or Vega 64 is on power, but compared to a 3090 they seem efficient, and less likely to start a fire. The 3090s and even the 3080s can spike a power supply really hard, to the point of doing serious damage.

    Also, just remember I have a Vega 64, and I would buy the 6950XT because I like open-source drivers...

    But I would not claim the 6950XT on 7nm beats an Nvidia 3090 on 8nm... as soon as there is FSR2.0/DLSS2.x and ray tracing in the game...
    Nobody in their right mind should consider buying either. Both are horribly overpriced, and both demand a lot more power. The most I would consider buying is a 6800 XT or an RTX 3080. Also, I don't care for FSR or DLSS, as these technologies are faking graphics. They are workarounds for ray tracing, since most games can't run that feature without lowering the resolution. Ray tracing sucks and isn't a feature that games utilize 100%. You may see water or a reflection ray-traced, which is not the same as ray tracing in Minecraft or Quake 2, where everything is ray-traced. And Minecraft and Quake 2 are not particularly demanding games.

    Looks like on the list of games that support this natively, Nvidia is the clear winner:
    https://www.pcgamingwiki.com/wiki/Li...lity_upscaling
    Link doesn't work. Doesn't matter anyway, because nobody cares about ray tracing.
    You know what the joke is about the word "earlier" in your sentence? The joke is: earlier, AMD had no solution at all.
    FSR1.0 came years later, and FSR2.0 another year after that.
    That's because these technologies aren't really worth using, since they do lower image quality. You are always better off running native 4K than fake 4K.



  • coder
    replied
    Originally posted by Dukenukemx View Post
    If AMD wanted to, they could release a graphics card with GDDR6 on a 384-bit or wider bus and probably have the faster GPU.
    Only if they scaled up the rest of the GPU proportionately. The reason they get away with a narrower bus is the Infinity Cache. That makes them less dependent on GDDR memory bandwidth.

    Originally posted by Dukenukemx View Post
    Nvidia long ago lost the efficiency crown to AMD.
    Uh, not that long, right? Only since RDNA, because it's architecturally more efficient than GCN and because AMD was on 7 nm, while Nvidia was still on 12 nm. Then, Nvidia made a questionable decision to move their mainstream onto Samsung 8 nm, although that's probably because they couldn't get the necessary volume on TSMC 7 nm.



  • Dukenukemx
    replied
    Originally posted by Anux View Post
    Something doesn't add up there: Linux market share is 1%, AMD's is 20%. I'm not sure why Nvidia is still that dominant, because their best-selling cards are ones that don't support RT or are much too slow for RT, which rules RT out as its selling point. I guess it's just like with Intel: consumers need time to realize AMD is competitive again.
    This has been a problem for a while as Nvidia's "Mind Share" is greater than AMD's. Basically, most people believe Nvidia has the superior product due to drivers and features.

    It's even much more efficient than the 3090, and yes, all 6*50 models are overclocked and have faster RAM, making them less efficient, but the 6950XT is still more efficient than the 3090 Ti. https://www.techpowerup.com/review/m...x-trio/37.html
    It should also be noted that AMD's RDNA2 cards are not only more power efficient than Ampere, but also use only GDDR6 memory, not the GDDR6X that the 3080, 3080 Ti, and 3090 all use. The 6950XT also uses a 256-bit memory bus, compared to the 384-bit bus on Nvidia's 3080/3080 Ti/3090. If AMD wanted to, they could release a graphics card with GDDR6 on a 384-bit or wider bus and probably have the faster GPU. Nvidia long ago lost the efficiency crown to AMD.
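    For what it's worth, the arithmetic behind the bus-width point is simple: peak memory bandwidth is the bus width in bytes times the per-pin data rate. A minimal sketch using the published specs of the 6950 XT and 3090; the 384-bit GDDR6 line is the hypothetical card described above, not a real product.

    ```python
    # Theoretical peak memory bandwidth = (bus width / 8) bytes * per-pin rate.
    def peak_bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
        """Peak bandwidth in GB/s for a given bus width and data rate."""
        return bus_bits / 8 * gbps_per_pin

    print(peak_bandwidth_gb_s(256, 18.0))   # RX 6950 XT, GDDR6:  576.0 GB/s
    print(peak_bandwidth_gb_s(384, 19.5))   # RTX 3090, GDDR6X:   936.0 GB/s
    print(peak_bandwidth_gb_s(384, 18.0))   # hypothetical 384-bit GDDR6: 864.0 GB/s
    ```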



  • coder
    replied
    Originally posted by NM64 View Post
    This lends further credence to desktop Ryzen being more of a "trickle down" from server than server being a "trickle up" from desktop.
    This is definitely the case, if you compare the overall desktop non-APU market size vs. the server CPU market that AMD is going for. I think that has several implications, including AMD's perf/W advantage over Intel, and it even potentially explains why Zen2 and Zen3 CPUs haven't clocked as high (hint: server CPUs limit clock speeds, for better scalability).

    This makes it very interesting to see how high Zen4 is reportedly clocking. I wonder how much potential IPC they sacrificed, in order to achieve that.
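    As a rough model of that trade-off, single-thread performance scales approximately as IPC times clock, so a small IPC sacrifice can still be a net win if it buys a larger clock gain. The deltas below are purely hypothetical, not actual Zen4 figures:

    ```python
    # Single-thread perf ~ IPC * clock. The deltas below are invented to
    # show the shape of the trade-off; they are NOT Zen4 measurements.
    def relative_perf(ipc_delta: float, clock_delta: float) -> float:
        """Net perf change for fractional IPC and clock changes."""
        return (1 + ipc_delta) * (1 + clock_delta) - 1

    print(f"{relative_perf(-0.02, 0.15):+.1%}")  # -2% IPC, +15% clock -> +12.7%
    ```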



  • coder
    replied
    Originally posted by NM64 View Post
    Heck, there is even AM4 server hardware now as seen by the following being from Asrock Rack rather than plain-old Asrock:
    https://www.asrockrack.com/general/p...?Model=X470D4U
    ASRock Rack pretty much has the market cornered on AM4 server boards. That tells me it's a very niche market, and not something AMD thinks about very much.

    BTW, I'm still waiting for availability and prices to improve on this board's successor, the X570D4U (preferably the 10 Gig version). I've had my eye on it pretty much since it launched, but foolishly assumed its price would drop since then; instead, it has only gone up. At least I can find them in stock again, recently.



  • qarium
    replied
    Originally posted by Anux View Post
    What does that have to do with process node?

    If you factor in that, comparing the 6950XT instead of the 6900XT, AMD has no real efficiency advantage...
    The 6900XT only has lower power consumption because of its low clocks and the low TDP maximum that is set.
    This means this is still true:

    Nvidia on 8nm > AMD on 7nm > Intel on 6nm

    This means even if Intel has 3nm or 4nm or 5nm wafers from TSMC, they can still lose.
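    To make the clocks/TDP point concrete: efficiency is just FPS per watt, and pushing the same silicon up the voltage/frequency curve raises power faster than performance. The numbers below are invented purely for illustration; they are not benchmark results.

    ```python
    # Perf/W falls when power rises faster than FPS, as on a 6900XT-class
    # part pushed to 6950XT-class clocks. HYPOTHETICAL numbers only.
    def fps_per_watt(fps: float, watts: float) -> float:
        return fps / watts

    print(f"{fps_per_watt(100, 300):.3f} FPS/W")  # lower clocks, lower TDP
    print(f"{fps_per_watt(110, 380):.3f} FPS/W")  # same die, pushed harder
    ```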

    Originally posted by Anux View Post
    Something doesn't add up there: Linux market share is 1%, AMD's is 20%. I'm not sure why Nvidia is still that dominant, because their best-selling cards are ones that don't support RT or are much too slow for RT, which rules RT out as its selling point. I guess it's just like with Intel: consumers need time to realize AMD is competitive again.

    Normal people just look at these websites:

    Product comparison: Zotac Gaming GeForce RTX 3090 Trinity OC, 24GB GDDR6X, HDMI, 3x DP (ZT-A30900J-10P) vs. Sapphire Nitro+ Radeon RX 6950 XT, 16GB GDDR6, HDMI, 3x DP, lite retail (11317-02-20G)

    As soon as someone wants 4K/5K resolution and FSR/DLSS and ray tracing, they will buy the 3090.

    Even if you say ray tracing in gaming is just an impossibility, people will still claim they can use the ray-tracing units for OptiX and will still go with Nvidia.

    Also, ROCm/HIP is just a replacement for CUDA, and OptiX is still faster; there is also 1000 times more CUDA software than ROCm/HIP-ready software.

    "consumers need time to realize AMD is competitive again"

    Outside of Linux/open source, AMD is only competitive in the FPS-per-dollar game. But I did send you a comparison: the difference between the 6950XT and the 3090 is 228€.

    On Linux, the open-source drivers are the strongest selling point.

    Originally posted by Anux View Post
    that could very well be the shitty drivers
    The consumer doesn't care about the reason why this happens. They see 6nm Intel and it is shit.

    Originally posted by Anux View Post
    It's even much more efficient than the 3090, and yes, all 6*50 models are overclocked and have faster RAM, making them less efficient, but the 6950XT is still more efficient than the 3090 Ti. https://www.techpowerup.com/review/m...x-trio/37.html
    I am sure the consumer doesn't care about a "fair" comparison without ray tracing and without DLSS...

    And I am sure that the 3090 beats the 6950XT in efficiency in 4K/5K + FSR/DLSS + ray-tracing benchmarks.

    As soon as you do this, Nvidia is 27% faster.

    If the 6900XT beats the 3090 in efficiency without ray tracing and without DLSS, some people like you care; most people don't.

    Originally posted by Anux View Post
    Yes, and those that don't have DLSS can still use RSR, so it's better to compare native and not skew the results with upscalers.
    Nvidia with 8nm beats AMD with 7nm if you put in all the features... only people like you, who exclude features like ray tracing and DLSS, come to a different conclusion.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    Anux said 3090, not 2080. There's a big difference.
    Man, that was a typo: 2090... there is no 2090, so it is clear that it means 3090...

    So what is the point? The 6900XT is just more efficient because of its low clock speed and low TDP power limit.

    And the 6950XT clearly loses all efficiency advantages compared to the 3090... because of higher clocks and a higher power limit.

    And UserBenchmark claims the 3090 is +27% faster than the 6950XT...
    https://gpu.userbenchmark.com/Compar...4081vsm1843533

    Also, just remember I have a Vega 64, and I would buy the 6950XT because I like open-source drivers...

    But I would not claim the 6950XT on 7nm beats an Nvidia 3090 on 8nm... as soon as there is FSR2.0/DLSS2.x and ray tracing in the game...

    So for me it still stands: Nvidia on 8nm > AMD on 7nm > Intel on 6nm...

    This means Intel can still lose even if they have tons of 3nm/4nm/5nm TSMC wafers...

    Intel is very good at producing bullshit.

    Originally posted by Dukenukemx View Post
    Better than nothing. World of Warcraft surprisingly supports FSR.
    Looks like on the list of games that support this natively, Nvidia is the clear winner:
    https://www.pcgamingwiki.com/wiki/Li...lity_upscaling

    But again, I would buy the 6950XT, and I don't care what Nvidia does or doesn't do.

    But I know that as soon as you benchmark 4K or 5K and put FSR/DLSS and ray tracing in the benchmark, Nvidia wins.

    So technically for me this is true: Nvidia on 8nm > AMD on 7nm > Intel on 6nm.

    This means Intel can do a lot of bullshit even if they have a 3nm/4nm/5nm node.

    Originally posted by Dukenukemx View Post
    Yea, and DLSS is Nvidia-exclusive, plus earlier implementations kinda sucked.
    You know what the joke is about the word "earlier" in your sentence? The joke is: earlier, AMD had no solution at all.
    FSR1.0 came years later, and FSR2.0 another year after that.
    Last edited by qarium; 06 September 2022, 10:41 PM.

