AMD Announces Ryzen 7000 Series "Zen 4" Desktop CPUs - Linux Benchmarks To Come


  • Originally posted by Anux View Post
    something doesn't add up there, Linux market share is 1%, AMD's is 20%. I'm not sure why Nvidia is still that dominant, because their most sold cards are ones that don't support RT or are much too slow for RT; that rules RT out as its selling point. I guess it's just like with Intel: consumers need time to realize AMD is competitive again.
    This has been a problem for a while as Nvidia's "Mind Share" is greater than AMD's. Basically, most people believe Nvidia has the superior product due to drivers and features.

    It's even much more efficient than the 3090, and yes, all 6*50 models are overclocked and have faster RAM, making them less efficient, but the 6950 XT is still more efficient than the 3090 Ti. https://www.techpowerup.com/review/m...x-trio/37.html
    It should also be noted that AMD's RDNA2 cards are not only more power efficient than Ampere, but they also use plain GDDR6 memory, not the GDDR6X that the 3080, 3080 Ti, and 3090 all use. The 6950 XT also uses a 256-bit memory bus, compared to the 384-bit bus on the 3080 Ti and 3090. If AMD wanted to, they could release a graphics card with GDDR6 on a 384-bit or wider bus and probably end up with the faster GPU. Nvidia lost the efficiency crown to AMD long ago.
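
    For a rough sense of the raw numbers, here is a quick back-of-the-envelope bandwidth comparison. The per-pin data rates are assumptions (check each card's spec sheet); the point is just how much raw bandwidth the wider GDDR6X bus buys Nvidia, which AMD has to make up for elsewhere.

    ```python
    # Rough peak memory bandwidth: bus width (bits) / 8 * per-pin data rate (Gbps).
    # The data rates below are assumed typical values, not official specs.
    def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    cards = {
        "RX 6950 XT (256-bit GDDR6)": (256, 18.0),   # assumed 18 Gbps GDDR6
        "RTX 3090 (384-bit GDDR6X)":  (384, 19.5),   # assumed 19.5 Gbps GDDR6X
    }

    for name, (bus, rate) in cards.items():
        print(f"{name}: ~{peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
    ```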

    Comment


    • Originally posted by Dukenukemx View Post
      If AMD wanted to, they could release a graphics card with GDDR6 with 384-bit bus or higher and probably become the faster GPU.
      Only if they scaled up the rest of the GPU, proportionately. The reason they get away with a narrower bus is the Infinity Cache. That makes them less dependent on GDDR memory bandwidth.
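
      A toy model of the idea (the hit rate and cache bandwidth figures below are made-up illustrative numbers, not AMD's specs): every access the big on-die cache catches is served at on-die speeds, so the effective bandwidth the shaders see climbs well above what the 256-bit GDDR6 bus alone provides.

      ```python
      # Toy model: effective bandwidth with a large last-level cache in front of GDDR.
      # Hit rate and cache bandwidth are illustrative assumptions, not AMD specs.
      def effective_bandwidth(hit_rate, cache_bw_gb_s, dram_bw_gb_s):
          return hit_rate * cache_bw_gb_s + (1.0 - hit_rate) * dram_bw_gb_s

      dram_bw = 576.0    # GB/s, 256-bit GDDR6 at an assumed 18 Gbps
      cache_bw = 1800.0  # GB/s, assumed on-die cache bandwidth
      for hit_rate in (0.0, 0.4, 0.6):
          print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth(hit_rate, cache_bw, dram_bw):.0f} GB/s effective")
      ```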

      Originally posted by Dukenukemx View Post
      Nvidia has long ago lost the efficiency crown to AMD.
      Uh, not that long, right? Only since RDNA, because it's architecturally more efficient than GCN and because AMD was on 7 nm, while Nvidia was still on 12 nm. Then, Nvidia made a questionable decision to move their mainstream onto Samsung 8 nm, although that's probably because they couldn't get the necessary volume on TSMC 7 nm.

      Comment


      • Originally posted by qarium View Post

        Man, there was a typo: 2090... and there is no 2090, so it is clear that it means 3090...
        You tend to move the goalposts so often that it's hard to tell.
        So what is your point? That the 6900 XT is just more efficient because of its low clock speed and low-TDP power limit?

        And the 6950 XT clearly loses all efficiency advantages compared to the 3090... because of higher clocks and a higher power limit.

        And UserBenchmark claims the 3090 is 27% faster than the 6950 XT...
        https://gpu.userbenchmark.com/Compar...4081vsm1843533
        If the 6950 XT uses less power, then it uses less power. Also, the amount of power needed to feed the 3090 is scary. We can all joke about how demanding an R9 390 or Vega 64 is on power, but compared to a 3090 they seem efficient, and less likely to start a fire. The 3090s and also the 3080s can spike a power supply really hard, to the point that it can do some serious damage.

        Also, just remember I have a Vega 64, and I would buy the 6950 XT because I like open-source drivers...

        But I would not claim the 6950 XT on 7 nm beats an Nvidia 3090 on 8 nm... as soon as there is FSR 2.0/DLSS 2.x and ray tracing in the game...
        Nobody in their right mind should consider buying either. Both are horribly overpriced, and both demand a lot more power. The most I would consider buying is a 6800 XT and an RTX 3080. Also, I don't care for FSR or DLSS, as these technologies are faking graphics. They are workarounds for Ray-Tracing, since most games can't run this feature without lowering the resolution. Ray-Tracing sucks and isn't a feature that games fully utilize. You may see water or a reflection with Ray-Tracing, which is not the same as Ray-Tracing in Minecraft or Quake 2, where everything is ray-traced. Minecraft and Quake 2 are not particularly demanding games.

        Looks like on the list of games that support this natively, Nvidia is the clear winner:
        https://www.pcgamingwiki.com/wiki/Li...lity_upscaling
        Link doesn't work. Doesn't matter anyway because nobody cares about Ray-Tracing.
        You know what the joke is about the word "earlier" in your sentence? The joke is: earlier, AMD had no solution at all.
        FSR 1.0 came years later, and FSR 2.0 came another year after that.
        That's because these technologies aren't really worth using, since they do lower image quality. You are always better off running in native 4K than fake 4K.
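
        For what it's worth, the "fake 4K" trade-off is easy to quantify: the upscalers render internally at a reduced resolution and reconstruct the rest. The per-axis scale factors below follow FSR 2's published presets (DLSS presets are similar but not identical), so treat the exact figures as approximate.

        ```python
        # Internal render resolution for a 3840x2160 output at common upscaler presets.
        # Scale factors follow FSR 2's presets; DLSS differs slightly (assumption).
        OUTPUT = (3840, 2160)
        PRESETS = {"Native": 1.0, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

        for name, factor in PRESETS.items():
            w, h = (round(d / factor) for d in OUTPUT)
            shaded_pct = 100.0 / (factor * factor)
            print(f"{name:>11}: {w}x{h} (~{shaded_pct:.0f}% of the output pixels shaded)")
        ```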

        Comment


        • Originally posted by coder View Post
          Uh, not that long, right? Only since RDNA, because it's architecturally more efficient than GCN and because AMD was on 7 nm, while Nvidia was still on 12 nm. Then, Nvidia made a questionable decision to move their mainstream onto Samsung 8 nm, although that's probably because they couldn't get the necessary volume on TSMC 7 nm.
          RDNA is about 3 years old, so to me that's a while ago. Also, from what I understand, the move to Samsung's 8 nm was about cost, as Samsung was offering it for a much lower price than TSMC.

          Comment


          • Originally posted by Dukenukemx View Post
            Also, I don't care for FSR or DLSS as these technologies are faking graphics.
            At some level, all computer graphics is faking, right? If you really care about "authenticity" in your algorithms, then you should be using only Path Tracing, and nothing else. And insist on no postprocessing effects.

            I just think it's funny how sanctimonious people get about DLSS. If it looks good enough, it is good enough. IMO, it's as simple as that. If you find it to have bothersome artifacts, then it doesn't look good enough and don't use it. However, for those who think it does look good enough, then its efficiency benefits are undeniable.

            Originally posted by Dukenukemx View Post
            The most I would consider buying is a 6800 XT and a RTX 3080.
            I'm going to start looking a lot more seriously at an RTX 3080, if prices keep going down. I would consider a 12 GB version, if I can get one at a discount relative to the comparable-performing RTX 4000 series card. That's how I ended up with my current GPU, which I got for a little over 50% below MSRP once the GTX 1000-series launched.

            Comment


            • Originally posted by Dukenukemx View Post
              This has been a problem for a while as Nvidia's "Mind Share" is greater than AMD's. Basically, most people believe Nvidia has the superior product due to drivers and features.
              It should also be noted that AMD's RDNA2 cards are not only more power efficient than Ampere, but they also use plain GDDR6 memory, not the GDDR6X that the 3080, 3080 Ti, and 3090 all use. The 6950 XT also uses a 256-bit memory bus, compared to the 384-bit bus on the 3080 Ti and 3090. If AMD wanted to, they could release a graphics card with GDDR6 on a 384-bit or wider bus and probably end up with the faster GPU. Nvidia lost the efficiency crown to AMD long ago.
              If you do not exclude any feature and you put all the features in the benchmark, meaning DLSS 2 and also ray tracing, Nvidia still has the efficiency crown. AMD only wins if you turn off ray tracing and if you do not benchmark with FSR 2/DLSS 2.

              Comment


              • Originally posted by Dukenukemx View Post
                If the 6950 XT uses less power, then it uses less power. Also, the amount of power needed to feed the 3090 is scary. We can all joke about how demanding an R9 390 or Vega 64 is on power, but compared to a 3090 they seem efficient, and less likely to start a fire. The 3090s and also the 3080s can spike a power supply really hard, to the point that it can do some serious damage.
                The most high-end hardware has always used the most power... and I don't care about a TDP of 340 W or 450 W.
                I have a 2000 W PSU in my two Threadripper systems. This means an insane TDP of 450 W for a card does not matter in my system.
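
                Here's roughly the headroom math I'm relying on; the transient multiplier and the rest-of-system figure are just assumptions for illustration (some reviews have measured short spikes well above a card's rated board power), not measurements from my machines.

                ```python
                # Rough PSU headroom check. Spike multiplier and rest-of-system draw
                # are assumptions for illustration, not measured values.
                def worst_case_draw_w(gpu_board_power_w, spike_multiplier, rest_of_system_w):
                    return gpu_board_power_w * spike_multiplier + rest_of_system_w

                psu_w = 2000.0
                for gpu_tdp in (340.0, 450.0):
                    peak = worst_case_draw_w(gpu_tdp, spike_multiplier=2.0, rest_of_system_w=400.0)
                    print(f"{gpu_tdp:.0f} W card: ~{peak:.0f} W worst case vs a {psu_w:.0f} W PSU")
                ```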

                Originally posted by Dukenukemx View Post
                Nobody in their right mind should consider buying either. Both are horribly overpriced,

                I have observed a price drop on the 6950 from 1200€ to 950€; if you compare it to the 6900 and 6800, it is not overpriced.
                The 6950 was only overpriced at the release date... at release, the 6900 was sold for 950€ and the 6950 was at 1200€,
                but now the 6900 is at 885€ and the 6950 is at 958€.
                That's not horribly overpriced... and if you want to save power with a more efficient GPU, you would maybe choose the 6900 anyway.

                Originally posted by Dukenukemx View Post

                and both demand a lot more power. The most I would consider buying is a 6800 XT and an RTX 3080. Also, I don't care for FSR or DLSS, as these technologies are faking graphics. They are workarounds for Ray-Tracing, since most games can't run this feature without lowering the resolution. Ray-Tracing sucks and isn't a feature that games fully utilize. You may see water or a reflection with Ray-Tracing, which is not the same as Ray-Tracing in Minecraft or Quake 2, where everything is ray-traced. Minecraft and Quake 2 are not particularly demanding games.
                You can do whatever you want for yourself: turn off ray tracing and turn off FSR/DLSS; you are also free to only buy a 6800 or 3080...
                But be aware that people like you are a minority, and even people like me who buy AMD because of open-source drivers are a minority...

                The majority, meaning the mainstream, just turn on all the features no matter what: FSR 2/DLSS 2 and ray tracing...

                Originally posted by Dukenukemx View Post

                Link doesn't work. Doesn't matter anyway because nobody cares about Ray-Tracing.
                The link works for me? LOL...

                Originally posted by Dukenukemx View Post

                That's because these technologies aren't really worth using, since they do lower image quality. You are always better off running in native 4K than fake 4K.
                I have a 4K 60 Hz TV on my PC, and my Vega 64 really needs FSR to maintain 4K without disabling too many features.

                Native 4K is maybe better if your system is fast enough, but not for cards like the Vega 64 that are too slow without FSR...

                Comment


                • Originally posted by coder View Post
                  I'm going to start looking a lot more seriously at an RTX 3080, if prices keep going down. I would consider a 12 GB version, if I can get one at a discount relative to the comparable-performing RTX 4000 series card. That's how I ended up with my current GPU, which I got for a little over 50% below MSRP once the GTX 1000-series launched.
                  Why not get the full 16 GB of VRAM with the 6950 XT? The price went down in the last few weeks; it's 958€ now.

                  Comment


                  • Originally posted by coder View Post
                    At some level, all computer graphics is faking, right?
                    I see what you did there.
                    If you really care about "authenticity" in your algorithms, then you should be using only Path Tracing, and nothing else. And insist on no postprocessing effects.
                    I just think it's funny how sanctimonious people get about DLSS. If it looks good enough, it is good enough. IMO, it's as simple as that. If you find it to have bothersome artifacts, then it doesn't look good enough and don't use it. However, for those who think it does look good enough, then its efficiency benefits are undeniable.
                    The problem with DLSS and FSR is that they don't produce consistent results. This makes it harder to do comparisons between GPU brands, as these companies have historically played with image quality to get better benchmark results. ATI had the Quake/Quack thing, and Nvidia has done similar fuckery with 3DMark. I'm not against the feature, but I don't like it being a factor in purchasing a GPU. Just like I don't care for Ray-Tracing, since this technology is extremely early in being usable. It's nice, but don't claim it to be a feature that will determine whether I'd buy a card or not.
                    I'm going to start looking a lot more seriously at a RTX 3080, if prices keep going down. I would consider a 12 GB version, if I can get one at a discount relative to the comparable-performing RTX 4000 series card. That's how I ended up with my current GPU, which I got for a little over 50% below MSRP once the GTX 1000-series launched.
                    I'm gonna wait for cheaper AMD cards, as I use Linux Mint. It's a lot less of a problem when I don't have to worry about whether the kernel I'm using might upset the Nvidia proprietary drivers. I'm looking at RX 6700s or even 6600 XTs. The prices will collapse more as time goes on.

                    Comment


                    • Originally posted by qarium View Post

                      If you do not exclude any feature and you put all the features in the benchmark, meaning DLSS 2 and also ray tracing, Nvidia still has the efficiency crown. AMD only wins if you turn off ray tracing and if you do not benchmark with FSR 2/DLSS 2.
                      I'm a pragmatic person. Find me benchmarks that prove this. Don't just say it and expect people to believe you. FIND IT.

                      Comment
