Linux Support Expectations For The AMD Radeon RX 6000 Series

  • #21
    Originally posted by piotrj3 View Post

    If that raw performance translates to Linux. AMD drivers are always a big question mark. Also keep in mind that AMD's benchmarks used some "smart memory" and "rage mode" features, which probably aren't raw performance.
    Where are the big question marks? Just recently Michael did a benchmark across multiple current graphics cards, and the ranking was almost the same as what you see on Windows. Linux AMD drivers tend to be a bit better than the Windows ones, at least stability-wise. So show me a bad current AMD card driver (on Linux).

    edit: here is the link https://www.phoronix.com/scan.php?pa...e-ampere&num=5
    but the driver for the 5700 XT has gotten better since.
    Last edited by CochainComplex; 29 October 2020, 06:54 AM.

    Comment


    • #22
      I'm hoping for a 6700XT at around the $400-$450 price point.

      Comment


      • #23
        I have been waiting for this launch for so long, but now I have more questions. In order to use some features you need the Windows app. Will there ever be a way to get the equivalent functionality on Linux? If I stick the card in a Windows machine and tune it there, will it remember the settings when I move it back to Linux?

        "AMD Smart Access technology" - what exactly is that? Is it letting the CPU directly access the Infinity Cache? If so, I saw a YT video yesterday where someone had seen some code pushed referring to the Infinity Cache, so maybe support is on the way.

        They really glossed over the parts I was most interested in: AV1 decode, how the compute cores have changed, etc. But something I saw today was that it looks like it has hardware denoising. I can't imagine where you would get noise in a computer-generated image. But if this is for photo/video work I could be super excited.


        Comment


        • #24
          These things pair with Zen 3 systems (Smart Access Memory) to get an 8-16% perf boost.
          The main issue I see on the Linux side is that 'Rage Mode' looks like a Windows-only clock setting, and that gives another 10% boost.
          Also DLSS/RT. They'll really have to do some Intel-tier software development to get that flying right.
          A bonus across the board is that they are 50-75 watts lower than NVIDIA, but this will all need to be third-party benchmarked before that can float.
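
Smart Access Memory is AMD's branding for PCIe Resizable BAR, which lets the CPU map the card's whole VRAM instead of the usual 256 MiB window. As a rough, hand-rolled sketch (not an AMD or kernel tool), you can read the BAR sizes a card exposes from sysfs; a BAR roughly the size of the VRAM is a hint that the full framebuffer is CPU-visible:

```python
#!/usr/bin/env python3
# Sketch: print PCI BAR sizes for AMD display devices via sysfs.
# A BAR close to the card's VRAM size suggests Resizable BAR
# (the mechanism behind Smart Access Memory) is in effect.
import glob
import os

AMD_VENDOR_ID = "0x1002"

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    try:
        vendor = open(os.path.join(dev, "vendor")).read().strip()
        pci_class = open(os.path.join(dev, "class")).read().strip()
    except OSError:
        continue
    # Class 0x03xxxx == display controller
    if vendor != AMD_VENDOR_ID or not pci_class.startswith("0x03"):
        continue
    print(f"AMD GPU at {os.path.basename(dev)}")
    with open(os.path.join(dev, "resource")) as f:
        for bar, line in enumerate(f):
            start, end, _flags = (int(field, 16) for field in line.split())
            if end <= start:
                continue  # unused BAR
            print(f"  BAR {bar}: {(end - start + 1) / (1 << 20):.0f} MiB")
```

Whether the advertised gains actually materialise on Linux is exactly the kind of thing third-party benchmarks will have to confirm.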

          Comment


          • #25
            My thoughts on big navi and initial Linux support:

            Comment


            • #26
              Originally posted by pal666 View Post
              at least on paper amd delivered faster, less power hungry and cheaper cards than novideo
              As usual, the classic flamer who doesn't get banned doesn't understand context.
              Even if you buy AMD, why would you buy the 6900 XT when the 6800 XT exists, with the same memory, a lot cheaper, and only slightly slower, and on top of that the 6800 XT's official benchmarking figures don't rely on any "smart memory" or "rage mode" OC. Why would you put $350 more into a card that can't do anything more? They are both 4K gaming cards without anything beyond that, and at 4K the difference between them will be small. Neither is capable of 8K due to VRAM, and they both have exactly the same technologies, etc. At least the 3090 has the VRAM for some 8K stuff and specific features like DLSS to make it possible. There is literally no point in buying a 6900 XT when the 6800 XT or RTX 3080 exist; you can just buy one of the great versions of those, maybe with water cooling or literally the best air-cooling model on the market, slightly OC the 6800 XT/RTX 3080, and be at almost the same performance tier without them being loud.
              Last edited by piotrj3; 28 October 2020, 04:48 PM.

              Comment


              • #27
                Originally posted by MadeUpName View Post
                But something I saw today was that it looks like it has hardware denoising. I can't imagine where you would get noise in a computer-generated image. But if this is for photo/video work I could be super excited.
                The output from ray tracing needs to be denoised: real-time ray tracing only shoots a handful of rays per pixel, so the raw image is a noisy Monte Carlo estimate of the lighting.
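
Purely as an illustration (toy numbers, nothing RDNA2-specific): a path tracer estimates each pixel's brightness from a handful of random light samples, and that estimate only settles down as the square root of the sample count, which is why low-sample real-time ray tracing leans on a denoiser.

```python
# Toy sketch: a pixel's brightness estimated from N random light samples.
# The true value here is 0.5; with few samples per pixel the estimate is
# noisy, and the noise only shrinks as 1/sqrt(N).
import random
import statistics

def shade_pixel(samples_per_pixel: int) -> float:
    samples = [random.random() for _ in range(samples_per_pixel)]
    return sum(samples) / samples_per_pixel

for spp in (1, 4, 16, 256):
    estimates = [shade_pixel(spp) for _ in range(10_000)]
    print(f"{spp:4d} spp -> pixel-to-pixel noise (stddev) ~ {statistics.stdev(estimates):.3f}")
```

So the hardware denoising mentioned above is about cleaning up that per-pixel sampling noise, not camera or sensor noise.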

                Comment


                • #28
                  Originally posted by piotrj3 View Post

                  As usual, the classic flamer who doesn't get banned doesn't understand context.
                  Even if you buy AMD, why would you buy the 6900 XT when the 6800 XT exists, with the same memory, a lot cheaper, and only slightly slower, and on top of that the 6800 XT's official benchmarking figures don't rely on any "smart memory" or "rage mode" OC. Why would you put $350 more into a card that can't do anything more? They are both 4K gaming cards without anything beyond that, and at 4K the difference between them will be small. Neither is capable of 8K due to VRAM, and they both have exactly the same technologies, etc. At least the 3090 has the VRAM for some 8K stuff and specific features like DLSS to make it possible. There is literally no point in buying a 6900 XT when the 6800 XT or RTX 3080 exist; you can just buy one of the great versions of those, maybe with water cooling or literally the best air-cooling model on the market, slightly OC the 6800 XT/RTX 3080, and be at almost the same performance tier without them being loud.
                  I am eyeing the 6900 for the extra compute units. I don't game, but for video editing or something like Blender it may be a real winner.

                  Comment


                  • #29
                    The lack of a DLSS competitor and (presumably) comparatively poor RT performance are a real shame. DLSS gives NVIDIA cards a really long tail in terms of future performance, and RT is only going to become more relevant as time goes on.

                    Comment


                    • #30
                      Originally posted by MadeUpName View Post

                      I am eyeing the 6900 for the extra compute units. I don't game, but for video editing or something like Blender it may be a real winner.
                      The difference between them is 8 compute units (72 vs 80) clocked the same way, but both cards also have exactly the same TDP. So in reality you get 72 slightly faster compute units vs 80 slightly slower compute units. Honestly, from a price standpoint it doesn't compute.
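
A rough back-of-the-envelope to put numbers on that (the clock speeds below are my own assumptions, not AMD's official figures): an RDNA 2 CU has 64 FP32 lanes doing 2 FLOPs per cycle, so a small clock deficit on the wider, power-limited chip eats most of its ~11% unit advantage.

```python
# Back-of-the-envelope sketch; the clock speeds are assumptions, not AMD specs.
# Peak FP32 = CUs * 64 lanes * 2 FLOPs/cycle * clock (GHz) / 1000 -> TFLOPS.
def peak_tflops(compute_units: int, clock_ghz: float) -> float:
    return compute_units * 64 * 2 * clock_ghz / 1000

print(f"72 CUs @ 2.25 GHz -> {peak_tflops(72, 2.25):.1f} TFLOPS")  # 6800 XT-like
print(f"80 CUs @ 2.15 GHz -> {peak_tflops(80, 2.15):.1f} TFLOPS")  # power-limited 6900 XT-like
print(f"CU advantage on paper: {80 / 72 - 1:.0%}")
```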

                      Also, for Blender, unless AMD pulls out something like OptiX, AMD is just not worth it in the GPU segment.

                      Comment
