AMD Radeon RX 6500 XT Linux Performance


  • #11
    Originally posted by sykobee View Post
    On the other hand, all the games benchmarked were playable, which suggests we need more stressful games in the benchmark suite - but finding scriptable, automatable, benchmarkable games is probably the big difficulty here.

    Optimising the game settings is necessary for this card, to ensure it can run well in the 4GB memory without needing too much PCIe bandwidth, especially on PCIe 3.0 boards. Benchmarks often don't do this, as they keep settings consistent for comparative purposes.

    It's such a shame that for the sake of a few mm^2 of silicon, PCIe bandwidth was trashed (for the design aim, a mobile GPU companion for PCIe 4.0 equipped CPUs/APUs with AV1 decode already, it's fine) and memory limited to 4GB (96-bit could have allowed 6GB), and AV1 decode was killed.
    There are more demanding games and settings available, but then it becomes an unplayable ~20 FPS comparison for the low-end cards.
    Michael Larabel
    https://www.michaellarabel.com/
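    As a rough illustration of why the x4 link hurts so much on older boards, here is a back-of-the-envelope bandwidth sketch (shell; the figures assume 128b/130b encoding as used by PCIe 3.0 and later, and ignore protocol overhead):

    ```shell
    # Approximate usable PCIe bandwidth in GB/s for a given per-lane
    # rate (GT/s) and lane count, assuming 128b/130b encoding.
    pcie_bw() { awk -v s="$1" -v w="$2" 'BEGIN { printf "%.1f", s * w * 128 / 130 / 8 }'; }

    echo "PCIe 3.0 x4: $(pcie_bw 8 4) GB/s"    # prints 3.9 - the 6500 XT on a PCIe 3.0 board
    echo "PCIe 4.0 x4: $(pcie_bw 16 4) GB/s"   # prints 7.9 - the link it was designed for
    ```

    Roughly half the bandwidth on a PCIe 3.0 board, and it matters most exactly when the 4GB of VRAM overflows into system memory.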



    • #12
      Originally posted by sykobee View Post
      Optimising the game settings is necessary for this card, to ensure it can run well in the 4GB memory without needing too much PCIe bandwidth, especially on PCIe 3.0 boards. Benchmarks often don't do this, as they keep settings consistent for comparative purposes.
      With Proton especially, one needs to be careful not to approach the GPU memory limit; otherwise, compared to native gaming on Windows, performance can be severely affected, since video memory management is apparently not as effective. This will likely be even more noticeable with the 6500 XT due to its limited PCIe bandwidth.

      Occupied GPU memory also includes that from background GUI applications, so it often helps to close them down to keep gaming performance at acceptable levels. Reviews typically do not test this (i.e. they test games in a reasonably minimal environment), while in real-world usage people generally leave at least some applications running in the background.
      Last edited by Solid State Brain; 31 January 2022, 09:32 AM. Reason: wording
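      One practical way to watch VRAM pressure from background applications is amdgpu's sysfs counters (a sketch; it assumes the AMD GPU is card0 and that the kernel exposes the standard `mem_info_*` files):

      ```shell
      # Report used vs. total VRAM from amdgpu's sysfs counters (bytes).
      # card0 is an assumption; substitute the correct card for your system.
      to_mib() { awk -v b="$1" 'BEGIN { printf "%d", b / 1048576 }'; }

      used=$(cat /sys/class/drm/card0/device/mem_info_vram_used 2>/dev/null || echo 0)
      total=$(cat /sys/class/drm/card0/device/mem_info_vram_total 2>/dev/null || echo 0)
      echo "VRAM: $(to_mib "$used") / $(to_mib "$total") MiB"
      ```

      Checking this before and after closing background GUI applications shows how much headroom they were eating.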



      • #13
        Originally posted by Quackdoc View Post
        I hate that I have to say this, but realistically speaking, going by my prices, my choice is between a new RX 6500 XT for 345 CAD or a used RX 580 for 350. I'm still going to buy the RX 6500 XT, for the warranty alone. So as much as I hate to say it, AMD knew exactly what was going to happen, and made the best card for the situation... well, for them.
        If I were in the same position, I'd probably go with the 580. Polaris support is way more mature than RDNA2, so you don't have to run the latest Mesa and kernels to have a smooth experience. I don't know what kind of warranty you get in Canada, but if it's just 1 year, I'd keep my fingers crossed and hope that AMD would see that we're not going to buy any piece of junk just because it happens not to be hopelessly out of stock.



        • #14
          Originally posted by MadCatX View Post

          If I were in the same position, I'd probably go with the 580. Polaris support is way more mature than RDNA2, so you don't have to run the latest Mesa and kernels to have a smooth experience. I don't know what kind of warranty you get in Canada, but if it's just 1 year, I'd keep my fingers crossed and hope that AMD would see that we're not going to buy any piece of junk just because it happens not to be hopelessly out of stock.
          Never mind, the one I was looking at just sold out, LOL. The next cheapest is 400 CAD, but generally the warranty is 3-5 years on the stuff I buy.



          • #15
            That card is a shame, and I don't think there is much to be gained with optimisations. Slower than its direct predecessor? Really? It should have been a 6400 at best.

            Most times when AMD releases a not-so-great product, they at least give it a good price so it has its market. I thought they restructured their marketing department? I hope this isn't the start of the next 10 years of AMD on the edge.

            Originally posted by Quackdoc View Post
            I hate that I have to say this, but realistically speaking, going by my prices, my choice is between a new RX 6500 XT for 345 CAD or a used RX 580 for 350. I'm still going to buy the RX 6500 XT, for the warranty alone. So as much as I hate to say it, AMD knew exactly what was going to happen, and made the best card for the situation... well, for them.
            Get the 580, or even a 590, used but with 8GB; it will last you longer than this 6500 XT. The only feature this new card has over the 5XX series is ray tracing, and realistically the card is much too slow to ever use its ray tracing in games.



            • #16
              Originally posted by Michael View Post

              Yes as shown on the system table.
              Ah, that's the "BAR1 / Visible vRAM Size: 4080 MB" line? OK!



              • #17
                In other news, Ars Technica has reviewed a fully open source (both software and hardware) ARM laptop:

                https://arstechnica.com/gadgets/2022...tter-or-worse/

                Michael

                Not sure if anyone's interested.



                • #18
                  Originally posted by ernstp View Post

                  Ah that's the BAR1 / Visible vRAM Size: 4080 MB line? Ok!
                  Yep, that is where it's reported for both NVIDIA and AMD.
                  Michael Larabel
                  https://www.michaellarabel.com/
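                  For anyone wanting to check the same value outside the system table, amdgpu also exposes the CPU-visible (BAR) VRAM size via sysfs (a sketch; card0 and the presence of `mem_info_vis_vram_total` are assumptions about your setup):

                  ```shell
                  # CPU-visible (BAR) VRAM as reported by amdgpu. Without resizable
                  # BAR this is typically ~256 MiB; with it, the full VRAM size.
                  vis=$(cat /sys/class/drm/card0/device/mem_info_vis_vram_total 2>/dev/null || echo 0)
                  echo "Visible VRAM: $(( vis / 1048576 )) MiB"
                  ```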



                  • #19
                    I'm not even sure why AMD bothered with this when it can't even beat the 5500 XT performance-wise. I'm sure it beats it in pricing, though.



                    • #20
                      So if my RX 580 dies, the 4- or 5-year-newer card is a downgrade? Wow. Just wow.

