AMD Radeon RX 5500 XT Linux Performance

This topic is closed.

  • #11
    Originally posted by Venemo View Post
    Michael, could you please give it a quick try with ACO? I'm very curious if ACO works well on the 5500 but don't have one myself for testing.
    Yes I have ACO results that will be posted later today or tomorrow.
    Michael Larabel
    http://www.michaellarabel.com/

    Comment


    • #12
      Originally posted by arQon View Post
      I'm not sure I agree with that. Cards in this price range are for titles like Overwatch / Rocket League / etc - they don't need huge amounts of VRAM at all. If websites are claiming nontrivial uplift on the 8G version, either something's Very Broken or they're talking about AAA titles running at Ultra settings. That's so far from what's appropriate for low-end cards like this as to be bordering on meaningless and artificial just for the sake of showing any difference at all. If the 8GB version gets 40fps but the 4GB version only gets 25fps because of all the texture thrash, it's still basically irrelevant anyway - neither one is actually suitable for that game at those settings in the first place.
      Kind of funny you should point this out - on a different site, I'm currently arguing with someone over how 4GB is doable if you just lower texture settings. You save a large chunk of VRAM for a near-negligible visual difference at 1080p. Most modern AAA games have ultra texture settings for the sake of 4K displays. So yeah, I don't really disagree with you there, but as many will tell you, the mid-range GPU market is over-saturated. If you want to play games like Overwatch or Rocket League on a budget, there are plenty of other options. I really don't think anybody would have a problem with a 4GB 5500 XT model not existing. But then, a 5500 with 8GB might be unnecessary too.
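      As a rough sanity check on the "ultra textures are there for 4K" point, here is a small back-of-the-envelope sketch (not from the thread; the numbers are purely illustrative, since real games use compressed formats like BC1/BC7 that are several times smaller, and stream textures in and out of VRAM):

```python
# Back-of-the-envelope estimate of how much VRAM a single
# uncompressed RGBA8 texture occupies at different resolutions.
# Real engines use block-compressed formats (roughly 4-8x smaller)
# and texture streaming, so treat these numbers as an upper bound.

def texture_vram_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM in MiB for one texture.

    A full mipmap chain adds roughly one third on top of the base level.
    """
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

if __name__ == "__main__":
    for name, (w, h) in {"1080p-class": (2048, 2048),
                         "4K-class": (4096, 4096)}.items():
        print(f"{name}: ~{texture_vram_mib(w, h):.0f} MiB per texture")
```

      Even with the caveats above, the quadratic jump from 2K-class to 4K-class assets illustrates why "ultra" texture packs are the first thing a 4GB card runs out of room for.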

      Comment


      • #13
        Michael Any chance of some compute (OpenCL/ROCm/HIP) benchmarks?
        Or is there no OpenCL/ROCm driver support for these cards at all yet?
        Last edited by TOMBOMBADIL; 12 December 2019, 01:14 PM.
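        For what it's worth, a quick way to check whether the ROCm kernel side (the amdkfd driver) is even loaded, before worrying about the userspace OpenCL/HIP stack, is to look for its device node. A minimal sketch (the /dev/kfd path is the standard ROCm compute interface on Linux, but this is only a presence check, not proof that the full stack works):

```python
# Presence check for the ROCm kernel compute interface.
# /dev/kfd is created by the amdkfd driver; without it, ROCm's
# OpenCL and HIP runtimes cannot talk to the GPU at all.
from pathlib import Path

def rocm_kfd_available():
    """Return True if the amdkfd device node exists."""
    return Path("/dev/kfd").exists()

if __name__ == "__main__":
    if rocm_kfd_available():
        print("amdkfd present: ROCm userspace may work")
    else:
        print("no /dev/kfd: ROCm kernel driver not loaded")
```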

        Comment


        • #14
          Originally posted by TOMBOMBADIL View Post
          Michael Any chance of some compute (OpenCL/ROCm/HIP) benchmarks?
          Or is there no OpenCL/ROCm driver support for these cards at all yet?
          No ROCm for Navi yet....
          Michael Larabel
          http://www.michaellarabel.com/

          Comment


          • #15
            The state of the AMD open-source driver is the major reason I bought an AMD iGPU (Raven Ridge) and a 5700 XT dGPU. The drivers for my Quadro T1000 are terrible...

            Comment


            • #16
              Nice review!
              It's a bit disappointing that they didn't use the latest HDMI version for this card. Why only HDMI 2.0b when HDMI 2.1 was released two years ago?
              Another disappointment is the lack of AV1 hardware decode acceleration; at least we get VP9 hardware decoding, if it ever works on Linux.
              I hope to see future benchmarks of how well it handles games like "GTA V" and others through WINE / DXVK / Proton.

              I'm interested in the MSI version of this card because MSI has a feature called "Zero RPM", where the fans stop completely as long as the temperature is not too high, which is very good for the much-wanted silence.
              Now I'm wondering whether this feature works on Linux.
              On Windows I can see it enabled in the AMD control panel, but on Linux, without a control panel, I have no idea whether it works or not.
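              One way to answer the Zero RPM question on Linux is to read the fan speed that amdgpu exposes through the hwmon interface in sysfs: if fan1_input reports 0 RPM while the card is idle, the fans really are stopped. A minimal sketch (fan1_input and the card*/device/hwmon layout are the standard kernel hwmon conventions, but exact paths vary per system, so treat this as an assumption):

```python
# Read the GPU fan speed from the amdgpu hwmon sysfs interface.
# A reading of 0 RPM at idle means the card's zero-fan mode is active.
from pathlib import Path

def read_fan_rpm(hwmon_dir):
    """Return fan1_input as an int (RPM), or None if not exposed."""
    fan_file = Path(hwmon_dir) / "fan1_input"
    if not fan_file.exists():
        return None
    return int(fan_file.read_text().strip())

def find_amdgpu_hwmon(drm_root="/sys/class/drm"):
    """Yield hwmon directories belonging to amdgpu cards."""
    for hwmon in Path(drm_root).glob("card*/device/hwmon/hwmon*"):
        name = hwmon / "name"
        if name.exists() and name.read_text().strip() == "amdgpu":
            yield hwmon

if __name__ == "__main__":
    for hwmon in find_amdgpu_hwmon():
        rpm = read_fan_rpm(hwmon)
        status = "no fan sensor" if rpm is None else f"{rpm} RPM"
        print(f"{hwmon}: {status}")
```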

              Comment


              • #17
                Originally posted by schmidtbag View Post
                Kind of funny you should point this out - on a different site, I'm currently arguing with someone over how 4GB is doable if you just lower texture settings. You save a large chunk of VRAM for a near-negligible visual difference at 1080p. Most modern AAA games have ultra texture settings for the sake of 4K displays. So yeah, I don't really disagree with you there, but as many will tell you, the mid-range GPU market is over-saturated. If you want to play games like Overwatch or Rocket League on a budget, there are plenty of other options. I really don't think anybody would have a problem with a 4GB 5500 XT model not existing. But then, a 5500 with 8GB might be unnecessary too.
                I personally put texture quality over lighting/shaders in any game, as, at least to my eyes, that's what really matters to make a game look good, even at 1080p. An example of this is Deus Ex: MD, where the game warns that my 4GB RX 570 falls short of the desired specification for high-quality textures, and that's a three-year-old game.

                And we buy things with the future in mind. New games will not get any lighter on memory, so if I were buying one of those cards, 20/30 dollars for double the memory would be a wise investment, as the next-gen consoles are rumored to have 16 to 24 GB of RAM.

                Comment


                • #18
                  Michael
                  Did you ever try this https://github.com/M-Bab/linux-kerne...naries?files=1 ? I've seen decent performance differences over the normal kernels with this one.

                  Cheers

                  Comment


                  • #19
                    Originally posted by [email protected] View Post
                    I personally put texture quality over lighting/shaders in any game, as, at least to my eyes, that's what really matters to make a game look good, even at 1080p. An example of this is Deus Ex: MD, where the game warns that my 4GB RX 570 falls short of the desired specification for high-quality textures, and that's a three-year-old game.
                    I agree with your priorities, but I'm not about to max out my texture settings if it means my game crashes every 10 minutes. I don't care if something looks better with ultra textures when I press my face up against a wall, because why would I bother doing that? Play a modern AAA title at 1080p in a sane manner (so, not pressing your face up against the wall) and the difference between high and max settings is negligible 99% of the time.
                    And we buy things with the future in mind. New games will not get any lighter on memory, so if I were buying one of those cards, 20/30 dollars for double the memory would be a wise investment, as the next-gen consoles are rumored to have 16 to 24 GB of RAM.
                    True, though isn't that shared memory? Keep in mind, next-gen consoles will be 4K-ready, so, they're going to have those ultra textures.

                    Comment


                    • #20
                      Is it possible to see some benchmarks for applications other than video games? I mean GIMP, Inkscape, Krita, Kdenlive, Natron, Blender, ImageMagick, Gmic, Darkroom, etc... Linux gamers are the 1% of the 1% of the global Linux base.

                      Comment
