An Update On The Radeon RX 590 For Linux


  • #31
    Originally posted by V10lator View Post
    I guess they test with AMD reference cards only? Just asking because multiple people have pointed out it might be an issue with cards from Sapphire. This conclusion was reached because Michael is testing a Sapphire card and there are at least two users of Sapphire RX 580 cards having hard issues with kernels >= 4.19 (though there doesn't seem to be a bug report).
    Well, at least one user with a Sapphire card said 4.19 works for him:

    https://www.phoronix.com/forums/foru...21#post1061821

    And for the others you never know: some might use old Mesa, some git Mesa, or whatever else differs. Since it works with Ryzen, maybe some Intel CPUs make something go wrong, who knows.
    Last edited by dungeon; 21 November 2018, 09:41 AM.



    • #32
      Originally posted by bridgman View Post
      There is also a PCIE subsystem regression in the kernel floating around that has been adding a lot of noise
      Which regression are you talking about? What commit ID?



      • #33
        Originally posted by dungeon View Post

        And for the others you never know: some might use old Mesa, some git Mesa, or whatever else differs. Since it works with Ryzen, maybe some Intel CPUs make something go wrong, who knows.
        Yes, this might be a special combination of non-reference cards with kernel 4.19 and some userspace components. I'm one of the affected users, so I'm able to rule out Intel CPUs (I'm using an AMD FX 8350). I'm using stable Mesa. I still haven't found time to test Mesa git and/or bisect the kernel, but I might do that in the next few days and open a bug report.

        //EDIT: bridgman I'm using PCIe Gen2, so I would also be interested in more information. For example: do you know whether this has been fixed in 4.20-rc3?
        Last edited by V10lator; 21 November 2018, 10:33 AM.



        • #34
          Originally posted by RussianNeuroMancer View Post
          Which regression are you talking about? What commit ID?
          Alex's patch references the following commit as the source of the regression...



          ... and the patch to fix is here:



          V10lator I don't believe the fix has gone upstream yet (didn't see it anyways).
          Test signature
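
For anyone wanting to try a fix that hasn't landed upstream yet, the usual route is to download the patch as an mbox and apply it with `git am`. A self-contained demo on a throwaway repository (the subject line and file name are made up; the real patch would come from the patchwork link above):

```shell
set -e
work=$(mktemp -d) && cd "$work"
git init -q tree && cd tree
git config user.email you@example.com
git config user.name you
echo base > driver.c && git add . && git commit -qm base

# Author a stand-in "fix" on a branch and export it in mbox format...
git checkout -qb fix
echo fixed > driver.c && git commit -qam "drm/amdgpu: hypothetical fix"
git format-patch -1 -o .. > /dev/null
git checkout -q -

# ...then apply it exactly as you would a patchwork download.
git am ../0001-*.patch
```

On a real kernel you would fetch the mbox from patchwork, `git am` it onto the matching branch, rebuild, and boot the result.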



          • #35
            Thanks for the links! Hopefully it will be merged soon.



            • #36
              I mentioned it in the other thread; I'm seeing this issue with an XFX card on kernel 4.18 on Fedora.



              • #37
                Originally posted by prazola View Post

                Are you serious? First of all, the RX 580 idles at 9 watts, NINE. Such a big improvement!
                Then, why should a GTX 1060 handle 220 watts when it can compete with Polaris while consuming 120 watts?
                https://www.techpowerup.com/reviews/...o_Plus/30.html
                So, if you don't play games the RX 590 is less efficient than the RX 580, and in gaming it is less efficient than a Vega 56. How can you call this card a win?
                A 12nm Hawaii would've been better than this useless refresh.
                Yes, I am very serious about the RX 580.
                I bought two cards for my company, and then they didn't work there with ROCm (PCIe atomics), so I had to buy the cards from the company myself, since they didn't serve its needs (reputation is something I want to maintain with my clients).

                And because of that, I own two RX 580 cards, not one!
                This subject has already been discussed here several times.
                I know the idle consumption of the RX 580 myself, and it's not 9 watts; for that you would need an RX 560, not an RX 580 (which idles around 32 watts in power-saving mode and in headless mode, which is what I tested).

                These numbers were already debated here a few weeks ago, and the lowest figures I saw from others are the same as on my two cards.

                It could be that, as the driver improves, we will see better power usage; until then these values are the real ones.
                I don't game on my cards, they are for GPU compute only, but from some tests that I saw briefly, the RX 590 seems to be more efficient.

                I compared against a GTX 1060 6GB because it's my card at work and I know what it is capable of, but above all because it's in the same price range.

                And a GTX 1060 6GB can't be pushed that hard; it would burn for sure!
                I used that card to compare one thing: the power envelope, which on the AMD card is stated at 200+ watts. But that is a thermal capability limit; you don't need to push the card to that limit, and below it the power draw is lower. On the RX 580 and RX 590 it would be better, so more efficiency.

                About games I don't know, but the GTX 1060 6GB is a nice card, very efficient, idling in headless mode at ~5-6 watts. Its thermal envelope is also a lot lower (even though it briefly peaks at around 230 watts).
                It can't handle the raw horsepower of an RX 580; that is why that card is rated at 120-130 watts (if it had been built as a workhorse, it would surely have power limits of 200+ watts, but that doesn't mean you need to use all of that capability).

                The drivers also play a big role here, and Nvidia is ahead: many years of development on Linux.
                AMD will surely improve as its new stack matures.
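
The idle-wattage numbers being argued over can be read straight from amdgpu's hwmon interface. A minimal sketch, assuming a single amdgpu card; the uw_to_w() helper is my own illustration (amdgpu reports power1_average in microwatts):

```shell
# Integer microwatts -> whole watts.
uw_to_w() { echo $(( $1 / 1000000 )); }

# Find the amdgpu hwmon node (if any) and print its average power draw.
for hw in /sys/class/hwmon/hwmon*; do
    [ -r "$hw/name" ] && [ "$(cat "$hw/name")" = amdgpu ] || continue
    echo "amdgpu draw: $(uw_to_w "$(cat "$hw/power1_average")") W"
done
```

Comparing this reading headless vs. with a display attached, and across power-profile settings, is the quickest way to see whether two setups are really measuring the same idle state.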



                • #38
                  Originally posted by bridgman View Post

                  Alex's patch references the following commit as the source of the regression...

                  https://git.kernel.org/pub/scm/linux...5c40ea6c422c75

                  ... and the patch to fix is here:



                  V10lator I don't believe the fix has gone upstream yet (didn't see it anyways).
                  Hello all,

                  Sapphire Nitro+ Aktiv RX580, 8 GB, here since the beginning (May 2017)

                  01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Ellesmere [Radeon RX 470/480/570/570X/580/580X] (rev e7) (prog-if 00 [VGA controller])
                  Subsystem: Sapphire Technology Limited Nitro+ Radeon RX 580 8GB

                  Extended renderer info (GLX_MESA_query_renderer):
                  Vendor: X.Org (0x1002)
                  Device: Radeon RX 580 Series (POLARIS10, DRM 3.27.0, 4.20.0-rc1-1.g7262353-default+, LLVM 8.0.0) (0x67df)
                  Version: 19.0.0
                  Accelerated: yes
                  Video memory: 8192MB
                  Unified memory: no
                  Preferred profile: core (0x1)
                  Max core profile version: 4.5
                  Max compat profile version: 4.5

                  _All_ kernel and Mesa (git) versions have worked and are working fine without any gfx card firmware updates (are there any around from Sapphire, apart from the linux-firmware git repo?).

                  As you can see, I'm currently running Mesa git from 15 hours ago and amd-staging-drm-next (as most of the time) taken from 33 hours ago.

                  The card has been running in a PCIe 2.0 slot since then, too.
                  Both without and with the latest patch from Alex.

                  (The patchwork one above is integrated.)

                  But I do not use any S3/suspend-resume or similar on this workstation/render server (Fujitsu/Xeon)!

                  min power doesn't go below ~32 W (with a 1920x1080 head attached)
                  min temp is ~31°C (was ~28°C with older kernels)

                  amdgpu-pci-0100
                  Adapter: PCI adapter
                  vddgfx: +0.75 V
                  fan1: 898 RPM (min = 0 RPM, max = 3200 RPM)
                  temp1: +31.0°C (crit = +94.0°C, hyst = -273.1°C)
                  power1: 32.23 W (cap = 175.00 W)
                  Last edited by nuetzel; 21 November 2018, 09:58 PM.
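
Since the PCIe generation keeps coming up, here is a rough sketch of how to confirm what a card actually trained at: the kernel exposes current_link_speed per PCI device, and the gen_of() mapping is my own illustration, not a standard tool.

```shell
# Map the kernel's link-speed string ("5.0 GT/s PCIe", ...) to a PCIe gen.
gen_of() {
    case "$1" in
        2.5*) echo 1 ;;
        5*)   echo 2 ;;
        8*)   echo 3 ;;
        16*)  echo 4 ;;
        *)    echo unknown ;;
    esac
}

# Scan display-class (0x03xxxx) PCI devices and report their trained speed.
for dev in /sys/bus/pci/devices/*/; do
    case "$(cat "$dev/class" 2>/dev/null)" in 0x03*) ;; *) continue ;; esac
    spd=$(cat "$dev/current_link_speed" 2>/dev/null) || continue
    echo "$(basename "$dev"): PCIe Gen$(gen_of "$spd") ($spd)"
done
```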



                  • #39
                    Originally posted by dungeon View Post
                    Maybe coordination with Sapphire is needed too... for example, see there that they have 11 different RX 580 cards, with a range of features and clocks, etc...
                    I would be very upset with Gigabyte if my AM4 motherboard didn't work with my CPU because of a buggy BIOS or something else that was wrong. Yet for some reason it's not Gigabyte's fault if a Gigabyte GPU doesn't work, and it's not Sapphire's fault if a Sapphire GPU doesn't work; it's all AMD's fault - especially bridgman's fault. AMD's add-in board partners have very thin margins, there is that consideration, but still... When was the last time Gigabyte or MSI or ASUS contributed a line to the Linux kernel or Mesa or... well, anything? Just saying.
                    Last edited by xiando; 22 November 2018, 01:32 AM. Reason: typo



                    • #40
                      Originally posted by tuxd3v View Post

                      Yes, I am very serious about the RX 580.
                      I bought two cards for my company, and then they didn't work there with ROCm (PCIe atomics), so I had to buy the cards from the company myself, since they didn't serve its needs (reputation is something I want to maintain with my clients).

                      And because of that, I own two RX 580 cards, not one!
                      This subject has already been discussed here several times.
                      I know the idle consumption of the RX 580 myself, and it's not 9 watts; for that you would need an RX 560, not an RX 580 (which idles around 32 watts in power-saving mode and in headless mode, which is what I tested).

                      These numbers were already debated here a few weeks ago, and the lowest figures I saw from others are the same as on my two cards.

                      It could be that, as the driver improves, we will see better power usage; until then these values are the real ones.
                      I don't game on my cards, they are for GPU compute only, but from some tests that I saw briefly, the RX 590 seems to be more efficient.

                      I compared against a GTX 1060 6GB because it's my card at work and I know what it is capable of, but above all because it's in the same price range.

                      And a GTX 1060 6GB can't be pushed that hard; it would burn for sure!
                      I used that card to compare one thing: the power envelope, which on the AMD card is stated at 200+ watts. But that is a thermal capability limit; you don't need to push the card to that limit, and below it the power draw is lower. On the RX 580 and RX 590 it would be better, so more efficiency.

                      About games I don't know, but the GTX 1060 6GB is a nice card, very efficient, idling in headless mode at ~5-6 watts. Its thermal envelope is also a lot lower (even though it briefly peaks at around 230 watts).
                      It can't handle the raw horsepower of an RX 580; that is why that card is rated at 120-130 watts (if it had been built as a workhorse, it would surely have power limits of 200+ watts, but that doesn't mean you need to use all of that capability).

                      The drivers also play a big role here, and Nvidia is ahead: many years of development on Linux.
                      AMD will surely improve as its new stack matures.
                      So 13 watts is true even if you don't have the card, but the 9 watts every review reported is not true. OK.
                      "And a GTx 1060 6GB, you can't push it up, it will burn for sure!" - well, that is an embarrassing statement, to be honest.
                      A GTX 1060 can go from 1.6GHz to 2GHz without burning.
                      The data is out there, and if you want to say that every review and user is wrong, then don't be surprised if people won't believe you.
                      Also, I'm an AMD user as well, but I'm objective.
