The Latest Linux 5.2 + Mesa 19.2 Radeon Performance Against NVIDIA With Mid-Range GPUs


    Phoronix: The Latest Linux 5.2 + Mesa 19.2 Radeon Performance Against NVIDIA With Mid-Range GPUs

    With the Linux 5.2 kernel a few weeks out from its stable release and now being in the middle of the Mesa 19.2 development cycle for the RADV Vulkan and RadeonSI OpenGL drivers, here are some fresh results looking at the latest open-source AMD Radeon Linux graphics driver stack compared to the latest NVIDIA proprietary graphics driver. In this article the focus is on the mid-range (Polaris) line-up against the NVIDIA competition while similar tests on the high-end are currently being carried out.

    http://www.phoronix.com/vr.php?view=27991

  • #2
    Originally posted by phoronix
    Strange Brigade is another game running on Linux thanks to Valve's Steam Play and DXVK.
    No, Strange Brigade is a native Vulkan title. Doesn't use DXVK.

    • #3
      Those 360 graphs are so much easier to read data from than the old graphs.

      • #4
        I've been very happy with my RX 580 for 1080p gaming on a Plasma desktop with a lot of effects enabled. I've also undervolted it quite a bit, which improved thermals and, in turn, performance while lowering power usage.

        • #5
          Moving to open-source drivers was a great decision by AMD. Linux might be a small market, but they went from being almost a joke to the go-to choice for Linux users, even though they're not *really* beating Nvidia in performance.

          • #6
            Originally posted by xeekei View Post
            Moving to open-source drivers was a great decision by AMD. Linux might be a small market, but they went from being almost a joke to the go-to choice for Linux users, even though they're not *really* beating Nvidia in performance.
            If you've ever had a bad update with Catalyst or the Nvidia driver, what AMD has done for us is a godsend (unless one bought not-very-well-supported hardware).

            While performance and PPW matters, knowing my GPU is technically capable of working on multiple operating systems on multiple architectures with stock software matters as well.

            I should add that I switched from Nvidia to AMD in 2012/2013 after reading about AMDGPU and Linux finally getting an open-source graphics stack from a brand-name hardware vendor. I can vouch that they've had at least two GPU purchases directly related to that... my two.

            If one is going to deal with binary blobs, go with Nvidia. If one doesn't want to deal with binary blobs, AMD makes a great alternative. It's a compromise between performance and compatibility IMHO. Both companies make good products, but I prefer AMD due to their openness.

            • #7
              Gigabyte produces a rather weak 570; if it were a Sapphire Nitro+ the results would be much better.
              Gigabyte is good at motherboards, but at graphics cards they are mediocre: not the worst, but definitely not the best.
              Some years ago the ads on this page convinced me to buy Gigabyte; if they were advertising on Phoronix, then they should be good. Since then I've had three video cards and two motherboards from Gigabyte, and honestly, IMHO, on the GPU front they are rather weak.
              Last edited by Kayote; 06-21-2019, 07:59 PM.

              • #8
                Originally posted by skeevy420 View Post
                I've been very happy with my RX 580 for 1080p gaming on a Plasma desktop with a lot of effects enabled. It has also been undervolted quite a bit which has helped with thermals which increased performance while lowering power usage.
                Tell us more: what did you do, and how did you do it? Thanks!

                • #9
                  Originally posted by nuetzel View Post

                  Tell us more: what did you do, and how did you do it? Thanks!
                  I kept noticing odd stutters in Tomb Raider (2013) around three minutes in every time I played it. I opened up WattmanGTK while playing and saw that the GPU was hitting 85°C and dropping from state 7 down to state 3. From there I started lowering state 7 until it matched state 6, then lowered 6 and 7 until they matched 5, then 5 through 7 until they matched 4, and then 4 through 7 until they matched state 3. After that I lowered the power cap to 125 W, overclocked* my memory to 2000 MHz, and lowered its state 2 by 25 mV.

                  My theory is that if an RX card isn't pushing 2K+ graphics, it just doesn't need the high stock voltages it comes with. Since I'm only targeting 1080p60 on a GPU designed for 2K FreeSync, there was quite a bit of voltage headroom I just didn't need.

                  Then I created a systemd unit to load my WattmanGTK script on boot. Been running the below since March.

                  /usr/local/bin/Set_WattmanGTK_Settings.sh
                  Code:
                  #!/bin/bash
                  # Switch amdgpu from automatic DPM to manual control
                  echo "manual" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/power_dpm_force_performance_level
                  # Cap board power at 125 W (the value is in microwatts)
                  echo 125000000 > /sys/class/hwmon/hwmon3/power1_cap
                  # Core clock states: "s <state> <MHz> <mV>"
                  echo "s 0 300 750" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 1 600 769" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 2 918 912" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 3 1167 1075" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 4 1239 1075" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 5 1282 1075" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 6 1326 1075" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "s 7 1366 1075" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  # Memory clock states: "m <state> <MHz> <mV>"
                  echo "m 0 300 750" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "m 1 1000 800" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  echo "m 2 2000 875" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
                  # Commit the new tables
                  echo "c" > /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0/pp_od_clk_voltage
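                  (Editor's note, not from the original post: the `hwmon3` index in the script above is not guaranteed to be stable across boots or kernel versions, so a more robust script can resolve it at runtime from the card's device directory. The `find_hwmon` helper below is a hypothetical sketch of that idea.)

                  ```shell
                  #!/bin/sh
                  # Hypothetical helper: locate a GPU's hwmon directory at runtime
                  # instead of hardcoding an index like hwmon3, which can change
                  # between boots.
                  find_hwmon() {
                      # $1 = the GPU's sysfs device directory,
                      #      e.g. /sys/devices/pci0000:00/0000:00:03.0/0000:03:00.0
                      for d in "$1"/hwmon/hwmon*; do
                          [ -d "$d" ] && { printf '%s\n' "$d"; return 0; }
                      done
                      return 1
                  }

                  # On a real system (as root), the power-cap line would then become:
                  #   echo 125000000 > "$(find_hwmon /sys/class/drm/card0/device)/power1_cap"
                  ```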
                  /etc/systemd/system/wattmanGTK.service
                  Code:
                  [Unit]
                  Description=Apply wattmanGTK settings
                  
                  [Service]
                  Type=oneshot
                  ExecStart=/usr/local/bin/Set_WattmanGTK_Settings.sh
                  RemainAfterExit=yes
                  
                  [Install]
                  WantedBy=multi-user.target
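                  (Editor's note, not from the original post: assuming the unit file and script paths shown above, the unit would be activated with the standard systemd commands.)

                  ```shell
                  # Reload systemd so it picks up the new unit file, then enable it
                  # at boot and start it immediately (requires root).
                  systemctl daemon-reload
                  systemctl enable --now wattmanGTK.service
                  ```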
                  Since then I've modded the BIOS with Polaris Bios Editor and flashed it with a Linux version of ATIFLASH. Yes, it's 100% possible to mod a BIOS from Linux with a Windows tool, Wine, and a Linux binary of unknown origin. All I did with PBE was let it automagically tighten the memory timings and nothing else; everything else gets done via the systemd unit and script.

                  *Most 580s have the memory set to 2000 MHz by default; mine is a cheapo MSI Armor 4GB that came with 1750 MHz.

                  • #10
                    Vega 56 can now be had new for well below $300. I'd call that "mid-range" and quite competitive with the 1660 Ti and the like.
                    https://www.newegg.com/gigabyte-rade...82E16814932030
