Radeon RX 5600 XT With New vBIOS Offering Better Linux Performance Following Fix

  • Radeon RX 5600 XT With New vBIOS Offering Better Linux Performance Following Fix

    Phoronix: Radeon RX 5600 XT With New vBIOS Offering Better Linux Performance Following Fix

    Earlier this week AMD launched the Radeon RX 5600 XT and, as shown in our Linux launch-day review, it offers nice performance up against the GTX 1660 and RTX 2060 graphics cards on Linux in various OpenGL and Vulkan games. Complicating the launch was the last-minute video BIOS change intended to offer better performance; unfortunately, that led to an issue with the Linux driver and also confused the public, since the change came right at launch and some board vendors were already shipping the new vBIOS release while others were not. Fortunately, a Linux solution is forthcoming and in our tests it is working out and offering better performance.

    http://www.phoronix.com/vr.php?view=28814

  • #2
    Wow, that's actually a pretty substantial improvement. Basically went from "barely better than Vega 56" to "roughly as good as Vega 64" at a much lower wattage. Granted, the power consumption did spike a lot.



    • #3
      Originally posted by phoronix View Post
      Phoronix: Radeon RX 5600 XT With New vBIOS Offering Better Linux Performance Following Fix

      Earlier this week AMD launched the Radeon RX 5600 XT and, as shown in our Linux launch-day review, it offers nice performance up against the GTX 1660 and RTX 2060 graphics cards on Linux in various OpenGL and Vulkan games. Complicating the launch was the last-minute video BIOS change intended to offer better performance; unfortunately, that led to an issue with the Linux driver and also confused the public, since the change came right at launch and some board vendors were already shipping the new vBIOS release while others were not. Fortunately, a Linux solution is forthcoming and in our tests it is working out and offering better performance.

      http://www.phoronix.com/vr.php?view=28814
      Some notes/observations:
      • The results indicate that the 256-bit GDDR6 memory interface in the RX 5700 is unnecessarily wide and that the 192-bit memory interface in the RX 5600 XT is fully sufficient for most games and benchmarks. Had AMD used a 128-bit interface in the RX 5600, it might have been cheaper to manufacture and it wouldn't land in the same performance bin as the RX 5700, so performance-wise it would sit right in the middle between the RX 5500 and RX 5700; some rough bandwidth arithmetic follows below.
      • Strangely, the RX 5700 (which has the same GPU silicon as the RX 5600 XT) consumes less power (74 Watts) than the RX 5600 XT with the new BIOS (95 Watts), which most likely means the new BIOS configures the RX 5600 XT voltages higher than the RX 5700 voltages.
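      As a rough illustration of the bus-width point, here is the peak-bandwidth arithmetic in Python; the per-pin data rates (12 Gbps for the original RX 5600 XT vBIOS, 14 Gbps for the new vBIOS and for the RX 5700) are the commonly quoted figures, not taken from the article:

      # Peak GDDR6 bandwidth: bus width (bits) / 8 * per-pin data rate (Gbps) -> GB/s
      def gddr6_bandwidth(bus_width_bits, data_rate_gbps):
          return bus_width_bits / 8 * data_rate_gbps

      configs = {
          "RX 5600 XT, old vBIOS (192-bit @ 12 Gbps)": (192, 12.0),
          "RX 5600 XT, new vBIOS (192-bit @ 14 Gbps)": (192, 14.0),
          "RX 5700 (256-bit @ 14 Gbps)": (256, 14.0),
      }

      for name, (width, rate) in configs.items():
          print(f"{name}: {gddr6_bandwidth(width, rate):.0f} GB/s")
      # -> 288 GB/s, 336 GB/s and 448 GB/s respectively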



      • #4
        Originally posted by atomsymbol View Post
        • The results indicate that the 256-bit GDDR6 memory interface in the RX 5700 is unnecessarily wide and that the 192-bit memory interface in the RX 5600 XT is fully sufficient for most games and benchmarks.
        192-bit is maybe enough, but 6 GB is already quite tight. Some reviewers managed to max out the 5600 XT VRAM even at 1080p (so much for "ultimate 1080p gaming"). If 12 GB models were released, those would be very interesting.



        • #5
          Originally posted by chithanh View Post
          192-bit is maybe enough, but 6 GB is already quite tight. Some reviewers managed to max out the 5600 XT VRAM even at 1080p (so much for "ultimate 1080p gaming"). If 12 GB models were released, those would be very interesting.
          In my opinion, 12 GB is too much for gaming today; it isn't going to be required until the PS5 and Xbox Series X are released, which are expected to have at least 12 GB of GDDR6.

          Because you mentioned 6 GB (which I didn't expect): Do you mean that those reviewers reached the limits of VRAM bandwidth or VRAM capacity? 192-bit vs 256-bit is about bandwidth, not about capacity.



          • #6
            Typo:

            Originally posted by phoronix View Post
            Strange Brigade is another Steam Play titlle on Linux
            Originally posted by phoronix View Post
            and even allowd the graphics card
            Also, is it possible to see high-end NVIDIA numbers for the Tomb Raider bench? AMD is too fast there.
            Last edited by tildearrow; 01-24-2020, 04:18 PM.



            • #7
              Originally posted by atomsymbol View Post

              In my opinion, 12 GB is too much for gaming today; it isn't going to be required until the PS5 and Xbox Series X are released, which are expected to have at least 12 GB of GDDR6.

              Because you mentioned 6 GB (which I didn't expect): Do you mean that those reviewers reached the limits of VRAM bandwidth or VRAM capacity? 192-bit vs 256-bit is about bandwidth, not about capacity.
              4K gaming drastically raises the VRAM requirements. Certain games have even used as much as 9GB on my 1080ti @ 1440p.

              The new consoles will have 16GB of GDDR6, but it will be used as system RAM as well.

              AMD really needs to adjust pricing downwards, but they are stuck. GDDR6 pricing is going to climb throughout the year and AMD also has to pay for all of that R&D. What we are seeing is something unprecedented: a company in dire straits pulling itself up by its bootstraps and turning things around. The only other tech company I remember doing that is Apple.



              • #8
                Originally posted by betam4x View Post
                4K gaming drastically raises the VRAM requirements. Certain games have even used as much as 9GB on my 1080ti @ 1440p.
                Yes, but from a memory-bandwidth viewpoint the question is how big the working set is that the game typically uses within one second. Two main cases to consider are:
                • The game character is standing still (not changing the in-game viewpoint), so the same set of textures appears on screen every frame
                • The game character rotates its virtual body or head by 180 degrees horizontally (or 90 degrees vertically)
                Textures not in the working set can, by definition, be uploaded to the GPU over PCI Express and/or stored in GPU memory in compressed form without affecting performance.

                Just a note: 3840*2160*4 bytes ≈ 32 megabytes, which means that the display resolution as such increases memory consumption only by about N*32 MB, where N is a small integer reflecting how many 4K rendering buffers the game uses (at least a color buffer and a depth buffer).
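                A minimal sketch of that back-of-the-envelope math, assuming 4 bytes per pixel for each render target (the example buffer counts are illustrative, not from any particular game):

                WIDTH, HEIGHT, BYTES_PER_PIXEL = 3840, 2160, 4

                # One 4K buffer: 3840*2160*4 bytes, i.e. roughly 32 MB
                buffer_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / (1024 * 1024)

                # N render targets (e.g. color + depth, plus a few G-buffer attachments)
                for n in (2, 4, 8):
                    print(f"{n} buffers: {n * buffer_mb:.0f} MiB")   # ~63, ~127, ~253 MiB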

                Originally posted by betam4x View Post
                The new consoles will have 16GB of GDDR6, but it will be used as system RAM as well.
                I haven't encountered any reliable website disclosing how many GB of memory will be in the PS5 or the next Xbox.
                Last edited by atomsymbol; 01-24-2020, 06:48 PM. Reason: Fix grammar



                • #9
                  Originally posted by atomsymbol View Post

                  Yes, but from a memory-bandwidth viewpoint the question is how big the working set is that the game typically uses within one second. Two main cases to consider are:
                  • The game character is standing still (not changing the in-game viewpoint), so the same set of textures appears on screen every frame
                  • The game character rotates its virtual body or head by 180 degrees horizontally (or 90 degrees vertically)
                  Textures not in the working set can, by definition, be uploaded to the GPU over PCI Express and/or stored in GPU memory in compressed form without affecting performance.

                  Just a note: 3840*2160*4 bytes ≈ 32 megabytes, which means that the display resolution as such increases memory consumption only by about N*32 MB, where N is a small integer reflecting how many 4K rendering buffers the game uses (at least a color buffer and a depth buffer).



                  I haven't encountered any reliable website disclosing how many GB of memory will be in the PS5 or the next Xbox.
                  That's not quite how 3D graphics work. All textures used in a game must be uploaded into VRAM or you will get extremely poor performance. It doesn't matter whether the camera is stationary or not. PCIe latencies are currently too high for real-time texture streaming into VRAM, and don't even get me started on storage.

                  Then there is also the size of the textures themselves, the shaders and their output, geometry, etc.

                  EDIT: Also, per the DirectX 11 spec, textures can be up to 16384x16384 in size. Do the math and you'll see how quickly that VRAM can go.
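                  For reference, the "do the math" for a single maximum-size D3D11 texture, assuming an uncompressed 32-bit RGBA format (real games would usually use block compression, which is far smaller):

                  SIZE, BYTES_PER_TEXEL = 16384, 4

                  base_gib = SIZE * SIZE * BYTES_PER_TEXEL / 2**30   # exactly 1.0 GiB for the top mip level
                  with_mips_gib = base_gib * 4 / 3                   # a full mip chain adds roughly one third

                  print(f"top level: {base_gib:.2f} GiB, with mip chain: ~{with_mips_gib:.2f} GiB")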
                  Last edited by betam4x; 01-24-2020, 07:03 PM.



                  • #10
                    Originally posted by betam4x View Post
                    That's not quite how 3D graphics work. All textures used in a game must be uploaded into VRAM or you will get extremely poor performance. It doesn't matter whether the camera is stationary or not. PCIe latencies are currently too high for real-time texture streaming into VRAM, and don't even get me started on storage.

                    Then there is also the size of the textures themselves, the shaders and their output, geometry, etc.
                    If what you wrote were true, then open-world games without loading screens (such games do exist) would not exist.

                    Originally posted by betam4x View Post
                    EDIT: Also, per the DirectX 11 spec, textures can be up to 16384x16384 in size. Do the math and you'll see how quickly that VRAM can go.
                    By this logic, a program running in a 64-bit address space needs 2**64 = 18446744073709551616 bytes of RAM to complete its task.
                    Last edited by atomsymbol; 01-24-2020, 07:11 PM.

