
AMD Radeon RX 5600 XT Linux Gaming Performance


  • AMD Radeon RX 5600 XT Linux Gaming Performance

    Phoronix: AMD Radeon RX 5600 XT Linux Gaming Performance

    As announced back at CES, the Radeon RX 5600 XT is being launched as the newest Navi graphics card to fill the void between the original RX 5700 series and the budget RX 5500 XT. The Radeon RX 5600 XT graphics cards are beginning to ship today at a $279+ USD price point and offer great Linux support, but with one last-minute -- and hopefully very temporary -- caveat.

    http://www.phoronix.com/vr.php?view=28809

  • #2
    Seems to me it could've gone with a 6-pin power connector, but knowing Sapphire, they probably added the extra 2 pins for the sake of overclocking.



    • #3
      Oh, it seems to be more efficient than the nVidia equivalent, nice.



      • #4
        Originally posted by schmidtbag View Post
        Seems to me it could've gone with a 6-pin power connector, but knowing Sapphire, they probably added the extra 2 pins for the sake of overclocking.
        I'm not sure about that. TPU and other reviewers have measured the peak power consumption at ~176W (actually 202W under FurMark), which absolutely necessitates an 8-pin power connector, as a 6-pin is only good for 150W (of which 75W are delivered by the motherboard).

        Originally posted by xxmitsu View Post
        Oh, it seems to be more efficient than the nVidia equivalent, nice.
        Except the NVIDIA card features RTX/DLSS blocks and is produced on a much inferior node.
        Last edited by birdie; 01-21-2020, 10:36 AM.



        • #5
          Also, the new BIOS increases clocks and power consumption by ~10% for up to a 20% performance increase in Windows tests; I don't know how that will translate to Linux and perf/watt.
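          As a side note, the perf/watt arithmetic implied here is straightforward: efficiency scales as performance divided by power, so +20% performance for +10% power is still a net efficiency gain. A minimal sketch (the +20%/+10% figures are the Windows numbers quoted above, not Linux measurements):

```python
# Relative perf/watt change when performance and power both rise.
# Efficiency = perf / power, so the relative change is
# (1 + perf_gain) / (1 + power_gain) - 1.
def perf_per_watt_change(perf_gain, power_gain):
    """Fractional perf/watt change for fractional perf and power gains."""
    return (1 + perf_gain) / (1 + power_gain) - 1

# The new BIOS: up to +20% performance for ~+10% power.
change = perf_per_watt_change(0.20, 0.10)
print(f"{change:.1%}")  # prints 9.1%
```

So even in the worst framing, the new BIOS should improve efficiency by roughly 9% in those Windows tests, assuming the quoted gains hold.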



          • #6
            Originally posted by birdie View Post
            I'm not sure about that. TPU and other reviewers have measured the peak power consumption at ~176W (actually 202W under FurMark), which absolutely necessitates an 8-pin power connector, as a 6-pin is only good for 150W (of which 75W are delivered by the motherboard).
            Ah yes, in that case, the 8-pin is necessary.
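            For anyone following the connector math: per the commonly cited PCIe power-delivery figures (75W through the slot, 75W per 6-pin and 150W per 8-pin auxiliary connector), the budgets work out as in this small sketch:

```python
# Commonly cited PCIe power-delivery limits, in watts.
SLOT_W = 75                          # delivered through the motherboard slot
CONN_W = {"6-pin": 75, "8-pin": 150} # per auxiliary connector

def board_power_budget(connectors):
    """Total board power available from the slot plus auxiliary connectors."""
    return SLOT_W + sum(CONN_W[c] for c in connectors)

# A single 6-pin card tops out at 150W -- below the ~176W peak
# (202W FurMark) measured for the RX 5600 XT, hence the 8-pin.
print(board_power_budget(["6-pin"]))  # prints 150
print(board_power_budget(["8-pin"]))  # prints 225
```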



            • #7
              Originally posted by birdie View Post
              I'm not sure about that. TPU and other reviewers have measured the peak power consumption at ~176W (actually 202W under FurMark), which absolutely necessitates an 8-pin power connector, as a 6-pin is only good for 150W (of which 75W are delivered by the motherboard).

              Except the NVIDIA card features RTX/DLSS blocks and is produced on a much inferior node.
              So, more or less the performance/watt of the RTX 2060, but one year later and on a much lower 7nm node, as you said.

              The good news is that this puts pressure on NVIDIA to get off 12nm.

              Last edited by pinguinpc; 01-21-2020, 11:23 AM.



              • #8
                Originally posted by xxmitsu View Post
                Oh, it seems to be more efficient than the nVidia equivalent, nice.
                It's only like 3-4% more efficient, which is basically the same. Not even unexpected; the 5700 was slightly more efficient than the 2060 Super as well.



                • #9
                  Originally posted by birdie View Post
                  Except the NVIDIA cards features RTX/DLSS blocks and is produced using a much inferior node.
                  Ray Tracing on the 2060 is not really usable; I consider the card too weak to handle it.
                  DLSS to me seems more like a marketing gimmick than an actually useful feature.
                  Why should I care what node it is produced on? I don't think your average consumer cares either. If they can achieve lower power consumption than their competition, then great.

                  Full disclaimer: I'm an AMD fan and I would not consider an NV GPU for myself. This is however different when recommending to family or other people; then I try to consider all of the big three.



                  • #10
                    Originally posted by StandaSK View Post

                    Ray Tracing on the 2060 is not really usable; I consider the card too weak to handle it.
                    DLSS to me seems more like a marketing gimmick than an actually useful feature.
                    Why should I care what node it is produced on? I don't think your average consumer cares either. If they can achieve lower power consumption than their competition, then great.

                    Full disclaimer: I'm an AMD fan and I would not consider an NV GPU for myself. This is however different when recommending to family or other people; then I try to consider all of the big three.
                    I keep hearing that over and over, except when RTX/DLSS are properly implemented they work beautifully.

                    AMD fans are a curious kind of people. Whenever NVIDIA invents something, it's a "gimmick", it "slows down everything", it doesn't make picture quality better.

                    Meanwhile, NVIDIA invented and first implemented in HW:
                    • programmable shaders (now used by 99.9% of 3D games)
                    • tessellation (now used by the vast majority of triple-A titles)
                    • real-time ray tracing (to become common once the new Xbox and PS get released).
                    And a lot more. NVIDIA GameWorks effects (hair, grass, etc.) were also panned to no end until AMD released GPUs which could handle them properly, and nowadays no one really remembers the whole issue.
                    Last edited by birdie; 01-21-2020, 11:37 AM.
