AMD Ryzen 7 5700G Linux Performance


  • #31
    Your post is misleading.

    APUs only have 8 lanes for a GPU slot.

    Originally posted by domih View Post

    I ran the test suites pts/compilation, pts/compression, pts/cryptography, pts/java, pts/python, pts/productivity, pts/unigine, pts/video-encoding and pts/cassandra using the 3400G at 2666 MT/s, the 4750G at 2666, 3200 and 3600 MT/s, and the 5700G at 3200 and 3600 MT/s, all other things being equal (e.g. same mobo).

    What I saw:

    Yes, there is a significant improvement from the 3400G (Zen+) to the 4750G (Zen 2) using identical memory.

    Yes, there is a significant improvement from the 4750G (Zen 2) to the 5700G (Zen 3) using identical memory.

    Yes, there is an improvement from 2400/2666 to 3200 MT/s using an identical CPU.

    No, it is a diminishing-returns story above 3200 MT/s. The 3200 and 3600 results traded blows in both directions across all the test suites, mostly within the margin of error.

    CONCLUSION

    Very specifically memory-focused benchmarks can show a high return on the dollar, but with more application-level benchmarks the difference is within the margin of error once you reach the recommended memory speed, 3200 MT/s in this example.

    Go with 3200 MT/s and spend the additional $$$ a super-duper-bester-faster memory kit would cost on the CPU instead, unless you are super rich and want a trophy PC with trophy components to go with your trophy wife.

    In games this translates into a few more frames here and there, as shown here: https://www.youtube.com/watch?v=mi4eAWWWxGc

    IMHO these stories about high-end memory are borderline urban legends, and I'm sure vendors are happy to propagate them so that they can sell higher-priced kits.

    HTH,

    Disclosure: I'm not a gamer nor a graphics-oriented user. I like CPUs with an iGPU because I can use them even in mini-ITX boards that offer x8 + x8 bifurcation on the single x16 PCIe slot, then plug in a fast network card (10 GbE or higher) and a true hardware RAID card. Such a DIY SFF system is cheaper than the expensive QNAP, Synology or Terra Master turn-key appliances.

    Best.



    • #32
      Originally posted by domih View Post

      This kind of situation always makes me think about a great new feature the Phoronix Test Suite could implement:

      Add 3 columns to the Test Results graphs:
      - %: the difference expressed as a percentage. Personally, anything under 5% is not worth $$$.
      - Comparison: based on the estimated margin of error: Significant, Borderline or Non-Significant difference.
      - Who will notice: Anybody, Power User, Professional, Expert

      I'm not a gamer, so I do not know how one feels about frame rates. But for a beginner gamer, is there really a difference in user experience between 125 and 150 FPS? Or 60 and 90? I don't know. What I do know is that even for a 3 FPS difference, while we are already in the stratospheric range of 100 FPS or above, there are Volga-river-long online discussion threads about which hardware is the "best of the best" based on the result. Really?

      I'm not talking about Phoronix here, but how many times does one see online stuff like "hardware A" scores 2387 while "hardware B" scores 2490 and therefore B is so much better, when in fact the difference is 4.3% and an average user will probably never notice it.

      I don't even want to talk about charts where the origin is not zero in order to "highlight" the difference.

      Phoronix could show the way to the pseudo-science statisticians from a few major tech news sites (I'm not giving names) and restore some sanity (...)
      Yes, but it has diminishing returns, and it has to be paired with a good monitor with minimal input lag. 60 to 120 Hz offers a noticeable improvement in user experience and almost everyone can tell the difference. Even 75 Hz and 90 Hz offer an improved desktop experience: smoother mouse movement, window dragging, and the like.

      Higher rates like 144 Hz, 240 Hz, etc. get into territory where you have to be a pro gamer to notice and benefit from the speed. The downside is there's a good chance you'll be playing the game on low-to-medium settings at 2K/4K resolutions in order to hold those framerates steady... and that's with beefy hardware.

      But like I said, if you don't pair any of that with a low-input-lag monitor, with something like a 1-5 ms response time, the benefits of higher-framerate gaming range from less apparent to negligible. If your monitor and input can't keep up, then the increased framerate is next to useless. 60 Hz on a 1-5 ms monitor is arguably better than 120 Hz on a 15 ms one.

      For casual gamers and people who aren't too picky, 45-120 Hz FreeSync at up to 2K is a nice setup to aim for. Most of those monitors have decent enough response times. Not too expensive, attainable with 2017+ hardware, and you can still play games with higher quality settings. Unless you're into eSports and fast-paced twitchy games, you really don't need to go above 120.



      • #33
        Originally posted by domih View Post

        I ran the test suites pts/compilation, pts/compression, pts/cryptography, pts/java, pts/python, pts/productivity, pts/unigine, pts/video-encoding and pts/cassandra using the 3400G at 2666 MT/s, the 4750G at 2666, 3200 and 3600 MT/s, and the 5700G at 3200 and 3600 MT/s, all other things being equal (e.g. same mobo).

        What I saw:

        Yes, there is a significant improvement from the 3400G (Zen+) to the 4750G (Zen 2) using identical memory.

        Yes, there is a significant improvement from the 4750G (Zen 2) to the 5700G (Zen 3) using identical memory.

        Yes, there is an improvement from 2400/2666 to 3200 MT/s using an identical CPU.

        No, it is a diminishing-returns story above 3200 MT/s. The 3200 and 3600 results traded blows in both directions across all the test suites, mostly within the margin of error.

        CONCLUSION

        Very specifically memory-focused benchmarks can show a high return on the dollar, but with more application-level benchmarks the difference is within the margin of error once you reach the recommended memory speed, 3200 MT/s in this example.

        Go with 3200 MT/s and spend the additional $$$ a super-duper-bester-faster memory kit would cost on the CPU instead, unless you are super rich and want a trophy PC with trophy components to go with your trophy wife.

        In games this translates into a few more frames here and there, as shown here: https://www.youtube.com/watch?v=mi4eAWWWxGc

        IMHO these stories about high-end memory are borderline urban legends, and I'm sure vendors are happy to propagate them so that they can sell higher-priced kits.

        HTH,

        Disclosure: I'm not a gamer nor a graphics-oriented user. I like CPUs with an iGPU because I can use them even in mini-ITX boards that offer x8 + x8 bifurcation on the single x16 PCIe slot, then plug in a fast network card (10 GbE or higher) and a true hardware RAID card. Such a DIY SFF system is cheaper than the expensive QNAP, Synology or Terra Master turn-key appliances.

        Best.
        May I ask which mini-ITX board you chose? Most of them will not supply the full 16 lanes unless you disable the iGPU in the BIOS.

        Many boards will provide x8 lanes to the slot even though it is physically x16 when the iGPU is active. Therefore you are only getting an x4 + x4 bifurcation.

        Thanks.



        • #34
          A Ryzen 5 5600G review would be great as well, to see how it compares to this one. Also, please consider testing it with Ubuntu 20.04.3. It would be useful to know if the "old" LTS is capable of handling these new CPUs.



          • #35
            Originally posted by edwaleni View Post

            May I ask which mini-ITX board you chose? Most of them will not supply the full 16 lanes unless you disable the iGPU in the BIOS.

            Many boards will provide x8 lanes to the slot even though it is physically x16 when the iGPU is active. Therefore you are only getting an x4 + x4 bifurcation.

            Thanks.
            ASRock B550M-ITX/AC AM4

            The BIOS allows x16, x8 + x8 or x4 + x4 + x4 + x4.

            Using a bifurcation adapter from https://c-payne.com/collections/pcie...furcation-pcbs.

            Ubuntu 20.04.

            Checked with sudo lspci -vv -s xx:xx.x on the two slots; they report LnkSta: ..., Width x8 (ok).
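
            For anyone wanting to repeat the check, a minimal sketch (the 01:00.0 and 02:00.0 bus addresses below are placeholders - find yours with a plain lspci listing first):

                # hypothetical addresses of the two bifurcated slots - substitute your own
                sudo lspci -vv -s 01:00.0 | grep -E 'LnkCap|LnkSta'
                sudo lspci -vv -s 02:00.0 | grep -E 'LnkCap|LnkSta'
                # LnkCap is what the device can negotiate, LnkSta is the live link; with
                # x8 + x8 bifurcation both should report "Width x8", and the Speed field
                # gives the PCIe generation (8GT/s = PCIe 3.0, 16GT/s = PCIe 4.0).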
            Last edited by domih; 27 August 2021, 12:05 AM.



            • #36
              Originally posted by kneekoo View Post
              A Ryzen 5 5600G review would be great as well, to see how it compares to this one. Also, please consider testing it with Ubuntu 20.04.3. It would be useful to know if the "old" LTS is capable of handling these new CPUs.
              Yes, it does. As mentioned in the article: <<...The only caveat that I've encountered is the Zen 3 APU temperature monitoring not coming until Linux 5.15...>> The CPU temperature does not show in the MATE sensors applet and is not available via /sys/class/thermal/thermal_zone0/temp. Same here.

              Otherwise, everything else I tried worked OK so far. Using Ubuntu 20.04.2 LTS.
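
              In case it helps, a quick way to see whether the kernel exposes the sensor at all (k10temp is the AMD CPU temperature driver; this assumes the lm-sensors package is installed):

                  grep . /sys/class/hwmon/hwmon*/name   # look for a "k10temp" entry among the hwmon devices
                  sensors | grep -A3 k10temp            # prints the CPU temperature readings when the driver binds

              If k10temp does not show up, the running kernel does not support this APU's sensor yet, which matches the article's note about Linux 5.15.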
              Last edited by domih; 27 August 2021, 12:20 AM.



              • #37
                APUs only have 8 lanes for a GPU slot.
                This time the 5000 APUs do have the full set of PCIe lanes, but they are limited to PCIe 3, not 4.



                • #38
                  Originally posted by domih View Post
                  I'm not a gamer nor a graphics-oriented user.
                  I'm both, and I still agree with your conclusion - though, like most such users, I don't use the IGP in the first place.

                  Things do get funky past a certain point, as the Infinity Fabric (IF) clock ends up mattering more than the RAM timings, but the heart of what you say is true. What matters is that *slow* RAM is slow, but once you get past a certain point it's all basically a wash, other than - as you say - the e-peen of winning a 5-digit benchmark by 3 points.



                  • #39
                    Originally posted by domih View Post

                    Yes, it does. As mentioned in the article: <<...The only caveat that I've encountered is the Zen 3 APU temperature monitoring not coming until Linux 5.15...>> The CPU temperature does not show in the MATE sensors applet and is not available via /sys/class/thermal/thermal_zone0/temp. Same here.

                    Otherwise, everything else I tried worked OK so far. Using Ubuntu 20.04.2 LTS.
                    Thanks, I was wondering how the missing temperature monitoring behaves - whether it affects the whole package, the CPU or the iGPU. Great news, though! It means I can keep looking for parts for my new PC. Now a benchmark of the 5600G would help me decide which of the two CPUs to get. The graphics are the same in both, but whether it's worth the extra $100 for 2 more cores... who knows. 6 cores are already great for many things, so I'll wait for more info to come out. Hopefully the prices will also turn out in my favor.



                    • #40
                      Originally posted by skeevy420 View Post

                      Yes, but it has diminishing returns, and it has to be paired with a good monitor with minimal input lag. 60 to 120 Hz offers a noticeable improvement in user experience and almost everyone can tell the difference. Even 75 Hz and 90 Hz offer an improved desktop experience: smoother mouse movement, window dragging, and the like.

                      Higher rates like 144 Hz, 240 Hz, etc. get into territory where you have to be a pro gamer to notice and benefit from the speed. The downside is there's a good chance you'll be playing the game on low-to-medium settings at 2K/4K resolutions in order to hold those framerates steady... and that's with beefy hardware.

                      But like I said, if you don't pair any of that with a low-input-lag monitor, with something like a 1-5 ms response time, the benefits of higher-framerate gaming range from less apparent to negligible. If your monitor and input can't keep up, then the increased framerate is next to useless. 60 Hz on a 1-5 ms monitor is arguably better than 120 Hz on a 15 ms one.

                      For casual gamers and people who aren't too picky, 45-120 Hz FreeSync at up to 2K is a nice setup to aim for. Most of those monitors have decent enough response times. Not too expensive, attainable with 2017+ hardware, and you can still play games with higher quality settings. Unless you're into eSports and fast-paced twitchy games, you really don't need to go above 120.
                      There are different types of response times.

                      When it comes to refresh rate and FPS, the frame time is 1000 ms divided by the rate:
                      1000 ms / 60 = 16.67 ms
                      1000 ms / 75 = 13.33 ms
                      1000 ms / 120 = 8.33 ms
                      1000 ms / 144 = 6.94 ms
                      1000 ms / 165 = 6.06 ms
                      1000 ms / 240 = 4.17 ms

                      So yes, there are diminishing returns: going from 60 to 144 Hz cuts the frame time by almost 10 ms, while going from 144 to 240 Hz saves only about 2.8 ms.
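
                      The same arithmetic works for any rate; a tiny shell snippet to reproduce the numbers above (assuming a POSIX shell with awk available):

                          for hz in 60 75 120 144 165 240; do
                              # frame time in milliseconds = 1000 / refresh rate
                              awk -v hz="$hz" 'BEGIN { printf "%3d Hz -> %.2f ms per frame\n", hz, 1000/hz }'
                          done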

                      But I do not fully agree about the "low input lag" monitor.

                      There are 3 types of lag in the case of a monitor:
                      1st, from computing and the refresh rate - here, simply, the higher the FPS/refresh rate, the better.
                      2nd, the input lag of the monitor itself.
                      3rd, the response time of the pixels.

                      The 1st and 2nd are actually the most important, because they are pure lag with nothing visible happening yet. Pixel response time is more complicated because pixels change gradually over time. For example, when a pixel turns from white to black in 4 ms but gets from white to gray in only 2 ms along the way, you could say there is already a visible reaction, just not a finished one.

                      And here it is not simply that you won't see the difference - sure, the difference is less noticeable, but it still exists and it always pays off; it is just smaller. Personally, I use higher refresh rate monitors mostly because they actually give me fewer headaches. I often get headaches from 60 or even 50 Hz monitors (Lenovo has an idiotic system of dropping the refresh rate to 48/50 Hz to save power, and it causes headaches very quickly, whereas overclocking a display from 60 to 72 Hz gave me a measurable improvement on that front; nowadays I use one 165 Hz monitor and a second 144 Hz one on my desktop). Refresh rate is overall more important than monitor input lag, as the refresh rate is responsible both for the smoothness of animation and for response time, while input lag affects only response time.

                      And regarding FreeSync/G-Sync, it is important to know the supported refresh rate range: e.g. if a monitor is 240 Hz but has a FreeSync range of only 120-240 Hz, you are going to suffer at FPS below 120.

