AMD Radeon RX 6600 Linux Performance


  • #51
    Originally posted by bridgman View Post



    Throttling issues are not related to higher clocks but rather a mismatch between the heat a chip generates and the heat that the card's cooling solution can dissipate... or as yump mentioned a mismatch between the heat that the card's cooling solution is dissipating and the case airflow's ability to exhaust that heat outside the case.

    Some reviewers explicitly warm up the cards under test before starting benchmarks, others like Phoronix run the tests back to back (so the card stays hot) and run them multiple times to achieve a similar effect. If you scroll down to the bottom of the article below (which has good temp/clock vs time graphs) you'll see that the engine clock stays pretty constant during a ~20 minute run despite temperature and fan speed increasing, which suggests that the card's cooling solution is doing its job:

    https://www.guru3d.com/articles-page...-review,6.html
    Interesting, but that still raises the question: why advertise boost clocks at all, then?

    So if adequately cooled and not crammed into a small HTPC cube, any GPU should be able to maintain its maximum boost clock indefinitely?

    Comment


    • #52
      Stagflation hardware.

      Comment


      • #53
        Originally posted by Linuxxx View Post

        Interesting, but that still raises the question: why advertise boost clocks at all, then?
        I think it's because the board partners have to differentiate their products somehow, it's an easily accessible number, and there was a small period of time where "factory overclocked" cards could actually be faster by a decent margin.

        But realistically GPU buyers shouldn't be looking at anything other than performance benchmarks and thermal/acoustic tests. A GPU is a throughput machine, and clocks are an implementation detail.

        Originally posted by Linuxxx View Post
        So if adequately cooled and not crammed into a small HTPC cube, any GPU should be able to maintain its maximum boost clock indefinitely?
        No, it will settle in at some frequency less than that, after a couple of minutes or so. W1zzard over at TechPowerUp always provides good data on this in the later pages of his reviews.

        The boost clock is the highest clock that appears in the firmware's voltage/frequency table -- the highest frequency the DVFS governor can choose. Having a large range of frequencies available is useful because workloads differ. Some are able to keep every core in the GPU very busy and pull a ton of power (like FurMark and some OpenCL stress tests), and some are not, perhaps because they're limited by memory bandwidth, because they're poorly optimized, or just because of the nature of whatever they're calculating. If the GPU is running memory-bound code, it might be best to blitz through the math and get back to waiting on memory as soon as possible, by running at a frequency that would burn up the chip in seconds if you fed it a high-power workload.
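
        As an aside, on Linux the amdgpu driver exposes that frequency table through sysfs, so you can watch which level the governor picks at any moment. A minimal sketch of reading it, assuming an AMD card driven by amdgpu and exposed as card0 (the card index and the trailing "*" marker format are the assumptions here):

        #!/usr/bin/env python3
        """Print the amdgpu engine-clock (sclk) DVFS levels and mark the active one.

        Assumes an AMD GPU handled by the amdgpu kernel driver, exposed as card0.
        pp_dpm_sclk lists one frequency level per line, e.g. "1: 2044Mhz *",
        with the level currently chosen by the governor marked by a trailing '*'.
        """
        from pathlib import Path

        SCLK_TABLE = Path("/sys/class/drm/card0/device/pp_dpm_sclk")  # assumed card index

        def main() -> None:
            for line in SCLK_TABLE.read_text().splitlines():
                active = line.rstrip().endswith("*")
                print(("-> " if active else "   ") + line.strip())

        if __name__ == "__main__":
            main()

        Running that while a game or stress test is active should show the marker move between levels as the load changes.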

        Comment


        • #54
          Originally posted by Linuxxx View Post

          Interesting, but that still raises the question: why advertise boost clocks at all, then?

          So if adequately cooled and not crammed into a small HTPC cube, any GPU should be able to maintain its maximum boost clock indefinitely?
          AMD used to advertise "game clocks" for their cards - not sure if they still do - which was the clock they expected the card to be able to maintain while playing games.

          Last time I checked, they were being fairly conservative with it and most games actually ended up hitting higher speeds. It varies by game, though, because they all stress the cards in slightly different ways, so one might stabilize at 2.3 GHz and another might hit 2.55 GHz.
          Last edited by smitty3268; 16 October 2021, 12:19 AM.

          Comment


          • #55
            Originally posted by Linuxxx View Post

            Interesting, but that still raises the question: why advertise boost clocks at all, then?

            So if adequately cooled and not crammed into a small HTPC cube, any GPU should be able to maintain its maximum boost clock indefinitely?
            Yeah, this would have been my response as well. If they can keep constant boost clocks at all times, then why have boost clocks in the first place?

            I am not buying Bridgman's reply, sorry. Video games require relatively constant max power from the graphics card. Sure, there are variations in framerates and load, but they don't make that much of a difference. The way I see it, boost clocks are meant to cheat in benchmarks.

            Oh, and as for pre-warming the GPU or running games back to back, that argument doesn't cut it either. Modern AAA games need to load levels before their benchmarks can run, and during that time the GPU powers down and cools off again. Even if Michael runs them back to back, unless he has found a way to load the next game instantly while the previous one finishes its benchmark, there is still time for the GPU to cool enough to sustain boost clocks.

            Comment


            • #56
              Originally posted by TemplarGR View Post
              I am not buying Bridgman's reply, sorry. Video games require relatively constant max power from the graphics card. Sure, there are variations in framerates and load, but they don't make that much of a difference. The way I see it, boost clocks are meant to cheat in benchmarks.

              Oh, and as for pre-warming the GPU or running games back to back, that argument doesn't cut it either. Modern AAA games need to load levels before their benchmarks can run, and during that time the GPU powers down and cools off again. Even if Michael runs them back to back, unless he has found a way to load the next game instantly while the previous one finishes its benchmark, there is still time for the GPU to cool enough to sustain boost clocks.
              When I look at the temperature / time graphs from a variety of reviews what I see is that the cooling solution heats up slowly and cools down slowly. The GPU doesn't heat up and start throttling in a couple of seconds, and when it is running at "as hot as it gets" temperature it doesn't cool down in seconds either.

              Did you get a chance to look at the Guru3D charts around the 19 minute mark where GPU activity drops to zero for a minute or two? The GPU temperature drops a bit because of the thermal resistance between hot spot and die/heatsink but not that much... it takes longer than that for the heat pipes and heatsink (and to a lesser extent the surrounding air) to cool down.

              I'm not sure I understand your comments about running games back to back not being sufficient - are you saying that when running benchmarks the loading time is so much greater than the run time that the chip never has a chance to heat up? I guess that is possible, but my impression was that typical benchmarks exercised the GPU for long enough to get pretty close to max temp, as long as the system was already warm from previous runs.

              Agree that running every benchmark with a cold system would probably not accurately reflect real results, but I don't think any of the reviewers are doing that if only because it would take too much time.
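
              For anyone who wants to check how their own card behaves, temperature and engine clock can be logged over a long benchmark run and plotted like those Guru3D graphs. A minimal sketch, assuming an amdgpu card exposed as card0 and the standard hwmon files temp1_input (millidegrees C) and freq1_input (engine clock in Hz):

              #!/usr/bin/env python3
              """Log GPU temperature and engine clock once per second as CSV.

              Assumes an AMD GPU driven by amdgpu, exposed as card0, with the
              usual hwmon files temp1_input (millidegrees C) and freq1_input
              (engine clock in Hz). Stop with Ctrl+C and plot the output.
              """
              import glob
              import time

              # First (assumed only) hwmon directory for card0.
              HWMON = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

              def read_int(path: str) -> int:
                  with open(path) as f:
                      return int(f.read().strip())

              print("seconds,temp_c,sclk_mhz")
              start = time.time()
              while True:
                  temp_c = read_int(f"{HWMON}/temp1_input") / 1000
                  sclk_mhz = read_int(f"{HWMON}/freq1_input") / 1_000_000
                  print(f"{time.time() - start:.0f},{temp_c:.1f},{sclk_mhz:.0f}")
                  time.sleep(1)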


              Comment


              • #57
                At highest settings, 1080p + SMAA, I am actually getting a 172 FPS average with an RTX 3070 in Shadow of the Tomb Raider.

                Comment


                • #58
                  This makes me happy that I bought a Vega 64 a few years ago.

                  Comment


                  • #59
                    Originally posted by Michael View Post

                    It's shown on the system table on the 2nd page
                    I had to explicitly right-click and "open image in a new tab" to load the table in a readable form; without doing that, I couldn't see that there was any more information than was summarised in the rest of the article. Maybe the system configuration table inset could be a clickable link that opens the SVG version of the table on its own.

                    It's a 6 GB version of the card, for anyone else who cares.

                    Comment


                    • #60
                      Originally posted by Raka555 View Post
                      This makes me happy that I bought a Vega 64 a few years ago.
                      Yeah, same. Decent performance and mature drivers.

                      Comment
