Intel Updates Alder Lake Tuning For GCC, Reaffirms No Official AVX-512


  • Intel Updates Alder Lake Tuning For GCC, Reaffirms No Official AVX-512

    Phoronix: Intel Updates Alder Lake Tuning For GCC, Reaffirms No Official AVX-512

    Posted last year and introduced in the GCC 11 stable compiler released earlier this year was the initial Alder Lake "alderlake" target. Now that Intel 12th Gen Core "Alder Lake" processors are officially out, Intel engineers have updated their Alder Lake tuning for the GNU Compiler Collection to yield more efficient performance with GCC 12, due out in Q2 2022...
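For anyone wanting to try the new tuning, the target is selected through GCC's usual -march/-mtune flags (the alderlake value landed in GCC 11). A minimal sketch that falls back to -march=native when the installed compiler is too old; the myapp.c file name is a placeholder:

```shell
# Use -march=alderlake when the installed GCC supports it (GCC 11+),
# otherwise fall back to -march=native. The probe just preprocesses an
# empty translation unit, which fails if the -march value is unknown.
if gcc -march=alderlake -E -x c /dev/null >/dev/null 2>&1; then
    MARCH=alderlake
else
    MARCH=native
fi
echo "building with -march=$MARCH"
# gcc -O2 -march="$MARCH" -o myapp myapp.c   # myapp.c is hypothetical
```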


  • #2
    Pretty cool, that was quick.

    Has anyone (Michael included) been using Alder Lake day to day? How's it going? Any noticeable "wow" moments or anything? Curious if it's a game-changer by any means like Apple's M1.



    • #3
      Unfortunately Intel "Alder Lake" is very power hungry and difficult to cool because it runs very hot.
      Also, while the Intel Xe graphics is very competent, Alder Lake uses only a few execution units, which makes its graphics performance much worse than AMD's Ryzen G-series.



      • #4
        I remember a few years back there were people in these forums and a few other tech discussion sites loudly proclaiming that Intel was crushing AMD because of Intel processor AVX-512 support. They said it didn't make sense for anyone to get AMD parts because they didn't have it, and AVX-512 was - in their minds - critical for lots of software. So were they fanboys, or paid shills? (Probably just fanboys, but it was mind-boggling to see people insist this feature was so important when it was brand new and nothing used it.)

        My daughters' bedroom is the coldest room in the house over winter, maybe I can get an Alder Lake part and run BOINC on it in there for six months out of the year.
        (I'm kidding. If I was going to do that, I'd go the cheap route and get two or three AMD Bulldozer boxes off Craigslist.)



        • #5
          Originally posted by Michael_S View Post
          My daughters' bedroom is the coldest room in the house over winter, maybe I can get an Alder Lake part and run BOINC on it in there for six months out of the year.
          A couple years ago, I read about a company that built crypto mining machines shaped like old style steam radiators. I think they would literally pay you to use it to heat your home.

          I used to joke that computers are heaters that do computation as a side-effect, but this made it all too literal!



          • #6
            Originally posted by uid313 View Post
            Unfortunately Intel "Alder Lake" is very power hungry and difficult to cool because it runs very hot.
            Not to make any excuses for Intel, but you don't have to cool it as much as it wants. If you use a less capable cooler, the tradeoff is just that the CPU fan spins up quicker and it throttles more readily.

            I'm contemplating an upgrade to Alder Lake. If I do, I'll probably tweak around with the scheduler and frequency settings to trade off a bit of boost for less fan noise and lower room temps in the summer. I'm sure it'll still be plenty fast and decent value for money.
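One way to make that trade on Linux is the standard cpufreq sysfs interface: lowering scaling_max_freq caps boost clocks without a BIOS trip. A read-only sketch (the 3.8 GHz figure in the comment is an arbitrary example, not a recommendation):

```shell
# Read the current max scaling frequency from the Linux cpufreq sysfs
# interface; the commented write shows how a cap would be applied (root).
p=/sys/devices/system/cpu/cpufreq/policy0
if [ -r "$p/scaling_max_freq" ]; then
    echo "policy0 max: $(cat "$p/scaling_max_freq") kHz"
else
    echo "cpufreq sysfs not available on this system"
fi
# echo 3800000 | sudo tee "$p/scaling_max_freq"   # e.g. cap boost at 3.8 GHz
status=checked
```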

            I can hear you thinking: "why don't you just get AMD, then?" But I've long had a primary and secondary Linux box that were Intel and AMD, respectively. I kind of like to keep one of each, especially if I get the chance to switch on/off AVX-512, should the occasion arise.
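Whether a given chip actually exposes AVX-512 is easy to check from /proc/cpuinfo on Linux; a count of zero just means no logical CPU advertises the foundation flag:

```shell
# Count logical CPUs advertising the AVX-512 foundation flag (avx512f).
# grep -c exits non-zero on no match, so || true keeps the script going.
n=$(grep -c avx512f /proc/cpuinfo 2>/dev/null || true)
echo "logical CPUs advertising avx512f: ${n:-0}"
```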



            • #7
              Originally posted by coder View Post
              I used to joke that computers are heaters that do computation as a side-effect, but this made it all too literal!
              I had an epiphany the other day when overclocking and benchmarking: just trying different voltages from 1.0V to 1.45V and seeing how the power usage, benchmarks, etc. were affected. I learned a lot.

              It's all about efficiency. Heat is just the by-product of the CPU crunching away (obviously). If you're unnecessarily pumping 1.35V at 4.3GHz, a lot of that energy is wasted as heat and you have to use a heatsink and fan to blow it all away.

              Or, take the opposite approach and settle for a 4.2GHz overclock at the lowest possible voltage, say 1.25V. Then, you barely produce any "wasted" heat, your power usage is low, less work for the heatsink/fan to do because the CPU is producing less unnecessary heat (but still running at the same frequency).

              I know it's nothing new or novel, but an important reminder. Go for lower voltage, not highest overclock with an unreasonably high voltage for 24/7 runtime. (Not to say you *can't* do it, but it's not the most efficient or smartest move for the majority of people).
              Last edited by perpetually high; 13 November 2021, 10:27 AM.
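The voltage intuition above can be sanity-checked with the textbook dynamic-power approximation P ≈ C·V²·f. This is a simplification that ignores static leakage, and the voltages and clocks are just the hypothetical figures from the post:

```python
# Back-of-the-envelope dynamic power model: P ~ C * V^2 * f.
# The capacitance constant C cancels when comparing two operating
# points, so the ratio alone shows how much heat the lower voltage saves.
def relative_power(voltage: float, freq_ghz: float) -> float:
    return voltage ** 2 * freq_ghz

high = relative_power(1.35, 4.3)   # aggressive overclock voltage
low = relative_power(1.25, 4.2)    # undervolted, nearly the same clock
savings = 1 - low / high
print(f"dynamic power saved: {savings:.1%}")  # roughly 16%
```

So dropping 0.1V while giving up only 100MHz cuts dynamic power by about a sixth, which is exactly the "less work for the heatsink" effect described above.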



              • #8
                Originally posted by uid313 View Post
                Unfortunately Intel "Alder Lake" is very power hungry and difficult to cool because it runs very hot.
                This can be trivially fixed by setting PL1/PL2 limits. It's power hungry only under massively parallel computational loads; in everyday tasks it's the most power-efficient high-performance x86-64 CPU out there.
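On Linux, those PL1/PL2 limits are exposed through the intel_rapl powercap sysfs interface, so capping them needs no BIOS trip. A sketch that reads the current limits (paths follow the standard powercap layout; the package index can vary by system):

```shell
# Read the package PL1/PL2 power limits from the intel_rapl powercap
# sysfs interface. Values are in microwatts; requires the intel_rapl
# driver, so non-Intel or non-Linux systems fall through gracefully.
R=/sys/class/powercap/intel-rapl/intel-rapl:0
if [ -r "$R/constraint_0_power_limit_uw" ]; then
    echo "PL1: $(cat "$R/constraint_0_power_limit_uw") uW"
    echo "PL2: $(cat "$R/constraint_1_power_limit_uw") uW"
else
    echo "intel_rapl powercap interface not available"
fi
# Capping PL1 at 125 W (as root) would look like:
# echo 125000000 > "$R/constraint_0_power_limit_uw"
rapl_checked=yes
```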

                Originally posted by uid313 View Post
                Also, while the Intel Xe graphics is very competent, Alder Lake uses only a few execution units, which makes its graphics performance much worse than AMD's Ryzen G-series.
                Completely different classes of devices. The Ryzen G lineup is not meant to run with a discrete GPU, while ADL implies a discrete GPU.



                • #9
                  Originally posted by Michael_S View Post
                  I remember a few years back there were people in these forums and a few other tech discussion sites loudly proclaiming that Intel was crushing AMD because of Intel processor AVX-512 support. They said it didn't make sense for anyone to get AMD parts because they didn't have it, and AVX-512 was - in their minds - critical for lots of software. So were they fanboys, or paid shills? (Probably just fanboys, but it was mind-boggling to see people insist this feature was so important when it was brand new and nothing used it.)

                  My daughters' bedroom is the coldest room in the house over winter, maybe I can get an Alder Lake part and run BOINC on it in there for six months out of the year.
                  (I'm kidding. If I was going to do that, I'd go the cheap route and get two or three AMD Bulldozer boxes off Craigslist.)
                  Why has this discussion instantly turned into throwing shit, fanboyism and mockery? This news post is about GCC patches for ADL for Christ's sake. Vent your dissatisfaction with fanboys you met many years ago somewhere else please.



                  • #10
                    Originally posted by perpetually high View Post

                    I had an epiphany the other day when overclocking and benchmarking: just trying different voltages from 1.0V to 1.45V and seeing how the power usage, benchmarks, etc. were affected. I learned a lot.

                    It's all about efficiency. Heat is just the by-product of the CPU crunching away (obviously). If you're unnecessarily pumping 1.35V at 4.3GHz, a lot of that energy is wasted as heat and you have to use a heatsink and fan to blow it all away.

                    Or, take the opposite approach and settle for a 4.2GHz overclock at the lowest possible voltage, say 1.25V. Then, you barely produce any "wasted" heat, your power usage is low, less work for the heatsink/fan to do because the CPU is producing less unnecessary heat (but still running at the same frequency).

                    I know it's nothing new or novel, but an important reminder. Go for lower voltage, not highest overclock with an unreasonably high voltage for 24/7 runtime. (Not to say you *can't* do it, but it's not the most efficient or smartest move for the majority of people).
                    I would even add to it: Intel generally has much bigger overclocking margins, but also bigger undervolting margins. I was doing it on my laptop in the past, and the results were extremely impressive: I dropped CPU power draw by about 20% and at the same time performance rose a bit, because the CPU could turbo for longer thanks to the smaller power draw.

                    People are like "look at the 12900K, it draws 225 W!" Yes, you bought an enthusiast-grade CPU, so of course it runs at its limits, and that is peak power draw. But if you take the 12600K or 12700K, power-efficiency-wise they are on par with Zen 3 CPUs like the 5600X/5800X, and undervolting Intel CPUs is fairly trivial, while Ryzens generally undervolt badly and, because of how they work, you also lose performance while doing so.

                    Not to mention Intel takes a totally different approach with the K series versus non-K.

                    E.g., 11600K Cinebench multi-core: 4352 points
                    11600 multi-core: 4265 points

                    And what is the 11600's power draw after the first short-term turbo window finishes? 65 W.
                    Last edited by piotrj3; 10 November 2021, 10:18 AM.
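Taking the quoted figures at face value, a rough points-per-watt number for the 11600 at its 65 W sustained limit falls out directly. This is a back-of-the-envelope calculation from the post's own numbers, not a measured efficiency figure:

```python
# Sustained efficiency of the 11600 using the figures quoted above:
# Cinebench multi-core score divided by the post-turbo 65 W power draw.
score_11600 = 4265       # Cinebench multi-core points (quoted)
sustained_watts = 65     # PL1 power draw after short-term turbo (quoted)
ppw = score_11600 / sustained_watts
print(f"11600 sustained efficiency: {ppw:.1f} points/W")
```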

