Intel Updates Alder Lake Tuning For GCC, Reaffirms No Official AVX-512

  • #31
    Originally posted by piotrj3 View Post
    People are like, "Look at the 12900K, it draws 225 W!" Yes, you bought an enthusiast-grade CPU, so of course it runs at its limits,
    No, this is not normal. Don't gaslight us into believing that the only problem with this picture is the expectations of the PC-buying public. Intel's desktop CPUs didn't boost anywhere near this high until they went nuts with Comet Lake.

    This is called a slippery slope. Next, AMD is going to respond by juicing the hell out of their CPUs. As a result of this pissing match, consumers get louder PCs (or more expensive coolers), higher electricity & cooling bills, and billions of tons of carbon are going to get burnt for just a few % of peak performance. This is not a good state of affairs.



    • #32
      Originally posted by CtrlAltShift View Post
      Linus can finally praise Intel for stepping away from AVX-512. I remember him bashing Intel for prioritizing AVX-512 over core performance and core count. So, I believe he got what he wanted.
      Not exactly. It's still in the silicon. This is not a retreat by Intel -- it's merely a "strategic redeployment" (metaphorically speaking).



      • #33
        Originally posted by avem View Post


        At 160W it loses 17.6% of its performance, not 30%. Somewhere between 160W and 200W there's a sweet spot where the Intel CPU is faster and more efficient than the 5900X. Yes, Intel overclocked the 12900K to hell to rival the 5950X in lots of MT tasks, but at 125W it beats the 5950X hands down in the vast majority of low-threaded tasks.
        You only quoted Intel power/performance. Where are the Ryzen numbers?



        • #34
          Originally posted by uid313 View Post
          The AVX-512 is pretty useless for most people, it is only useful for very specific things. Personally, I don't care for AVX-512.
          Agreed. However, I'm not "most people", and I'm just saying I like having the option to use it, should the occasion arise.


          Originally posted by uid313 View Post
          ADL comes with an iGPU unless it is the F model. The F model requires a discrete GPU; the non-F model has an integrated GPU, but it is not as good as the Ryzen G series'. I wish ADL had one with a good integrated Xe GPU.
          Yeah, I know about all of that. I wouldn't mind a bigger iGPU. The one in Tiger Lake is good for like 2.5 TFLOPS.



          • #35
            Originally posted by avem View Post
            Have you actually read the entire tweet? There's no validation program for AVX-512. Meanwhile do you have actual proofs that an AVX-512 implementation in ADL computes incorrectly? No? Then WTF are you talking about?
            The point in the tweet was that Intel isn't validating the AVX-512 silicon in ADL CPUs as they come off the assembly line. So, you could get a CPU with defects in some of its AVX-512 units and not know unless/until you discover your calculations have errors.

            I think Ian's point was that someone should write a test program that runs through all the instructions & a bunch of corner cases, in order to validate that the AVX-512 in your CPU is defect-free, if you're going to use it for anything important.

            Originally posted by avem View Post
            WTF is wrong with people? Can you start discussing GCC patches for ADL and stop shitting on Intel and its new CPU lineup? Don't you have better things to do in life?
            As already pointed out, the article concerns the presence of AVX-512 & its support status in ADL as well. It's right in the headline. This post was entirely on-topic.

            Even the stuff about power-efficiency isn't too bad, for a Phoronix forums discussion.



            • #36
              Originally posted by avem View Post
              Show me any official marketing materials prior to the ADL release where Intel promised or advertised AVX-512. Speaking frankly I don't understand what the heck you're talking about. Some people had some expectations based on some rumors and now those expectations aren't even that far off since you still can technically use AVX-512.
              It's basically unprecedented for them to remove major ISA extensions right after introducing them. After Rocket Lake shipped with AVX-512, Intel had introduced it into every market segment except chromebooks (the tier of CPUs with only E-cores). So, I'm not saying they broke any sort of promise, but they certainly surprised a lot of folks with this move.

              Also, whether you can even enable AVX-512 is motherboard-dependent. It was basically an easter egg that ASUS slipped into their BIOS. Depending on your motherboard, you might be SoL. And if you got a CPU with a defect in one of its AVX-512 blocks, you're also SoL. So, you really can't spin it as "Intel came through anyhow". They didn't.



              • #37
                Originally posted by coder View Post
                This is not accurate. I see you quoted from the same article, but maybe you missed the core-scaling chart:


                You really can't call burning 78 W (+71 W above idle) for a single-threaded task particularly efficient, much less "the most power-efficient". I mean, maybe you're joking but I'm not detecting a hint of sarcasm.

                That's about what a Ryzen 5600X uses at peak all-core load, and you'd better believe that's going to deliver more aggregate performance.
                Cherry-picking benchmarks, I see. In gaming, ADL has been shown to be faster and to consume fewer watts than the Ryzen 5900X/5950X. Again, the ADL 12700K/12900K are not the best showing of the architecture; Intel OC'ed them to the absolute limit to rival CPUs with a lot more cores and threads. Has this ever happened before? I don't remember anything to this extent. The IPC increase is staggering.

                But yes, if you want to hear it from me: by default the 12700K/12900K are very power hungry, especially in MT tasks. OK, are you happy? Can we now drop it and stop discussing how horrible these CPUs are despite setting a few dozen records in various benchmarks?
                Last edited by avem; 11 November 2021, 05:51 AM.



                • #38
                  Originally posted by coder View Post
                  You only quoted Intel power/performance. Where are the Ryzen numbers?
                  Yeah, it was clear from 1000 miles away that you came here to praise Ryzen and find whatever faults you could in ADL. Thank you for confirming that.



                  • #39
                    Originally posted by coder View Post
                    It's basically unprecedented for them to remove major ISA extensions right after introducing them. After Rocket Lake shipped with AVX-512, Intel had introduced it into every market segment except chromebooks (the tier of CPUs with only E-cores). So, I'm not saying they broke any sort of promise, but they certainly surprised a lot of folks with this move.

                    Also, whether you can even enable AVX-512 is motherboard-dependent. It was basically an easter egg that ASUS slipped into their BIOS. Depending on your motherboard, you might be SoL. And if you got a CPU with a defect in one of its AVX-512 blocks, you're also SoL. So, you really can't spin it as "Intel came through anyhow". They didn't.
                    They have not formally introduced AVX-512 into their consumer lineup yet, aside from a few mobile parts. You're blaming them for something they haven't done.

                    People who absolutely need AVX-512 will buy the known motherboards that allow unlocking it - I see exactly zero issues with that. Again, you are carefully trying to manufacture an issue with ADL that affects all of 10 people in the world, who are savvy enough to buy the proper motherboards and choose their CPUs carefully. Meanwhile, 99.999% of people on Earth have no freaking clue what AVX-512 is.
                    Last edited by avem; 11 November 2021, 06:04 AM.



                    • #40
                      Originally posted by coder View Post
                      The point in the tweet was that Intel isn't validating the AVX-512 silicon in ADL CPUs as they come off the assembly line. So, you could get a CPU with defects in some of its AVX-512 units and not know unless/until you discover your calculations have errors.

                      I think Ian's point was that someone should write a test program that runs through all the instructions & a bunch of corner cases, in order to validate that the AVX-512 in your CPU is defect-free, if you're going to use it for anything important.


                      As already pointed out, the article concerns the presence of AVX-512 & its support status in ADL as well. It's right in the headline. This post was entirely on-topic.

                      Even the stuff about power-efficiency isn't too bad, for a Phoronix forums discussion.
                      Again, the feature is not advertised, and zero reviewers even mention it as a competitive advantage in their conclusions, so I'm not sure why you continue to argue, or with whom exactly. AVX-512 is there for the 20 people in the world who will use it and report whether it works and computes correctly. You started off all but implying that AVX-512 is an outright broken feature and blaming Intel for it; now you're softening the tone a tiny bit to "you could get a CPU with defects".

                      People often buy CPUs/GPUs which are DoA despite having passed QA/QC. So what?

