Intel-Powered Aurora Supercomputer Breaks The Exascale Barrier


  • #11
    Originally posted by 3lfk1ng View Post
    Not to mention that AMD first broke the Exascale barrier exactly 2 years ago.
    For comparison:
    Frontier - 22.7 MW power consumption
    Aurora - 38.7 MW power consumption
    Frontier - 9,472 CPUs
    Aurora - 21,248 CPUs
    Frontier - 37,888 GPUs
    Aurora - 63,744 GPUs
    Frontier - 1,206.00 PFlop/s Rmax (53.13 gigaflops/watt)
    Aurora - 1,012.00 PFlop/s Rmax (26.15 gigaflops/watt)
    Imagine consuming almost twice the power, with almost twice as many processors, and still getting less than half the performance per watt.
    Efficiency remains an issue Intel desperately needs to solve.
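
    For reference, the gigaflops-per-watt figures are just Rmax divided by power draw; a quick sketch of that conversion, using the numbers quoted above:

    ```python
    # Rmax in PFlop/s divided by power in MW comes out directly in
    # GFlop/s per watt, since 1e15 flop/s over 1e6 W = 1e9 flop/s per W.
    def gflops_per_watt(rmax_pflops: float, power_mw: float) -> float:
        return rmax_pflops / power_mw

    print(f"Frontier: {gflops_per_watt(1206.0, 22.7):.2f} GF/W")  # ~53.13
    print(f"Aurora:   {gflops_per_watt(1012.0, 38.7):.2f} GF/W")  # ~26.15
    ```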
    Intel clearly lost the chip war... the last time someone lost the chip war was in 1980, when the USSR lost it.



    • #12
      Originally posted by qarium View Post

      the last time someone lost the chip war was in 1980, when the USSR lost it.
      ???

      Rather, RISC CPUs like MIPS, Sun SPARC, DEC Alpha, and PowerPC lost to the Intel Pentium.

      And AMD lost the GPU battle to Nvidia.

      Intel clearly lost the chip war.
      You can look at it this way: this is Intel's first attempt at high-performance GPUs. The GPU war for Intel is just beginning. The results will be known in 10-15 years.

      For the time being, the winning company is Nvidia, which holds about 96% of the high-performance GPU market and earns roughly 20 times more profit from its data center division than AMD does, a gap that is reflected in their market capitalizations.



      • #13
        Originally posted by HEL88 View Post
        And AMD lost the GPU battle to Nvidia.
        If you rewind back to the RX 6950 XT vs. the RTX 3090, you'll see that Nvidia's lead isn't as unassailable as you seem to believe. Sure, RDNA3 underperformed and the RTX 4000 series performed either on par or better than expected, but RDNA4 is supposed to address what went wrong with RDNA3 and therefore might be more competitive.

        In terms of HPC, MI250X significantly outperformed the A100 (its contemporary) on fp64, although the H100 retook the lead there. Then, the MI300X took back the overall lead from the H100! AMD is definitely on the right track.

        Originally posted by HEL88 View Post
        You can look at it this way: this is Intel's first attempt at high-performance GPUs. The GPU war for Intel is just beginning. The results will be known in 10-15 years.
        The way I look at it is that Intel has been building server CPUs for decades and yet got trounced by AMD in recent years, especially on energy efficiency.

        Now, if you want to talk GPUs, Ponte Vecchio was insanely over-ambitious in how it used tiles and die-stacking. They really scored an own goal with that one. Being their first HPC GPU isn't a valid excuse, either, for a company with such deep expertise in building chips. Not to mention that Xeon Phi was designed to play in that same market segment.



        • #14
          Originally posted by sophisticles View Post
          Does anyone else see the irony of the Department of Energy "sponsoring", i.e. paying 500 million dollars for, a supercomputer that uses 38.7 MW?
          Probably they use it to calculate how much energy everyone is wasting, so they can tell Americans that they need to save power. And that Intel needs another 100 billion in tax money to improve its efficiency.
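
          To put that 38.7 MW in perspective, here is a rough back-of-the-envelope sketch of the annual power bill; the electricity rate is an assumption for illustration, not a figure from the thread:

          ```python
          # Rough annual energy cost of a 38.7 MW machine, assuming
          # (hypothetically) continuous full-power operation and a
          # notional $0.08/kWh industrial electricity rate.
          POWER_MW = 38.7
          RATE_USD_PER_KWH = 0.08      # assumption, not from the thread
          HOURS_PER_YEAR = 24 * 365

          annual_kwh = POWER_MW * 1000 * HOURS_PER_YEAR    # ~339 million kWh
          annual_cost_usd = annual_kwh * RATE_USD_PER_KWH  # ~$27 million
          print(f"~{annual_kwh / 1e6:,.0f} GWh/year, ~${annual_cost_usd / 1e6:.0f}M/year")
          ```

          At that notional rate, electricity alone comes to roughly $27M a year, small next to the 500-million-dollar sticker price, but hardly negligible.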



          • #15
            Originally posted by coder View Post

            Analysts Estimate Nvidia Owns 98% of the Data Center GPU Market | Extremetech

            AMD has been in the GPU market forever. And how much of the data center market do they have? How much does Nvidia have? AMD has been crushed in this battle.

            AMD's position in this segment is closer to Intel's, and Intel is only just entering this market.

            AMD and Intel have made similar money in the data center segment. Meanwhile, Nvidia earned 20 (!!!) times more in this segment.

            The biggest loser is AMD because, as I mentioned, this company has been in the GPU market forever, and its share is embarrassingly low.



            • #16
              Originally posted by HEL88 View Post
              Analysts Estimate Nvidia Owns 98% of the Data Center GPU Market | Extremetech

              AMD has been in the GPU market forever. And how much of the data center market do they have? How much does Nvidia have? AMD has been crushed in this battle.
              The same could be said for AMD in the CPU market, until the last couple generations. These things can change. It just so happens that it's harder for AMD to take and hold the lead on GPUs, because Nvidia is a much fiercer competitor than Intel's CPUs have generally been.

              Again, I point to the fact that AMD had recent wins as evidence that Nvidia isn't completely beyond their reach.

              Originally posted by HEL88 View Post
              Meanwhile, Nvidia earned 20 (!!!) times more in this segment.
              Sure, because it was in the right place, at the right time, when the AI market really went nuts. That doesn't mean their hardware is 20x better!

              Originally posted by HEL88 View Post
              The biggest loser is AMD because, as I mentioned, this company has been in the GPU market forever, and its share is embarrassingly low.
              They know that, but they were teetering on the brink of bankruptcy only about 7-8 years ago. It's really hard to make the necessary investments in those circumstances. They've also flip-flopped on strategy, which hasn't helped. I do think they're catching up, though.



              • #17
                Originally posted by coder View Post
                These things can change.
                Of course. But it's AMD fanboys who write:
                Intel clearly lost the chip war... the last time someone lost the chip war was in 1980, when the USSR lost it.
                This even though Intel still has higher sales in the laptop (80% of the market), desktop (76%), and server (also 76%) segments (the data comes from the Mercury Research report).

                And Intel has only just started to enter the GPU market, yet it already holds practically the same position in data center GPUs as AMD, which has been in this market from the very beginning.

                The only loser in data center GPUs is AMD, because its share is practically zero.

                That doesn't mean their hardware is 20x better!
                Sure. The DEC Alpha, for example, was even better than the Intel Pentium. You can say that over its grave.



                • #18
                  What's the benefit of these supercomputers? Showing off?



                  • #19
                    Originally posted by Blademasterz View Post
                    What's the benefit of these supercomputers? Showing off?
                    • inaccurate weather forecasts
                    • mass surveillance of innocent citizens
                    • cracking encryption
                    • fewer atomic bomb tests in your neighborhood
                    Plus they help predict the exact date of human extinction.



                    • #20
                      Originally posted by coder View Post
                      The same could be said for AMD in the CPU market, until the last couple generations. These things can change. It just so happens that it's harder for AMD to take and hold the lead on GPUs, because Nvidia is a much fiercer competitor than Intel's CPUs have generally been.
                      AMD could easily compete in the GPU market if they wanted to; they just don't want to.

                      Competing effectively in the GPU-powered HPC and AI markets would be in direct conflict with their obvious strategy of competing in those markets with more CPU cores.

                      AMD, under Lisa Su, decided that the path to riches and dominance was "more cores", and they have been doggedly pursuing that goal ever since.

                      Intel has not been able to compete on the "more cores" front and so decided to attack the GPU segment.

                      Years ago, NVIDIA wanted to compete in the x86 CPU market, but Intel managed to stop them by giving them 1.5 billion dollars.

                      NVIDIA took that money and created the GPGPU market with CUDA; 1.5 billion dollars buys you a lot of advancement.

                      Now that NVIDIA is going to enter the desktop and server CPU markets, eventually, maybe in a decade, what I think will happen is that NVIDIA will try to kill the discrete GPU market by producing hybrid CPU/GPU chips that blur the lines between what we have now. Such chips would be more lucrative thanks to simplified production pipelines, and more appealing to programmers due to simpler leveraging of GPU capabilities.

