Apple M1 Ultra With 20 CPU Cores, 64 Core GPU, 32 Core Neural Engine, Up To 128GB Memory


  • #31
    Originally posted by tildearrow View Post
    May be an incredibly powerful processor, but too bad it's only on the least Linux-friendly machines ever.
    As someone already linked before, somebody who knows what he's doing is working on it. He knows it's very doable to bring Linux to these devices; that's why he's working on it. Someone as experienced as him wouldn't waste energy on something unworkable, and he wouldn't start a project that's doomed to fail and would ruin his own reputation.

    Comment


    • #32
      Originally posted by Slartifartblast View Post
      All the credit goes to ARM for their core processor design and TSMC for their excellent 5nm process node. Lastly an honourable mention to Apple for adding a bit of tinsel on top.
      Wow, you couldn't be more wrong. ARM had nothing to do with the development of this microarchitecture; that was all Apple. ARM merely defined the spec at the very top layer.

      ARM *wishes* they could design a chip this fast. It kicks the crap out of every core ARM has ever come up with.

      Comment


      • #33
        Originally posted by tildearrow View Post
        May be an incredibly powerful processor, but too bad it's only on the least Linux-friendly machines ever.
        Yeah, right. /s

        Asahi Linux is nearing the alpha release of full Linux support on the M1, and this new chip will only require a few small devicetree changes.

        Comment


        • #34
          Originally posted by Developer12 View Post
          Asahi linux is nearing the alpha release of full linux support on the M1, and this new chip will only require a few small devicetree changes.
          Are the M1's CPU internals really the challenge, though, given that the ARM specs are widely known? Isn't the GPU portion of the M1 a bigger challenge?

          Without specifications, can anybody make M1 GPU/Vulkan/OpenGL/etc. drivers that rise any higher than Nouveau, given the same difficulties? For example, can we do re-clocking on the M1 GPU? To my knowledge, we can't do that with Nouveau on newer GPUs.

          I'm not being critical; I honestly don't know, so I'm curious about it. Perhaps the M1 GPU is worse in one respect, though: Nvidia at least publishes a binary Linux driver to achieve full performance. Does Apple publish a binary Linux GPU driver?
          Last edited by ezst036; 08 March 2022, 09:31 PM.

          Comment


          • #35
            Originally posted by jacob View Post
            For those who want an open system ... does this have any potential advantage over, say, a Threadripper?
            Advantages? Over Threadripper? Which has vendor-supported open-source drivers? Seems unlikely.

            Comment


            • #36
              Originally posted by Developer12 View Post
              Yeah, right. /s

              Asahi linux is nearing the alpha release of full linux support on the M1, and this new chip will only require a few small devicetree changes.
              Well, the vendor is, at best, Linux-agnostic, are they not? I mean, has Apple even lifted its little finger to help with the Linux support?

              I wouldn't be the slightest bit surprised if future machines were unbootable with anything besides a signed Apple OS image. Anyone involved in this project has to be aware that they're camping on Apple's lawn. They're potentially just a single hardware rev away from getting evicted.
              Last edited by coder; 08 March 2022, 09:43 PM.

              Comment


              • #37
                Damn, just damn, this is some insane hardware. If I could use it on any sensible operating system, I wouldn't be able to resist buying it.

                Apple is new to the hardware market, and they're shooting themselves in the foot by limiting their hardware to a borked operating system. macOS is possibly among the worst operating systems in history...

                I just have too many reasons not to use macOS. If only it worked on Linux, and at full performance too, I wouldn't be able to resist buying this shit.

                Comment


                • #38
                  I don't like an operating system, so it must be the worst operating system in history!

                  Comment


                  • #39
                    Originally posted by Markopolo View Post
                    The only disappointing thing for me is the relatively low core counts.
                    Let's remember that the single-thread performance is awesome, and most of what most people are still waiting for, when they're waiting on a CPU, benefits from fewer-but-faster threads. Multiprocessing is cool, and it's needed, but I'll take a two- or four-core system that churns a bazigaflop over one that needs twelve cores to do the same.
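
                    To make that concrete, here's a back-of-the-envelope Amdahl's law sketch in Python. The 80% parallel fraction and the per-core speeds are made-up, hypothetical numbers chosen only to illustrate the shape of the argument, not measurements of any real chip:

                    Code:
                    # Amdahl's law with made-up numbers: fewer-but-faster cores can match
                    # many slower ones whenever part of the work stays serial.

                    def amdahl_speedup(parallel_fraction, cores):
                        """Ideal speedup of a workload whose parallel part scales across all cores."""
                        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

                    # Hypothetical workload that is 80% parallelizable; per_core is relative
                    # single-thread speed (the 4-core part is assumed to be 1.5x faster per core).
                    for cores, per_core in [(4, 1.5), (12, 1.0)]:
                        overall = per_core * amdahl_speedup(0.8, cores)
                        print(f"{cores} cores at {per_core}x per-core speed -> {overall:.2f}x overall")

                    Both configurations land at about 3.75x overall in this toy example, which is the point: once part of the work is serial, per-core speed buys you as much as piling on more cores.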

                    Comment


                    • #40
                      I think Apple went very heavy on the marketing here. The RTX 3090 has ~940 GB/s of memory bandwidth all to itself (its VRAM is a separate thing from system RAM) and without a NUMA-like topology spanning several chips, and the RTX 3090 is still often handicapped by memory speed even in games (and in productivity workloads far more).

                      2nd. For Intel they picked an overclockable CPU that is designed to be overclocked, and to make a good overclockable chip you need relatively high wattage and leakage. This is why the 12900K is less energy efficient than top-of-the-line AMD CPUs, but if you compare Intel's milder designs, like the 12400 (non-K) vs. the AMD 5600, Intel easily wins in efficiency. It is very funny to read this graph when you know Intel recently posted a similar type of graph for laptop CPUs (which are not designed to be overclocked) and compared them favorably against Apple's laptop M1s.

                      3rd. What about ray tracing, Apple? What about AI denoising like OptiX or Intel's solution? What about the very poor ecosystem for GPU programming, with a very small number of libraries (compared to, say, CUDA)?

                      4th. The current GPU designs have been on the market for fairly long (more than 2 years, in fact). Comparing them to Nvidia's next 5nm generation might make Apple look like a fool very, very fast.

                      Comment
