Apple M1 Ultra With 20 CPU Cores, 64 Core GPU, 32 Core Neural Engine, Up To 128GB Memory


  • #81
    Regarding "mobile SoCs" I was referring to your mentioning of TSMC N7 based Apple chips used in phones and tablets (ultra low wattage) not the M1.

    I understand the importance of the ISA, the uarch, and the SoC architecture, but N5 was also a very important factor in enabling those. I had no intention to imply that it's all because of the node; obviously, that's not the case.

    All in all, interesting insights on your part, though personally I do not think that Intel/AMD will start the transition so soon. If they do, they had better have some pretty flawless full x86 emulation working.

    Anyway, I have no fundamental disagreement with your takes, but I'd say I'm not as hyped for the M1 (and ARM-based SoCs in general) as other IT people. The fact that (AFAIK) the M1's ARM ISA has no SVE/SVE2 extensions makes it feel somewhat "morally dated" to me. Also, the lack of high-performance yet affordable ARM-based desktops, and especially laptops, in the Windows/Linux ecosystem partially kills the enthusiasm for me. But I'm not "ideologically" against this ISA; I'm more of a raw perf-per-dollar Linux hardware guy, and it seems x86 is going to remain the main option by that criterion for some time.
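    To show what I mean about SVE: its big draw is vector-length-agnostic code, where one binary runs on any vector width without recompiling. Here's a minimal SAXPY sketch of mine using the standard ACLE SVE intrinsics (just an illustration, nothing Apple-specific; assumes a compiler flag like -march=armv8-a+sve):

    // Vector-length-agnostic SAXPY: y[i] += a * x[i]. The predicate masks
    // the loop tail, so no scalar cleanup loop is needed, and the same
    // binary runs on 128-bit through 2048-bit SVE hardware.
    #include <arm_sve.h>
    #include <cstdint>

    void saxpy(float a, const float *x, float *y, int64_t n) {
        for (int64_t i = 0; i < n; i += svcntw()) {   // svcntw() = 32-bit lanes per vector
            svbool_t pg = svwhilelt_b32(i, n);        // predicate off lanes past n
            svfloat32_t xv = svld1(pg, &x[i]);
            svfloat32_t yv = svld1(pg, &y[i]);
            svst1(pg, &y[i], svmla_x(pg, yv, xv, a)); // yv + xv * a, predicated
        }
    }

    NEON, by contrast, hard-codes 128-bit vectors and needs a separate tail loop, which is part of why SVE's absence feels like a step behind.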
    Last edited by drakonas777; 10 March 2022, 06:12 PM.



    • #82
      Originally posted by drakonas777 View Post
      personally I do not think that Intel/AMD will start the transition so soon. If they do, they had better have some pretty flawless full x86 emulation working
      They can overlap product lines. They had a false start on this front several years ago, with the ARM-based Opteron A1100.


      That used 8x Cortex-A57 cores and was socket-compatible with their x86 Opteron server CPUs. Jim Keller even headed up the design of a full-custom ARM core, the K12, before his 2015 departure. The K12 got canned, probably because AMD realized they had gotten ahead of the market and had too few resources to deliver both that CPU and Ryzen/EPYC.

      I guarantee that both AMD and Intel are already looking beyond x86, internally. Hard to say whether it'll be ARM or RISC-V, though. Before Nvidia tried to buy ARM, it would've been a no-brainer: ARM has all the momentum in the server market, as well as in mobile and in things like self-driving cars. However, Nvidia's move might've caused them to falter on that path.

      They won't announce anything until they sense the market is at a tipping point, because doing so would shake customer confidence in their commitment to x86. But, when the market turns decidedly against x86, don't think for a minute that either of them will go down with that ship.

      Originally posted by drakonas777 View Post
      I'd say I'm not as hyped for the M1 (and ARM-based SoCs in general) as other IT people.
      I think it's technically interesting. For me, it's as relevant as a research prototype. There's zero chance I'll ever buy one.

      On the plus side, maybe it spurs Intel, AMD, or others to embrace in-package memory sooner than they otherwise would've. Intel had already announced a version of Sapphire Rapids Xeon with HBM, but that's a high-end server processor and this isn't.



      • #83
        Originally posted by coder View Post

        I think it's technically interesting. For me, it's as relevant as a research prototype. There's zero chance I'll ever buy one.
        Personally, I am extremely tempted to buy one if it ends up getting really good Linux support; when it comes to laptops, there really is no competition yet. I guess the interesting and problematic bit will be whether open-source software actually ends up being written to take advantage of the accelerators on Linux; that, to me, will be the deciding factor.

        The other elephant in the room, which I think a lot of people are missing, is that Apple has shown that x86 emulation is viable and effective, which actually paves the way for a transition away from x86 when the time comes. Apple's solution, with dedicated hardware support for the most common/bottleneck parts of x86 and software emulation for everything that doesn't need to be so fast, is really impressive. The only issue M1 emulation appears to have is that it doesn't emulate x86 games well.

        The only sad part is that I had really hoped Apple would hop onto the Vulkan bandwagon; that would have made things so much easier. In terms of their hardware and pricing, the only unfortunate thing is that they have stuck to HDMI 2.0 even though their GPU could easily handle 2.1 speeds.



        • #84
          Originally posted by mdedetrich View Post

          Personally, I am extremely tempted to buy one if it ends up getting really good Linux support; when it comes to laptops, there really is no competition yet. I guess the interesting and problematic bit will be whether open-source software actually ends up being written to take advantage of the accelerators on Linux; that, to me, will be the deciding factor.

          The other elephant in the room, which I think a lot of people are missing, is that Apple has shown that x86 emulation is viable and effective, which actually paves the way for a transition away from x86 when the time comes. Apple's solution, with dedicated hardware support for the most common/bottleneck parts of x86 and software emulation for everything that doesn't need to be so fast, is really impressive. The only issue M1 emulation appears to have is that it doesn't emulate x86 games well.

          The only sad part is that I had really hoped Apple would hop onto the Vulkan bandwagon; that would have made things so much easier. In terms of their hardware and pricing, the only unfortunate thing is that they have stuck to HDMI 2.0 even though their GPU could easily handle 2.1 speeds.
          1. Unfortunately, Linux is the worst OS if you want it to use the hardware acceleration for this and that. Unless you're an enthusiast, go with raw power.
          2. x86 emulation on the M1 is many times faster than on Windows on ARM or Linux/ARM because it 1) doesn't emulate the x86 memory model but actually runs it directly in hardware (no need to insert many memory barriers into the translated ARM code), and 2) compiles the executable ahead of time (AOT) the first time you run it (no on-the-fly translation, no profiling and re-compiling with more optimizations; it does, however, support on-the-fly translation for self-modifying programs). There's a sketch of why the memory model matters right after this list. What games don't work there? I've seen YouTube videos of people playing x86 AAA games in virtualized Windows on ARM.
          3. a) Vulkan was released a year after Metal, so it's not Apple's fault. They couldn't wait, or they would have lost their leading position to the competition.
            b) HDMI is there only for specific purposes. For the real thing, you use DisplayPort.
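          To illustrate point 2, here's a minimal message-passing litmus test in C++ (my own sketch, nothing Apple-specific). Under x86's TSO model, stores are never reordered with each other, so the reader below always prints 42; on a weakly ordered ARM core, the same plain loads and stores can legally print 0. A translator must therefore insert barriers around ordinary memory accesses, unless the hardware offers a TSO mode like the M1's:

          #include <atomic>
          #include <cstdio>
          #include <thread>

          // Relaxed atomics stand in for the plain loads/stores a binary
          // translator would emit for ordinary x86 MOV instructions.
          std::atomic<int> data{0};
          std::atomic<int> flag{0};

          void writer() {
              data.store(42, std::memory_order_relaxed);
              flag.store(1,  std::memory_order_relaxed);
          }

          void reader() {
              while (flag.load(std::memory_order_relaxed) == 0) { /* spin */ }
              // Always 42 under TSO; may print 0 under ARM's weak model.
              std::printf("data = %d\n", data.load(std::memory_order_relaxed));
          }

          int main() {
              std::thread t1(writer), t2(reader);
              t1.join();
              t2.join();
          }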



          • #85
            Originally posted by mdedetrich View Post
            when it comes to laptops, there really is no competition yet.
            Maybe Qualcomm/Nuvia will offer something competitive. Too bad Qualcomm seems to be chasing the premium market. If I get an ARM laptop, it'd probably be some unremarkable MediaTek-powered thing.

            Originally posted by mdedetrich View Post
            The other elephant in the room, which I think a lot of people are missing, is that Apple has shown that x86 emulation is viable and effective
            If I buy an ARM machine, I don't expect to be running x86 software on it. Anyway, I like JIT translation better: techniques like memory renaming should be more feasible, getting you closer to the performance of natively compiled code.
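            For instance, here's a toy model of a JIT translator's dispatch loop with a translation cache (all names are my own invention, not any real emulator's API). Because translation happens at runtime, hot blocks can later be re-optimized using observed behavior, e.g. renaming memory accesses into registers once actual aliasing is known, where an ahead-of-time pass has to stay conservative:

            #include <cstdint>
            #include <cstdio>
            #include <functional>
            #include <unordered_map>

            using GuestPC   = std::uint64_t;
            using HostBlock = std::function<GuestPC()>; // translated block; returns next guest PC
            constexpr GuestPC HALT = ~0ULL;

            // Stand-in for real translation: fakes two guest basic blocks.
            HostBlock translate_block(GuestPC pc) {
                std::printf("translating block at guest PC 0x%llx\n", (unsigned long long)pc);
                if (pc == 0x1000) return [] { return GuestPC{0x2000}; };
                return [] { return HALT; };
            }

            int main() {
                std::unordered_map<GuestPC, HostBlock> cache; // translation cache
                GuestPC pc = 0x1000;
                while (pc != HALT) {
                    auto it = cache.find(pc);
                    if (it == cache.end())                    // translate only on first use
                        it = cache.emplace(pc, translate_block(pc)).first;
                    pc = it->second();                        // run the translated code
                }
            }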

            Originally posted by mdedetrich View Post
            the only unfortunate thing is that have stuck to HDM 2.0 even though their GPU can easily handle 2.1 speeds.
            I'm with you on that. Obviously, they would rather people use Thunderbolt.



            • #86
              Originally posted by Ladis View Post
              3. a) Vulkan was released a year after Metal, so it's not Apple's fault. They couldn't wait, or they would have lost their leading position to the competition.
                b) HDMI is there only for specific purposes. For the real thing, you use DisplayPort.
              Sorry, but they have no good excuse not to support Vulkan. AMD and Intel both have Vulkan drivers that I'm sure they could easily port to macOS.

              As for HDMI, if you want to use your Mac with a TV, no TV that I've seen has a DisplayPort or Thunderbolt input. So, they shouldn't be trying to force Thunderbolt down people's throats.



              • #87
                Originally posted by coder View Post
                Sorry, but they have no good excuse not to support Vulkan. AMD and Intel both have Vulkan drivers that I'm sure they could easily port to macOS.

                As for HDMI, if you want to use your Mac with a TV, no TV that I've seen has a DisplayPort or Thunderbolt input. So, they shouldn't be trying to force Thunderbolt down people's throats.
                AMD and Intel don't make GPUs for Apple anymore. And I connect my phone (Android) and laptop (Windows) to the TV wirelessly (DLNA and WiDi/Miracast); Apple users probably do something similar.
                Last edited by Ladis; 11 March 2022, 02:19 AM.



                • #88
                  Originally posted by Slartifartblast View Post
                  All the credit goes to ARM for their core processor design and TSMC for their excellent 5nm process node. Lastly, an honourable mention to Apple for adding a bit of tinsel on top.
                  It's been fun winding up the Apple fanboys; see you at Apple Con next year

                  [attached image: Apple-Fanboy-Rimmer.jpg]



                  • #89
                    Originally posted by Ladis View Post
                    AMD and Intel don't make GPUs for Apple anymore.
                    You made a historical point, dating back to the introduction of Vulkan. At that point, they did.

                    Also, the x86 Mac Pro is still being sold and has yet to be replaced. And we don't know whether Apple will definitely eliminate AMD from its replacement, although it's a good bet.

                    Originally posted by Ladis View Post
                    And I connect my phone (Android) and laptop (Windows) to the TV wirelessly (DLNA and WiDi/Miracast); Apple users probably do something similar.
                    Some people use TVs as an inexpensive, large-format display. Or, for Apple's use case of video production, you'd want to use a real TV for mastering.



                    • #90
                      Originally posted by Slartifartblast View Post
                      It's been fun winding up the Apple fanboys; see you at Apple Con next year
                      What an age we live in, when someone would rather admit to being a troll than the ignoramus we all know him to be. You realize that makes you an ignorant asshole, right?

                      A lot of us aren't Apple fans. I've never owned an Apple product in my life, and I'm not about to start. For me, the best Mac laptop or desktop would be no more useful than a merely average Linux machine. And then there's everything I don't like about Apple, as a company, and its walled garden.

                      I just think what they're doing is interesting, as it proves what's possible. And I like to see credit given where it's due, even if through clenched teeth. Like them or loathe them, Apple has some of the world's best and best-resourced chip designers.

