Apple Announces The M4 Chip With Up To 10 CPU Cores


  • #21
    Originally posted by Jumbotron View Post
    Apple proves once again that it is the most sophisticated CPU company in the world surpassing Intel, AMD and Nvidia. No other company in the world is capable of producing a CPU with a desktop performant integrated GPU as well as the world’s most performant NPU on a consumer device all tied together with a zero copy heterogeneous memory protocol. All on the smallest node with the best compute power to power draw efficiency of any platform from the Big 3. The only other company even close to being capable of this feat is Qualcomm and that’s only because they bought Nuvia three years ago and Nuvia was founded by ex-Apple Silicon engineers and designers. And when I say the Big 3 are not capable of producing a chip as capable and on such a small node and with such power efficiency that it needs no fan I really mean that Intel and AMD and Nvidia are technically incapable of producing a chip like Apple Silicon’s M4.
    Why switch from Linux to macOS? It's like asking a fish to climb a tree when it's already the king of the sea. Linux is the **open-source utopia**, a playground for the brave, where you're only limited by your own ingenuity. It's the digital equivalent of a Swiss Army knife in a world of single-function gadgets.

    Enter the M4 chip, Apple's latest marvel, flexing its **10-core CPU** and **10-core GPU** muscles like it's preparing for the Silicon Olympics. It's got more cores than an apple orchard and a **Neural Engine** that's so quick, it could calculate the meaning of life before you finish this sentence. But let's not forget the **Dynamic Caching** and **hardware-accelerated ray tracing**—features that make gamers and creatives drool more than a teething baby.

    And then there's Asahi Linux, the daring project that's trying to put Linux on Apple silicon. It's like fitting square pegs into round holes, but somehow, they're making it work. Sure, it's still in the alpha stage, which in software terms means "expect the unexpected," but it's the spirit of adventure that counts.

    So, why stick with Linux when macOS beckons with its sleek interface and the M4's brute force? Because Linux isn't just an OS; it's a statement. It's for those who find beauty in a command line and freedom in a kernel. macOS might offer a smooth ride, but Linux lets you build the car... and the road... and anything else you fancy.

    In the end, choosing between Linux and macOS is like choosing between a DIY rocket kit and a first-class ticket to the moon. Both will get you to the stars, but only one lets you tinker with the thrusters.

    Comment


    • #22
      Originally posted by Volta View Post

      You don't say? macOS can't utilize the CPU's full power in anything, so try again with a better analogy. It's like buying a Ferrari and having an old grandpa as the driver. And this grandpa isn't Enzo.
      I'm doing compiles from a UTM Linux VM that can use all the performance cores, and it's definitely faster (though also much more expensive, true) than my old Ryzen - I couldn't believe it. Maybe it's subjective, but it certainly doesn't feel old in any way.
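      For what it's worth, the setup above is easy to reproduce. A minimal sketch, assuming a Linux guest with `nproc` and `make` available (mapping the guest's vCPUs to the host's performance cores is done in UTM's VM settings, and is an assumption here):

      ```shell
      # Inside the Linux guest: size the build to however many vCPUs
      # the VM exposes. nproc reports the cores visible to the guest.
      JOBS=$(nproc)
      echo "Building with ${JOBS} parallel jobs"
      # In a real source tree you would then run:
      # make -j"${JOBS}"
      ```

      Using `nproc` rather than a hard-coded job count keeps the same command line valid whether the VM is given four cores or all of them.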

      And Lightroom on the Mac is no comparison to LR for Windows. At all. I wouldn't move back even if I got a free machine with sugar on top.

      Comment


      • #23
        While M-based Macs are indeed very good machines for desktop and certain kinds of work, the lack of external GPU support or upgradeable RAM (not to mention the lack of ECC) is a no-go for many tasks. Compare that to the Ampere Altra Developer Platform, which comes with external GPU support and DDR4 RAM slots that support up to 768 GB of RAM.

        Comment


        • #24
          Originally posted by avis View Post
          "Cores" for the AI engine and the GPU make zero sense as they are inherently parallel. I don't understand why Apple's marketing insists on them. They could and should use teraflops/iops/whatever instead.
          You say this, but to this day AMD and Nvidia both use "cores" to market their GPUs. CUDA Cores ring a bell?

          Comment


          • #25
            Originally posted by Jumbotron View Post
            Apple proves once again that it is the most sophisticated CPU company in the world surpassing Intel, AMD and Nvidia. No other company in the world is capable of producing a CPU with a desktop performant integrated GPU as well as the world’s most performant NPU on a consumer device all tied together with a zero copy heterogeneous memory protocol. All on the smallest node with the best compute power to power draw efficiency of any platform from the Big 3. The only other company even close to being capable of this feat is Qualcomm and that’s only because they bought Nuvia three years ago and Nuvia was founded by ex-Apple Silicon engineers and designers.

            And when I say the Big 3 are not capable of producing a chip as capable and on such a small node and with such power efficiency that it needs no fan I really mean that Intel and AMD and Nvidia are technically incapable of producing a chip like Apple Silicon’s M4.
            This guy drank the Apple Kool-Aid. Keep in mind that Asahi Linux still doesn't work on Apple's M3s, let alone M4s. Unlike the "big 3", Apple doesn't provide any Linux support. Qualcomm was caught cheating, as manufacturers complained they were getting half the performance Qualcomm claimed for the Snapdragon X Elite. Last year Apple Mac sales dropped 34% year over year, though they have since grown 1%. StatCounter shows that macOS has dropped by 6% since November 2023 - the same source that showed Linux reaching 4% last month. Apple paid for TSMC's 3nm manufacturing capacity, which is the only reason they have it.

            As you can see from the video below, both AMD and Intel are nearly as power efficient as Apple's M3. AMD in particular is drastically faster than the M3 in multicore performance, and in some cases can even last longer on battery in video playback. More importantly, AMD's Ryzen 7 8840HS fully works on Linux, right now. As for not needing a fan, you're insane. The M3 MacBook Air was reported to hit 114C in 3DMark Wild Life Extreme, at the cost of performance. These chips clearly don't work effectively fanless, unless you never use these devices for gaming or productivity.

            It's 2024 and Apple's switch to ARM was a mistake. AMD has already matched Apple in power efficiency, while Intel is nearly there. Both AMD's Dragon Range and Intel's Meteor Lake will last for several hours of mild use while also retaining full performance when unplugged.

            Comment


            • #26
              Originally posted by waxhead View Post
              I am not exactly a fan of Apple, but they get one thing right with the M series CPUs: a larger number is supposed to be better. Simple and easy to follow, unlike the mess x86 CPUs (and sockets) are, with all the Skylake, Coffee Lake, Loch Ness Lake and god knows what lake. AMD is not bad with their Zen stuff really, but there too there are all too many models for simple consumers to keep track of.
              Not sure about the M4, but the M3 and M3 Pro offer fewer performance cores and more efficiency cores than the M2s. The M3 Max and Ultra do offer the same number of cores as their predecessors. The memory bandwidth of the M3s is also lower than the M2s'. And if you think this is the only time Apple did this: the base-SSD M2 and M2 Pro have half the SSD performance of the M1s, because Apple literally used half as many NAND chips. With Apple, the numbers never simply get bigger. They clearly want consumers to pay for more SSD and faster CPUs by crippling their base and Pro models. The only generation where Apple didn't do this was the M1.

              Comment


              • #27
                Originally posted by sophisticles View Post

                Same thing for video work, the M powered Macs are unbeatable for certain types of video work, specifically where ProRes is involved.
                ProRes is Apple's software, and they never ported it to anything but Apple devices. You have alternatives on Windows, though I don't know about Linux. Anything Apple can do, you can do better on a PC, at a quarter of the price.

                Comment


                • #28
                  Originally posted by ezst036 View Post

                  Link?
                  It was all pretty much confirmed back in 2023 in dev circles. The next Windows release, coming in 2025, is going big on the ARM architecture, and Nvidia is looking to capitalize on that by reviving its failed Tegra program and shaping its remains into these new devices. AMD has struck a dev relationship with Samsung for these new devices, exchanging and cross-licensing patents, with Samsung getting access to AMD's GPU tech and AMD getting ARM CPU and device design expertise.

                  Comment


                  • #29
                    Originally posted by Dukenukemx View Post
                    ProRes is Apple's software, and they never ported it to anything but Apple devices. You have alternatives on Windows, though I don't know about Linux. Anything Apple can do, you can do better on a PC, at a quarter of the price.
                    Alternatives mean crap.

                    The amount of polish and intuitiveness in a software tool means a lot more than just having the same functions or features in an alternative.

                    Premiere and Vegas look, feel and handle jobs like a clumsy baboon trying to fuck its mate compared to Final Cut Pro. And in turn, the likes of DaVinci, OpenShot, Kdenlive, Cinelerra etc. look, feel, function, perform and handle jobs the same way compared to the likes of Premiere.

                    Comment


                    • #30
                      Originally posted by Volta View Post

                      With such terrible UX and performance? Good joke. Maybe only for some graphical applications that aren't available under Linux. Otherwise there's nothing interesting there.
                      So run linux on it. Problem solved.

                      Comment
