
Apple Releases M1-Powered Apple Silicon Macs, macOS Big Sur Releasing This Week


  • Originally posted by Jumbotron View Post

    Please reread my posts VERY carefully once again. Time after time I have said on this thread alone that big-metal x86 is still relevant for HPC and Supers. I have said this for months on this very site. But for consumer gear, x86 is on the way out. That arch can NEVER be made to achieve competitive IPC per watt, or even raw efficiency. They've had 50 years to do it, and both Intel and AMD have bowed out of the mobile market.

    And now Apple is catching up in raw compute power as well, along with heterogeneous compute. Intel has NEVER been able to do this. AMD did for a while with the HSA APUs but has now abandoned HSA. Apple is the ONLY CPU manufacturer to have a fully HSA SoC. On top of that, it's a fully HSA SoC with 8 CPU cores, 8 GPU cores, an on-die 16-core dedicated Neural Net Processor, and on-package LPDDR4X RAM. All at 5nm.

    AMD doesn't have anything near this. They won't even have something like this until possibly the ZEN 4 / Infinity Architecture based APUs somewhere in 2023 at best. And they may not have a 5nm product by 2023 either.

    Intel doesn't have anything like this. Hell....they won't even have an advanced matrix extension to their AVX set until the end of 2021 and only in their Xeon server chips. And I bet it won't even be 10nm.
    While I personally dislike a future where home computers run lock-in operating systems like Apple's, I have to agree with you that specialized hardware is the way to go to increase performance. And it's good to have these opinions around here to create debate.

    From a different perspective: working with virtualization, I see the same scenario where you achieve the best performance by buying hardware that has the proper offloading features for your data, whether storage or network. Network offloads (LRO, TSO, VXLAN/NVGRE/Geneve offload), SR-IOV, RoCEv2 support, iSCSI and FCoE offload for legacy compatibility/migration, and storage that is "pure NVMe", unlike some shitty players that still use a SAS backend in their "NVMe storage" (yes, Dell Compellent uses this horrible architecture where your NVMe drives are accessed over a limited SAS backend connection, probably to take more money off you on 2-to-5-year upgrades). All of this and other features wrapped up make a HUGE performance difference.
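
    (As an aside, here is a rough, purely illustrative sketch of how one might check which of those NIC offloads are actually enabled on a Linux host. It assumes ethtool is installed, and "eth0" is just a placeholder interface name.)

```python
# Rough, illustrative sketch: report whether common NIC offloads (TSO, LRO,
# GRO, VXLAN/Geneve tunnel segmentation) are enabled on a Linux interface.
# Assumes ethtool is installed; the interface name is a placeholder.
import subprocess

OFFLOADS = (
    "tcp-segmentation-offload",    # TSO
    "large-receive-offload",       # LRO
    "generic-receive-offload",     # GRO
    "tx-udp_tnl-segmentation",     # VXLAN/Geneve tunnel TX segmentation
)

def offload_status(iface: str = "eth0") -> dict:
    """Parse `ethtool -k <iface>` and return the state of selected offloads."""
    out = subprocess.run(
        ["ethtool", "-k", iface],
        capture_output=True, text=True, check=True,
    ).stdout
    status = {}
    for line in out.splitlines():
        name, _, value = line.strip().partition(":")
        if name in OFFLOADS:
            status[name] = value.strip()   # e.g. "on", "off", "off [fixed]"
    return status

if __name__ == "__main__":
    for feature, state in offload_status().items():
        print(f"{feature}: {state}")
```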

    Same applies here. Having an OS that will use all that FPGA/HSA hardware will make a difference in performance per watt.

    Comment


    • Hmm...After coming into work today I thought about the M1 processor, since I got a question about whether I wanted one for work. And my answer is actually no.

      That's because the workloads I have at work (and probably those of 80-90% of all developers these days) are heavily multithreaded. For instance, here is what I did this morning on my local machine:

      1. Spun up 2 k8s clusters in different VMs and configured the network between them
      2. Ran several system-test suites on the outward-facing cluster
      3. Tried to pinpoint where things go wrong.

      My 16" macbook pro coould barely handle this, allthough all cores are working on max.
      And if you wonder why I do this locally, is just because I think it is faster do it locally and I have the full control of the environment and dont have to deal with idiots trying to use my environment for something else.
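
      (Purely for illustration, a minimal sketch of that kind of local two-cluster workflow. It assumes kind and kubectl instead of full VMs, and the cluster names and pytest-based test suite are made up for the example.)

```python
# Minimal, illustrative sketch of a local two-cluster workflow similar to the
# one described above. Assumes `kind`, `kubectl` and `pytest` are on PATH;
# cluster names and the test-suite path are hypothetical.
import subprocess

CLUSTERS = ["internal", "outfacing"]

def sh(*args: str) -> None:
    """Run a command, echoing it first, and fail loudly on error."""
    print("+", " ".join(args))
    subprocess.run(args, check=True)

def create_clusters() -> None:
    # Spin up one kind cluster per name (kind prefixes contexts with "kind-").
    for name in CLUSTERS:
        sh("kind", "create", "cluster", "--name", name)

def run_system_tests() -> None:
    # Point kubectl at the outward-facing cluster, then run the suite.
    sh("kubectl", "config", "use-context", "kind-outfacing")
    sh("pytest", "system_tests/", "-x")

if __name__ == "__main__":
    create_clusters()
    run_system_tests()
```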

      And this M1 processor...is only 8 cores...or rather a 4+4 core processor. It would not be able to handle these things at all. I would rather have a Ryzen 4800U in a MacBook. That I would have bought instantly.

      But maybe the M2 or maybe the M3 will have the cores needed for real work.

      Comment


      • Originally posted by timofonic View Post
        Please ban Jumbotron. ASAP. He's a troll.

        It's not because of his opinions, that's irrelevant.

        It's because he spams articles from other sites all the time and repeats them in nearly every reply.

        And because he abuses upper case letters. He also uses an alt-right style of writing, to put it very lightly and not feed the troll too much.

        I don't care about fanboy or nonsense opinions as long as they are respectful and not considered spam.

        Michael, please seriously consider it.
        No more a troll than the rabid anti-Apple folks who spew nonsense without having even read the data, and then repeatedly spew the same nonsense over and over. Each time someone has, I have provided a data point from a known, confirmed source.

        That's not trolling. Those are facts. "Apple sucks" is not a fact. It is not data. It's baseless fanboyism....or should I say....baseless anti-fanboyism.

        My opinions are sometimes opinionated ...yes. But only in response and in proportion to opinionated bullshit that has no basis in fact or data.

        The very fact that I am sparring with folks who are full of shit actually means that I don't want them banned. This is an arena of ideas and opinion. Bring it. But don't cry and melt like a little snowflake and call for the ban hammer because you don't like what someone is saying. Bring your facts, bring your data.

        As I have said many times before, I have never, ever owned a single Apple product in my life. Perhaps I never will. If my computing needs call for it then I will, but certainly not because of fashion. This is NOT so much about Apple as it is about the ARM arch's takeover first of mobile and now the inevitable takeover of the desktop. ARM is already powering the world's fastest Supercomputer, which leads not only in raw compute but also in perf per watt.

        If you were truly informed, like...had been paying attention for the last 20 years....this sea change in tech should come as no surprise. x86 was destined to hit a technical roadblock in Moore's Law, physics, and use cases, in large part due to its 40-year legacy. As I have also said, x86 will still be relevant for some time in a narrower scope, namely HPC and Supercomputers.

        But the era of x86 is indeed waning. And that will accelerate over this decade. The fact that you are shocked and outraged, particularly that the end of the x86 era is being spearheaded by Apple, means YOU haven't been paying attention all these years. That's on you. And if you spout uninformed bullshit, well....I'm going to call you on it.

        Comment


        • Originally posted by vein View Post
          Hmm...After coming into work today I thought about the M1 processor, since I got a question about whether I wanted one for work. And my answer is actually no.

          That's because the workloads I have at work (and probably those of 80-90% of all developers these days) are heavily multithreaded. For instance, here is what I did this morning on my local machine:

          1. Spun up 2 k8s clusters in different VMs and configured the network between them
          2. Ran several system-test suites on the outward-facing cluster
          3. Tried to pinpoint where things go wrong.

          My 16" MacBook Pro could barely handle this, although all cores were working at max.
          And if you wonder why I do this locally, it's just because I think it is faster to do it locally, I have full control of the environment, and I don't have to deal with idiots trying to use my environment for something else.

          And this M1 processor...is only 8 cores...or rather a 4+4 core processor. It would not be able to handle these things at all. I would rather have a Ryzen 4800U in a MacBook. That I would have bought instantly.

          But maybe the M2 or maybe the M3 will have the cores needed for real work.
          Here is the basic x86 mindset. Only MY work is REAL compute work.

          No....not by a long shot. Did Apple make the M1 to be a server chip? No. Yet...that is what is in most x86 consumer computers right now. De-tuned server chips. x86 was never designed and is STILL not designed for consumer products. They start as Server designs and then get detuned to meet lots of other markets like the consumer market. That is a losing game.

          We have already seen that with both Intel and AMD exiting the mobile market, the switch market, the embedded market and most of the storage market. x86 can never ever be made into a viable mobile chip offering.

          But....because ARM started out as an intrinsically power-sipping design, based on RISC as opposed to CISC, it has an advantage whereby it is much easier to ADD to that design to meet more powerful compute tasks WHILE STILL meeting thermal and package constraints.

          Plus, the flexibility in its design and it being RISC means it can break out of the SoC paradigm and go BIG and go DISCRETE much more easily than x86.

          This is why you are seeing ARM scale down, up, and out to meet WAY more compute tasks on a FAR GREATER variety of compute platforms than x86 has....or ever could.

          You're not going to see x86 in earbuds.
          You're not going to see x86 in smartspeakers
          You're not going to see x86 in smartwatches
          You're not going to see x86 in switches or routers
          You're not going to see x86 in tablets
          You're not going to see x86 in Smart Phones
          You're not going to see x86 in Smart TVs
          You're not going to see x86 in Cellular base stations
          You're not going to see x86 in SBCs
          You're not going to see x86 in Robots
          You're not going to see x86 in Space Probes and Satellites
          You're not going to see x86 in ANY Apple product going forward

          You ARE going to see a DECLINE in x86 in Chromebooks
          You ARE going to see a DECLINE in x86 in Microsoft products
          You ARE going to see a DECLINE in x86 Windows based products from HP, Dell, Lenovo, Acer and ASUS
          You ARE going to see an INCREASE in the number of HPC, SERVER and SUPERCOMPUTERS using ARM and even beating x86 not only in Perf per Watt but raw compute as well.
          You ARE going to see an INCREASE in the number of University and even Continental HPC and Supercomputing initiatives that use ARM exclusively.

          x86 is on the wane in the Consumer world. x86 is fast becoming a niche product. Because it always was. It's always been a server tech, that was shoehorned down to the consumer space. That 50 year legacy is now showing itself to be obvious.

          For the VAST, VAST majority of people's daily compute tasks, there is absolutely no need for Server tech to be in their computing platform. In other words...no need for x86.

          I'm glad that x86 still serves you. But your work needs are a tiny fraction of the daily compute needs of the rest of humanity. And even then...by the end of this decade, there will be an ARM chip or platform that you will find compelling enough to use even for the work you described above.

          Even before then....as this decade progresses, and certainly now after Apple's rollout of desktop ARM, you may want to jump ship, as most folks will begin to program first for ARM and then for x86.


          Comment


          • I've been hearing this "ARM will kill x86" since ~2010. However, I don't see any progress regarding this whatsoever. ARM has failed to gain any significant share in the datacenter and HPC. ARM has also failed to make any considerable traction in the Windows ecosystem (at least up to this moment). x86 vendors have been declaring record profits for several consecutive years now. I think people mischaracterize the situation a bit. What is actually being killed by the ARM ecosystem is the "classic PC desktop", which is now a tool mostly used by professionals and gamers. It's no longer the main consumer computing device it used to be. And that's OK. However, I don't see x86 going down anytime soon. Of course, ARM or RISC-V (more likely) could very well kill x86, but we are talking decades here.
            Last edited by drakonas777; 11 November 2020, 01:55 PM.

            Comment


            • This particular forum thread seems to be the very definition of "echo chamber"

              Comment


              • Originally posted by kpedersen View Post
                In ~5 years Apple is going to turn around and say its silicon was unsuccessful because everyone *needs* Intel in this "modern" day and age.

                In fact this is not true. People don't need Intel, but they do need their architecture to not be a locked-down piece of shite. Unfortunately the guys making the products won't understand this and we will have another 20 years of Intel-only crap.

                We saw this a little bit with Windows RT, and dumb stuff like this is a fair chunk of the reason our industry is stagnating. So many people blame ARM chips for not being able to run their software, when in actual fact compiling for an ARM target is trivial, and it is the artificial restrictions added by Microsoft which effectively castrated their own users.

                Apple is basically boring and is getting in the way of "modern" computing.
                Wait, are you saying manufacturers can't lock down bootloaders or BIOSes on Intel or AMD hardware?

                I do agree with your point, however.
                Last edited by Vistaus; 11 November 2020, 02:50 PM.

                Comment


                • Originally posted by andre30correia View Post
                  intel atom macbook performance
                  Anandtech compared the A14 and extrapolated:
                  "Apple claims the M1 to be the fastest CPU in the world. Given our data on the A14, beating all of Intel’s designs, and just falling short of AMD’s newest Zen3 chips – a higher clocked Firestorm above 3GHz, the 50% larger L2 cache, and an unleashed TDP, we can certainly believe Apple and the M1 to be able to achieve that claim."

                  Certainly not Atom-class.
                  I guess we will find out in a week or two when the benchmarks come in.
                  Last edited by system32; 11 November 2020, 02:50 PM.

                  Comment


                  • Originally posted by Setif View Post
                    iOS developers need macos, so they have to buy MacBooks.
                    You can quite easily run macOS as a virtual machine. Unless it's a big app that requires the best performance to compile and build.
                    Last edited by Vistaus; 11 November 2020, 02:49 PM.

                    Comment


                    • Originally posted by kpedersen View Post

                      Perhaps the most immediate reason is going to be 3rd party software support. Photoshop plugins, Maya plugins, etc. If developers don't play ball because Apple makes the platform too much of a consumer shite-show, they simply will not get the software and the userbase will fall as a result.
                      In the long run, yes. But for now, Apple does provide Rosetta for running x86_64 binaries as a stop-gap.

                      Comment
