Apple M1 Open-Source GPU Bring-Up Sees An Early Triangle

  • #11
    That is pretty promising. At least she already has command submission to the GPU working, among other things.

    It is sooner than I expected.

    It will take a lot of time to keep poking at the macOS/Metal shader compiler to reverse engineer the ISA, but we will get there. It looks more like late 2020, compared to my early estimate of mid-2021 ;D

    Comment


    • #12
      Triangles? What a joke! Please call me when squares and rectangles are ready. And rhombi, nobody ever thinks of them :'(

      Poor geometric figures, they need our help.

      Comment


      • #13
        Originally posted by rmfx View Post
        Good luck because that will be even more undocumented than Nvidia hardware.
        The problem is not the missing documentation (people do reverse engineering), but the missing firmware. As I understand it, Apple loads the firmware for all the chips inside the SoC, including the iGPU, before loading any OS (so no work there for the people porting Linux).

        Comment


        • #14
          Originally posted by timofonic View Post
          Triangles? What a joke!
          I am going to presume you are just kidding, as the triangle is typically a (and sometimes the only) primitive in GPUs. Once you can render a triangle, you can render the world.
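A side note on why the triangle is the primitive that matters: any convex polygon, including the squares and rhombi joked about above, reduces to triangles. A minimal sketch of that reduction (my own illustration, not anything from the M1 driver work):

```python
def fan_triangulate(vertices):
    """Split a convex polygon, given as a list of (x, y) points in order,
    into triangles by fanning out from the first vertex."""
    v0 = vertices[0]
    return [(v0, vertices[i], vertices[i + 1])
            for i in range(1, len(vertices) - 1)]

# Any convex n-gon yields n - 2 triangles: a square (or rhombus) becomes 2.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
rhombus = [(0, 0), (2, 1), (0, 2), (-2, 1)]
print(len(fan_triangulate(square)))   # 2
print(len(fan_triangulate(rhombus)))  # 2
```

This is the sense in which "once you can render a triangle you can render the world": everything else gets decomposed into triangles before the GPU ever sees it.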

          Comment


          • #15
            Originally posted by CommunityMember View Post

            And some people climb mountains just because they can. More (GPU?) power to them if they want, and can, do so.
            I'd rather think of it this way: what Alyssa is doing is an act of liberation. Remember the movie *Free Willy* ...

            Comment


            • #16
              Originally posted by CommunityMember View Post

              I am going to presume you are just kidding, as triangles are quite typically a (sometimes only) primitive in GPUs. Once you can render a triangle you can render the world.
              Please read my entire post. I was joking about geometry the whole time! 🤣

              Comment


              • #17
                Originally posted by Alexmitter View Post

                It performs about exactly as a modern low power x86 chip is expected to perform on 5nm. Not worse, not better.
                While drawing so little power that it is passively cooled?

                I think you are wrong on that one.

                Comment


                • #18
                  Originally posted by Slartifartblast View Post
                  All this razzle dazzle over the M1; let's see what happens when AMD gets their hands on 5 nm EUV, then we can judge just how much of it is down to the process and how much is an Apple reality distortion field. Good luck to their effort, but I wouldn't piss on Apple to put it out if it were on fire.

                  When AMD introduces 5 nm CPUs, those will certainly be faster than whatever CPUs Apple has by then, but at the price of much higher power consumption, exactly as it is today when comparing desktop Zen 3 CPUs with the Apple M1.

                  Despite their misleading claims during the M1 launch about Apple CPUs being the fastest, Apple will never make the fastest CPUs, because they would gain nothing by doing so.

                  Apple could easily have made a CPU much faster than the M1, and much faster than anything from Intel/AMD, just by designing a larger chip with more cores.

                  However, that would have meant much higher manufacturing costs and a requirement for larger, more expensive cooling systems, both of which would only diminish Apple's profits without bringing in any new customers.

                  Apple CPUs consume less power at a given performance level because they reach that performance at a much lower clock frequency (about two thirds) than Intel/AMD parts.
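To put a rough number on that claim (my own back-of-the-envelope model, not figures from this thread): dynamic CPU power scales roughly as C·V²·f, and the supply voltage itself has to rise with frequency, so power grows close to the cube of the clock.

```python
def relative_power(freq_ratio):
    """Relative dynamic power of a core run at freq_ratio times the base
    clock, under the simplistic assumption that supply voltage tracks
    frequency linearly (P ~ C * V^2 * f, with V ~ f)."""
    voltage_ratio = freq_ratio  # illustrative assumption, not measured data
    return voltage_ratio ** 2 * freq_ratio

# A core clocked at two thirds of the frequency (the ratio quoted above)
# would draw under a third of the dynamic power in this model.
print(relative_power(2 / 3))
```

Under this model, reaching the same performance at two thirds the clock leaves a large power budget, which is consistent with the efficiency gap described above, though real chips deviate from the simple cubic law.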

                  Intel Alder Lake and AMD Zen 4 might achieve a 20% increase in instructions per clock over Tiger Lake and Zen 3, but that will not be enough to match the IPC of the Apple M1, much less that of its successor, so they will still have lower energy efficiency, even if the top models end up faster than Apple's.


                  Only around 2023 does it become unpredictable which CPU will be the fastest and which will have the highest IPC, because there are no public details of the next-generation projects at Intel and AMD.

                  Until recently it was believed that achieving a much higher IPC than current CPUs would cost too much, so the development roadmaps of both Intel and AMD had relatively modest targets of only around 20% IPC gains per generation, e.g. Skylake => Ice Lake => Alder Lake or Zen 1 => Zen 2 => Zen 3.

                  Now that Apple has demonstrated that larger IPC increases are possible at a reasonable cost, it is likely that both Intel and AMD have made their design goals more ambitious in order to catch up, but a couple of years may pass before the results are seen.


                  Apple's technical achievement is impressive, but, unfortunately, except for their captive loyal customers, it is worthless.

                  Unlike traditional computer companies, Apple publishes nothing about their processors. In the past, one could learn a lot from the articles published by IBM, Intel, AMD and many other companies that matter less today. No company publishes as many technical details today as they did 10 years ago, and far fewer than 20 years ago. Nevertheless, they still publish information about the results of their research, while Apple publishes nothing useful. Whatever Apple may have discovered, they keep jealously to themselves.

                  Moreover, despite claims to the contrary, Apple does not really sell computers. An Apple computer is not the property of its buyer, because Apple retains the ability to make decisions about it remotely, e.g. whether or not to allow certain programs to run. While I was satisfied with an Apple laptop I had many years ago, before today's restrictions existed, I will not buy another Apple computer, because I use only computers that I own, i.e. ones that do exactly what I tell them to do and nothing else.
                  Last edited by AdrianBc; 23 January 2021, 06:59 AM.

                  Comment


                  • #19
                    Originally posted by Slartifartblast View Post

                    As there isn't a 5 nm EUV TSMC x86 chip available, that's just a weather prediction 30 days in advance; take it with a pinch of salt.
                    It looks like AMD does not need 5 nm to be faster... just watch the benchmarks of the 5900HX, it destroys the entire competition.

                    And yes, it is "faster" than the Apple M1, maybe not "faster per watt", but at a higher power consumption it is faster.

                    https://www.notebookcheck.com/filead...dfd54a22e3.jpg

                    Comment


                    • #20
                      I would rather see the effort directed toward making a good GPU driver for PowerVR instead, as those chips are in quite a lot of devices, unlike the M1.

                      Comment
