Latest Mesa AGX Work Points To More Apple M1/M2 Similarities With PowerVR Graphics


• #11

Originally posted by Dukenukemx View Post:
    I made that prediction about Apple's GPUs. Turns out it took nearly a month to find out.
    Phoronix: Asahi Linux May Pursue Writing Apple Silicon GPU Driver In Rust. When it comes to the Apple M1 and M2 support on Linux, one of the biggest obstacles to suitable daily use for end-users is the current lack of GPU acceleration. Reverse engineering has been happening for the Apple Silicon graphics processor, early
    Developer12 was wrong.
    https://www.phoronix.com/forums/foru...e2#post1340143

Eh, if you read my comment, it was based on the fact that they weren't using anything from the open PowerVR driver. Turns out it just took them a long time to look at it.



• #12

Originally posted by coder View Post:
    Not likely. If the hardware works in a fundamentally different way, the API needs to reflect that for the sake of efficiency (not to mention reducing complexity). And Apple cares about nothing more than efficiency.

This is true, and it isn't.

Sure, things can always technically be as efficient as possible, but few things are created from a clean sheet. As noted in the initial post that led to this article, PowerVR and AGX use completely different instruction sets; only the flow of data is the same. As Boland mentions above, it's also possible that the cores underneath have changed a lot as well.

What may well be the case is that Apple kept the overall data flow the same so that they could continue to use and adapt the same driver code they had been using, while swapping out the instruction set and overhauling the microarchitecture underneath. This would be much the same as their continued use of the ARM architecture for their CPUs while inventing a new (Firestorm) microarchitecture to replace whatever ARM gave them.
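To make that concrete, here is a minimal Rust sketch of the idea (every name here is hypothetical; none of it is taken from the real PowerVR or Asahi code): one shared submission/data-flow layer, with the shader instruction set hidden behind a swappable backend.

Code:
// Hypothetical illustration only: a stable data-flow layer with two
// interchangeable instruction-set backends behind a common trait.

/// The part that reportedly stayed similar across PowerVR and AGX:
/// how work flows to a tile-based GPU.
struct TileJob {
    vertex_data: Vec<u8>,
    tiler_params: [u32; 4],
}

/// The part Apple swapped out: how shaders get encoded.
trait ShaderIsa {
    fn name(&self) -> &'static str;
    fn compile(&self, ir: &str) -> Vec<u8>;
}

struct PowerVrBackend;
struct AgxBackend;

impl ShaderIsa for PowerVrBackend {
    fn name(&self) -> &'static str { "PowerVR Rogue" }
    fn compile(&self, ir: &str) -> Vec<u8> {
        format!("rogue:{ir}").into_bytes() // stand-in for a real encoder
    }
}

impl ShaderIsa for AgxBackend {
    fn name(&self) -> &'static str { "Apple AGX" }
    fn compile(&self, ir: &str) -> Vec<u8> {
        format!("agx:{ir}").into_bytes() // stand-in for a real encoder
    }
}

/// The submission path never learns which ISA it is feeding; this is
/// what would let the surrounding driver code survive the ISA swap.
fn submit(job: &TileJob, isa: &dyn ShaderIsa, shader_ir: &str) {
    let binary = isa.compile(shader_ir);
    println!(
        "queueing {} bytes of {} code, {} vertex bytes, tiler params {:?}",
        binary.len(),
        isa.name(),
        job.vertex_data.len(),
        job.tiler_params
    );
}

fn main() {
    let job = TileJob { vertex_data: vec![0; 1024], tiler_params: [0; 4] };
    submit(&job, &PowerVrBackend, "fma r0, r1, r2"); // old backend
    submit(&job, &AgxBackend, "fma r0, r1, r2");     // new backend, same flow
}

The same shape shows up in real driver stacks as a compiler backend kept separate from the command-submission layer, so either side can be replaced without rewriting the other.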

x86 is the prime example of this: it still implements instructions and registers from a chip designed in the 1980s, yet layers all kinds of pipelining, speculation, branch-prediction, and prefetching optimizations beneath them. It's not easy, but when you've made an investment in your tooling, it's sometimes the only way.
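As a tiny, x86-64-only Rust illustration of that layering (it shows only that the interface is decades old, not how a modern core executes it):

Code:
// `mov` and `add` on general-purpose registers go back to the 8086
// line; register renaming, pipelining, speculation, and prefetching
// all happen invisibly beneath this interface.
#[cfg(target_arch = "x86_64")]
fn add_like_its_1985(a: u64, b: u64) -> u64 {
    use std::arch::asm;
    let sum: u64;
    unsafe {
        asm!(
            "mov {s}, {a}",
            "add {s}, {b}",
            a = in(reg) a,
            b = in(reg) b,
            s = out(reg) sum,
        );
    }
    sum
}

#[cfg(target_arch = "x86_64")]
fn main() {
    println!("{}", add_like_its_1985(40, 2)); // prints 42
}

#[cfg(not(target_arch = "x86_64"))]
fn main() {} // nothing to demonstrate off x86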



• #13

Originally posted by Boland View Post:
    But from what Andrei hinted at before he left AnandTech, the actual cores in Apple's GPUs diverged a lot from the PowerVR designs.

I'd imagine that would be the case, since I'm sure Apple is trying to avoid actually shipping a PowerVR GPU. Also, I remember hearing that Apple's GPUs lack features that are needed in Vulkan but aren't in Metal. That could also be part of the difference.



• #14

Originally posted by Dukenukemx View Post:
    I'd imagine that would be the case, since I'm sure Apple is trying to avoid actually shipping a PowerVR GPU. Also, I remember hearing that Apple's GPUs lack features that are needed in Vulkan but aren't in Metal. That could also be part of the difference.

If you read the various posts by those unraveling the inner workings of Apple's GPUs, you'll know some of the reasons for the split. Not being PowerVR isn't one of them; it doesn't really matter, seeing as Apple has already licensed the technology.

Apple's M1 GPU isn't for phones. It's significantly closer to what you'd find in an AMD, Nvidia, or Intel graphics card. They've had to make huge changes to its architecture to meet their performance targets. It's similar to the way they threw out the tiny, power-sipping ARM cores they got from Arm Holdings in favor of the significantly wider and more capable Firestorm microarchitecture they developed themselves.

At this point it's hard to say whether the GPU is "lacking features" or whether they simply aren't being used by the Metal driver. There have been multiple discoveries of functionality previously thought not to exist. At the same time, a lot of functionality, particularly in OpenGL, can be made up for with programmable shaders.
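As a generic example of that last point (nothing from the actual Metal or Asahi stacks, and `u_alpha_ref` is a made-up name): legacy OpenGL's fixed-function alpha test no longer exists in modern APIs, but it can be lowered to a few lines at the end of a fragment shader. A sketch, as a GLSL string embedded in Rust:

Code:
// Emulating glAlphaFunc(GL_GREATER, ref) from legacy OpenGL with a
// programmable shader; the uniform names here are invented.
const ALPHA_TEST_FRAG: &str = r#"
#version 320 es
precision mediump float;
uniform sampler2D u_tex;
uniform float u_alpha_ref;
in vec2 v_uv;
out vec4 frag_color;

void main() {
    vec4 c = texture(u_tex, v_uv);
    if (c.a <= u_alpha_ref) {
        discard; // kill the fragment, as the fixed-function unit would
    }
    frag_color = c;
}
"#;

fn main() {
    // In a real driver this string would be compiled and attached to the
    // pipeline whenever the app enables the legacy alpha test.
    println!("{ALPHA_TEST_FRAG}");
}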



• #15

Originally posted by Developer12 View Post:
    It's similar to the way they threw out the tiny, power-sipping ARM cores they got from Arm Holdings in favor of the significantly wider and more capable Firestorm microarchitecture they developed themselves.

When is the last time Apple ever used an off-the-shelf core from ARM? Jim Keller was working at Apple during development of the A4 and A5. He's a CPU architect, and it stands to reason that Apple paid $278M to acquire P.A. Semi back in 2008 because they wanted a top-notch team specializing in low-power CPU design.

Your post makes it sound like the A14 is when Apple suddenly decided to swap in their own cores.



• #16

Originally posted by coder View Post:
    When is the last time Apple ever used an off-the-shelf core from ARM? Jim Keller was working at Apple during development of the A4 and A5. He's a CPU architect, and it stands to reason that Apple paid $278M to acquire P.A. Semi back in 2008 because they wanted a top-notch team specializing in low-power CPU design.

    Your post makes it sound like the A14 is when Apple suddenly decided to swap in their own cores.

I'm simplifying and making an analogy. That is closer to what happened on the GPU side, with the transition from 100% PowerVR GPUs to the AGX.
Last edited by Developer12; 08 September 2022, 03:34 PM.



• #17

It's good news for the Linux community: thanks to the PowerVR similarities in the Apple M1/M2 GPU, we may get Linux drivers faster. All the people talking badly about the Apple M1/M2 because Apple doesn't directly support Linux may soon be very surprised.

The next step, after the drivers are implemented and upstreamed, might be Apple entering the Linux server market for low-power servers. Apple could also team up with Red Hat and enter the Linux workstation market.

There are many possibilities, and all of this is bad news for the Wintel+Nvidia cartel. Microsoft+Intel+Nvidia are going down.

And I am very happy about this.



• #18

Originally posted by qarium View Post:
    It's good news for the Linux community: thanks to the PowerVR similarities in the Apple M1/M2 GPU, we may get Linux drivers faster. All the people talking badly about the Apple M1/M2 because Apple doesn't directly support Linux may soon be very surprised.

We're two years in with the M1 and we still don't have Linux GPU drivers for it. Intel's Arc, which was released this year (kinda), does have working GPU drivers for Linux.

Originally posted by qarium View Post:
    The next step, after the drivers are implemented and upstreamed, might be Apple entering the Linux server market for low-power servers. Apple could also team up with Red Hat and enter the Linux workstation market.

I'll put this down next to NFTs being successful.

Originally posted by qarium View Post:
    There are many possibilities, and all of this is bad news for the Wintel+Nvidia cartel. Microsoft+Intel+Nvidia are going down.

    And I am very happy about this.

Apple alone isn't a cartel, but Microsoft+Intel+Nvidia+AMD is? What about Google?



• #19

Originally posted by Dukenukemx View Post:
    We're two years in with the M1 and we still don't have Linux GPU drivers for it. Intel's Arc, which was released this year (kinda), does have working GPU drivers for Linux.

Really, man, you are free to buy an Intel Arc GPU... LOL... but something tells me you will not do this, because Intel Arc is a complete failure.

Originally posted by Dukenukemx View Post:
    Apple alone isn't a cartel, but Microsoft+Intel+Nvidia+AMD is? What about Google?

Well, Google is a monopoly, that's true, because Google has high market share.

Apple cannot be a monopoly because of its low market share.

Apple also cannot be a cartel, because a cartel means more than one company...

"Microsoft+Intel+Nvidia+AMD is a cartel?"

This, my friend, is complete bullshit, because AMD is not part of this group.

Microsoft+Intel+Nvidia is the Wintel cartel, with:
Microsoft holding the Win32 API monopoly,
Intel holding the x86 ISA monopoly,
Nvidia holding the CUDA/OptiX (and much else besides) monopoly.




• #20

Originally posted by qarium View Post:
    Really, man, you are free to buy an Intel Arc GPU... LOL... but something tells me you will not do this, because Intel Arc is a complete failure.

But you have to admit that you were wrong about it working only with Intel CPUs.

