More Progress Is Made Understanding Apple's M1 GPU, Working Towards An Open Driver


  • #11
    Still great news. While I don't particularly care for Apple, I must admit that their M1 chips are impressive. With rumors about the M1X / M2 and so on, and computers with more ports and such, it's an interesting option. Sure, I'd love to see other ARM-based laptops with similar performance, especially for emulating x86 software, which we'll probably still need to do for a while longer. But as it stands, it seems we're 1-2 years away from having something competitive on the market. I'd consider an M1 if/when the hardware is supported, so it's great to see some progress.



    • #12
      Originally posted by tildearrow View Post

      May an Ampere Altra 8-core mini-sized computer exist.
      I'm more hopeful for RISC-V myself.



      • #13
        I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
        Regarding the missing hardware features: in the end, the goal of all GPUs is the same, so the M1 certainly provides some way to achieve the same results efficiently. The question is not whether or not it can run the stuff people do with Vulkan. Vulkan is just a way of telling the GPU what to do, and the same goes for Metal. The question is how easily the intent expressed in Vulkan can be translated into something the M1 understands.



        • #14
          Beyond the half-exposed APU, I don't know how they are going to get past the serialized, encrypted components on that motherboard. Even repair people are having a hard time with it.



          • #15
            Originally posted by phoronix View Post
            For all the visible hardware features, it’s equally important to consider what hardware features are absent. Intriguingly, the GPU lacks some fixed-function graphics hardware ubiquitous among competitors. For example, I have not encountered hardware for reading vertex attributes or uniform buffer objects.
            What effect will these missing features have on potential gaming performance?



            • #16
              Originally posted by ezst036 View Post
              What effect will these missing features have on potential gaming performance?
              As for reading vertex attributes: not a huge performance difference. The driver can insert the instructions for reading vertex data directly into the vertex shader and emulate the missing hardware that way (see the sketch below). If I am not mistaken, Mantle also did not expose dedicated hardware for loading vertex attributes, probably reflecting the fact that AMD had no such hardware at the time. Nvidia, on the other hand, does have dedicated vertex-fetch hardware. Not sure about AMD these days.
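
              A minimal sketch of that idea, sometimes called "vertex pulling", written here in plain C rather than real shader or driver code: with no fixed-function fetch unit, the compiler-inserted preamble just computes each attribute's address from the bound buffer's descriptor and loads it with ordinary memory instructions. All names below are illustrative and not taken from the actual M1 driver.

              Code:
              #include <stdint.h>
              #include <stdio.h>
              #include <string.h>

              /* Hypothetical descriptor for one bound vertex buffer (illustrative only). */
              struct vertex_buffer_desc {
                  const uint8_t *base;   /* start of the vertex buffer in memory */
                  uint32_t stride;       /* bytes between consecutive vertices */
                  uint32_t offset;       /* byte offset of this attribute within a vertex */
              };

              /* What the driver-inserted shader preamble boils down to for one vec3
               * attribute: address arithmetic plus plain loads instead of a
               * fixed-function vertex fetch. */
              void fetch_attribute_vec3(const struct vertex_buffer_desc *vb,
                                        uint32_t vertex_index, float out[3])
              {
                  const uint8_t *src = vb->base
                                     + (size_t)vertex_index * vb->stride
                                     + vb->offset;
                  memcpy(out, src, 3 * sizeof(float));
              }

              int main(void)
              {
                  float positions[][3] = { {0.0f, 0.0f, 0.0f}, {1.0f, 2.0f, 3.0f} };
                  struct vertex_buffer_desc vb = {
                      .base   = (const uint8_t *)positions,
                      .stride = sizeof positions[0],
                      .offset = 0,
                  };
                  float pos[3];
                  fetch_attribute_vec3(&vb, 1, pos);             /* fetch "vertex 1" */
                  printf("%g %g %g\n", pos[0], pos[1], pos[2]);  /* prints: 1 2 3 */
                  return 0;
              }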



              • #17
                Originally posted by GruenSein View Post
                I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
                Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside, it's not only casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keeping the thing hard or impossible to hack (in the positive meaning of the term).
                The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
                Because that's the only way to keep the ecosystem closed enough.

                Take, for example, old Windows-only GDI printers: that was a whole different story. There the goal was to make the printer cost less than its ink cartridges, so almost no RAM, almost no CPU, almost nothing onboard: just the bare minimum to talk to a Windows-only GDI driver. It took reverse engineering to get those paperweights working, more or less, outside of Windows, but the hardware maker didn't get in the way on purpose; they just targeted a specific market that wasn't Linux users in order to cut production costs.

                Apple is different. They think different, you know. They think about how to keep their users from using anything non-Apple.



                • #18
                  Originally posted by lucrus View Post

                  Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside, it's not only casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keeping the thing hard or impossible to hack (in the positive meaning of the term).
                  The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
                  Because that's the only way to keep the ecosystem closed enough.

                  Apple is different. They think different, you know. They think about how to keep their users from using anything non-Apple.
                  Many reviews disagree with your assessment of the M1 and find it extremely fast and versatile considering its power figures. Since Apple has always focused heavily on thin and light devices, it provides many benefits over competing solutions. And even if you think that other solutions are at least on par or better, there is one more obvious reason to have an in-house solution: keeping profits in-house. Why should Apple (or anyone, for that matter) use third-party products if they find that they can achieve adequate performance without anyone else taking a cut of the final profits? For years Samsung has been making subpar Exynos SoCs instead of simply buying from Qualcomm for exactly this reason. It had nothing to do with "keeping their devices locked in" or whatever, which is made fairly obvious by the fact that they also used Qualcomm SoCs.

                  All I am saying is: if you think that Apple spends much time wondering what Linux users think, you are way off. Designing an entire (industry-leading) SoC just so a few people cannot run Linux on their stuff anymore doesn't seem reasonable, because hardly anyone buying their stuff even wants to do that. Linux users are such a tiny fraction of Apple's key customer base (home users) that they simply do not care whether or not you get Linux running on it. Not caring also means that they build an SoC which does exactly what they want to offer the customer (Metal-accelerated software) without including stuff that might make writing Vulkan drivers easier. They won't help with alternative uses of their hardware, because that help does not generate profits. However, that is different from actively hindering such efforts.
                  Last edited by GruenSein; 19 April 2021, 10:43 AM.



                  • #19
                    Originally posted by GruenSein View Post
                    I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
                    Regarding the missing hardware features: in the end, the goal of all GPUs is the same, so the M1 certainly provides some way to achieve the same results efficiently. The question is not whether or not it can run the stuff people do with Vulkan. Vulkan is just a way of telling the GPU what to do, and the same goes for Metal. The question is how easily the intent expressed in Vulkan can be translated into something the M1 understands.
                    There isn't a huge conceptual difference between Metal and Vulkan either (see the sketch at the end of this post).

                    As for Apple's ARM-based hardware, I don't see them breaking stuff on purpose; I see them breaking stuff to reach goals. The complete removal of 32-bit support is a perfect example: it allowed them to eliminate a lot of circuitry related to 32-bit support, with the end goal of much lower power usage. This shouldn't surprise anybody, because they had been telegraphing the elimination of 32-bit hardware for years at WWDC. In the same manner, they pushed developers really hard to adopt APIs knowing that the underlying hardware would change; they actually said so in at least a few WWDC videos. So anybody complaining about changing hardware from Apple simply hasn't been paying attention, or simply can't digest the information being fed to them.

                    Apple does a lot of disgusting things, but frankly they haven't had a problem when it comes to hardware: they literally told developers that the hardware would be changing.
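
                    Coming back to the Metal/Vulkan point: here is a rough sketch, in C against the Vulkan headers, of recording one simple draw, with comments naming the roughly equivalent Metal calls on an MTLRenderCommandEncoder. It assumes the pipeline, render pass and vertex buffer were created elsewhere; it is only meant to show that the two APIs express the same intent, not to be a complete program.

                    Code:
                    #include <vulkan/vulkan.h>

                    /* Record one draw with Vulkan; comments name the roughly equivalent
                     * Metal calls. All handles are assumed to have been created elsewhere. */
                    void record_simple_draw(VkCommandBuffer cmd,
                                            const VkRenderPassBeginInfo *rp_begin,
                                            VkPipeline pipeline,
                                            VkBuffer vertex_buffer,
                                            uint32_t vertex_count)
                    {
                        VkDeviceSize offset = 0;

                        /* Metal: [commandBuffer renderCommandEncoderWithDescriptor:...] */
                        vkCmdBeginRenderPass(cmd, rp_begin, VK_SUBPASS_CONTENTS_INLINE);

                        /* Metal: [encoder setRenderPipelineState:...] */
                        vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);

                        /* Metal: [encoder setVertexBuffer:offset:atIndex:] */
                        vkCmdBindVertexBuffers(cmd, 0, 1, &vertex_buffer, &offset);

                        /* Metal: [encoder drawPrimitives:vertexStart:vertexCount:] */
                        vkCmdDraw(cmd, vertex_count, 1, 0, 0);

                        /* Metal: [encoder endEncoding] */
                        vkCmdEndRenderPass(cmd);
                    }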



                    • #20
                      Originally posted by stingray454 View Post
                      Still great news. While I don't particularly care for Apple, I must admit that their M1 chips are impressive. With rumors about the M1X / M2 and so on, and computers with more ports and such, it's an interesting option. Sure, I'd love to see other ARM-based laptops with similar performance, especially for emulating x86 software, which we'll probably still need to do for a while longer. But as it stands, it seems we're 1-2 years away from having something competitive on the market. I'd consider an M1 if/when the hardware is supported, so it's great to see some progress.
                      More ports? But everyone, even Apple fans, is always saying that Apple won't add any more ports to their products…

