More Progress Is Made Understanding Apple's M1 GPU, Working Towards An Open Driver


  • paradroid
    replied
    I wouldn't be surprised if Apple designed the GPU as much as possible around patents.



  • wizard69
    replied
    Originally posted by Vistaus View Post

    More ports? But everyone, even Apple fans, is always saying that Apple won't add any more ports to their products…
    Actually, they did add more ports, and frankly that is one reason, among others, I see my M1 Air as a better option than a tablet. I think it is fair to say, though, that on Apple's larger platforms the fan base has been really hard on Apple with respect to ports. Almost everyone wants Apple to correct the port hardware we have on the so-called "pro" machines.

    Until the advent of the M1, Apple's Mac line had been pretty stagnant, to the point that I had to wonder if they were trying to kill the entire line off. So the hope is that the M1 and the processors that follow will lead to rapid advancement of the hardware offerings and frankly far more innovation. For example, why does an M1 computer have to be anything more substantial than a keyboard? Or why not make a Mini with a real discrete GPU that doesn't suck? The thermal envelope of these processors opens up a massive number of possibilities for new designs.

    Dave



  • wizard69
    replied
    Originally posted by lucrus View Post

    Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside, it's not only casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keeping the thing hard or impossible to hack (in the positive meaning of the term).
    Good or bad doesn't make a lot of difference here, Apple believes making the hardware / software difficult to hack is a good thing for both Apple and its customers. Frankly this is the same attitude the Linux kernel developers have with their stress on security.
    The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
    Repeating nonsense does not make it true! I'm sitting right now in front of an M1 Air that is PASSIVELY cooled, and I can't think of a single machine that competes with it at any price range; that is, one that offers the same level of performance in a fanless laptop. This machine highlights why I was so interested in getting an ARM-based laptop that doesn't suck in the first place.
    Because that's the only way to keep the ecosystem closed enough.
    It is somewhat closed, not completely closed. Frankly, it is more open than your average Chrome-running ARM-based notebook, and much faster to boot.
    Take for example old Windows-only GDI printers: it was a whole different story. In that case the goal was to make a printer cost less than its ink cartridge, so almost no RAM, almost no CPU, almost anything onboard: just the bare minimum to talk to a Windows-only GDI driver. It took reverse engineering to have those paperweights somewhat working outside of Windows, but the hardware maker didn't get in the way on purpose, they just targeted a specific market that wasn't Linux users in order to cut production costs.

    Apple is different. They think different, you know. They think how to keep their users from using anything non-Apple.
    Baloney! There is a lot of hardware that works perfectly fine on Apple's hardware, including printers. Not all vendors support Apple's hardware and operating systems, but then again there is third-party Apple-specific hardware that barely runs on other platforms. It isn't up to Apple to get third parties to run on their OS; it is up to the hardware developer. Just as in the Windows world, there is all sorts of hardware that will not run under Linux or macOS. Given that, I have to say that most of the hardware that can run on macOS has far better drivers than what is in the Windows world.

    From my point of view you have many non-arguments here, or complete misunderstandings about Apple Silicon. The M1 isn't perfect and frankly has significant shortcomings for much of Apple's hardware needs (thus the low-end machines), but the hardware and software combo is far better than you are alluding to. In fact I'm rather stunned at just how well Apple has been able to pull off this platform change. The volume of software that is already ARM-native is surprising, and the stuff that isn't often runs very well on these machines. Objectively, it is hard not to be impressed by what Apple has achieved.

    Now, would I prefer an ARM-based laptop running Linux with all native drivers? Most likely I would. In fact this M1-based Apple hardware has me salivating at the thought. If Apple drags the rest of the industry over to ARM, I will be very happy indeed. The reality is that I've never had an x86 platform, with the required performance, that has the battery life these systems have. Never! Nor do I see such a machine coming anytime soon. Though it has become a bit of a joke, x86 laptops only run cool when doing nothing; otherwise they are power hogs.

    I can actually see Apple adding a strip of solar cells along the top of the keyboard and getting useful power relative to the rest of the system. They will not be able to eliminate batteries anytime soon, but they should be able to offer a real run-time extension in many situations. That is how good Apple Silicon is, and it will only get better moving forward. Even if that strip only offered one watt of power, it would positively impact battery run time per charge. In a very real sense we have a new generation of hardware that can lead to a new way of looking at devices.

    I just want to see such capabilities in the Linux world, so if somebody hacking drivers for the M1 can help lead to that, I'm all for it. Would I prefer a more open ARM chip with better software support from the vendor? Certainly (I hope AMD is listening), but sometimes companies need a kick in the virtual pants to step outside of their comfort zones. Who knows, maybe AMD's collaboration with Samsung (I think) will lead to an ARM-based SoC with open GPU hardware. Of course we still need to get a builder to put that chip in a laptop.
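A quick back-of-the-envelope calculation shows why even one watt would matter. All the figures below (battery capacity, average draw, solar output) are assumptions chosen for illustration, not measured values:

```python
# Rough battery-runtime math; every constant here is an assumed figure.
BATTERY_WH = 50.0   # assumed battery capacity in watt-hours
AVG_DRAW_W = 8.0    # assumed average system draw in watts
SOLAR_W = 1.0       # hypothetical solar-strip contribution in watts

# Runtime on battery alone vs. with the solar strip offsetting part of the draw.
runtime_plain = BATTERY_WH / AVG_DRAW_W             # 6.25 hours
runtime_solar = BATTERY_WH / (AVG_DRAW_W - SOLAR_W) # ~7.14 hours

print(f"without solar: {runtime_plain:.2f} h")
print(f"with 1 W solar: {runtime_solar:.2f} h")
```

Under these assumed numbers, one watt buys roughly 14% more runtime per charge; the lighter the average load, the bigger the relative gain.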



  • Vistaus
    replied
    Originally posted by stingray454 View Post
    Still great news. While I don't particularly care for Apple, I must admit that their M1 chips are impressive. With rumors about M1X / M2 and so on, and computers with more ports and such, it's an interesting option. Sure I'd love to see other ARM-based laptops with similar performance, especially for emulating x86 software which we probably will need to for a while longer.. But as it stands, it seems to be 1-2 years before we even have something competitive on the market. Would consider an M1 if/when the hardware is supported, so great to see some progress.
    More ports? But everyone, even Apple fans, is always saying that Apple won't add any more ports to their products…



  • wizard69
    replied
    Originally posted by GruenSein View Post
    I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
    Regarding the lacking hardware features: In the end, the goal of all GPUs is the same so they certainly provide some way to do it in an efficient fashion. The question is not whether or not it can run stuff people do with Vulkan. Vulkan is just a way of telling the GPU what to do. Same goes for Metal. The question is how easy it is to translate the intention phrased in Vulkan to something the M1 understands.
    There isn't a huge conceptual difference between Metal and Vulkan either.

    As for Apple's ARM-based hardware, I don't see them breaking stuff on purpose; I see them breaking stuff to reach goals. The complete deletion of 32-bit support is a perfect example. This allowed them to achieve goals that likely included the elimination of a lot of circuitry related to 32-bit support, with the end goal of much lower power usage. This shouldn't surprise anybody, because they were telegraphing the elimination of 32-bit hardware for years at WWDC. In the same manner they pushed developers really hard to adopt APIs knowing that the underlying hardware would change. They actually said so in at least a few WWDC videos. So anybody complaining about changing hardware from Apple simply hasn't been paying attention, or simply can't digest the information being fed to them.

    Apple does a lot of disgusting things, but frankly they haven't had a problem when it comes to hardware. They literally told developers that the hardware would be changing.



  • GruenSein
    replied
    Originally posted by lucrus View Post

    Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside, it's not only casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keeping the thing hard or impossible to hack (in the positive meaning of the term).
    The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
    Because that's the only way to keep the ecosystem closed enough.

    Apple is different. They think different, you know. They think how to keep their users from using anything non-Apple.
    Many reviews disagree with your assessment of the M1 and find it extremely fast and versatile considering its power figures. Since Apple has always focused heavily on thin and light devices, it provides many benefits over competing solutions. And even if you think that other solutions are at least on par or better, there is one more obvious reason to have an in-house solution: keeping profits in-house. Why should Apple (or anyone, for that matter) use third-party products if they find that they can achieve adequate performance without anyone else taking a cut of the final profits? Samsung has been making subpar Exynos SoCs for years instead of simply buying Qualcomm's, for exactly this reason. It had nothing to do with "keeping their devices locked in," which is made fairly obvious by the fact that they also used Qualcomm SoCs.

    All I am saying is: if you think that Apple spends much time wondering what Linux users think, you are way off. Designing an entire (industry-leading) SoC just so a few people cannot run Linux on their stuff anymore doesn't seem reasonable, because hardly anyone buying their stuff even wants to do that. Linux users are such a tiny fraction of Apple's key customer base (home users) that they simply do not care whether or not you get Linux to run on it. Not caring also means that they build an SoC which does exactly what they want to offer the customer (Metal-accelerated software) without including stuff that might make writing Vulkan drivers easier. They won't help with alternative uses of their hardware, because that help does not generate profits. However, that is different from actively hindering such efforts.
    Last edited by GruenSein; 19 April 2021, 10:43 AM.



  • lucrus
    replied
    Originally posted by GruenSein View Post
    I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
    Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside, it's not only casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keeping the thing hard or impossible to hack (in the positive meaning of the term).
    The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
    Because that's the only way to keep the ecosystem closed enough.

    Take for example old Windows-only GDI printers: it was a whole different story. In that case the goal was to make a printer cost less than its ink cartridge, so almost no RAM, almost no CPU, almost anything onboard: just the bare minimum to talk to a Windows-only GDI driver. It took reverse engineering to have those paperweights somewhat working outside of Windows, but the hardware maker didn't get in the way on purpose, they just targeted a specific market that wasn't Linux users in order to cut production costs.

    Apple is different. They think different, you know. They think how to keep their users from using anything non-Apple.



  • PCJohn
    replied
    Originally posted by ezst036 View Post
    What effect will these missing features have on potential gaming performance?
    As for reading vertex attributes: no huge performance difference. The driver can insert instructions for reading vertex data into the vertex shader, emulating the missing hardware that way. If I am not mistaken, Mantle also did not expose special hardware to load vertex attributes, probably reflecting AMD not having dedicated hardware for this at the time. Nvidia, however, does have special hardware to load vertex attributes. Not sure about AMD these days.
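The emulation described here is often called "vertex pulling": instead of fixed-function hardware fetching each attribute, the compiled vertex shader computes an address from the vertex index, stride, and attribute offset, and loads the raw bytes itself. A minimal sketch of that address arithmetic in Python (the interleaved buffer layout, stride, and offsets are made-up illustration values, not anything specific to the M1):

```python
import struct

# Interleaved vertex buffer: position (3 floats) + uv (2 floats) per vertex,
# i.e. a 20-byte stride. Layout values are illustrative assumptions.
STRIDE = 20
vertices = [((0.0, 0.0, 0.0), (0.0, 0.0)),
            ((1.0, 0.0, 0.0), (1.0, 0.0)),
            ((0.0, 1.0, 0.0), (0.0, 1.0))]
buf = b"".join(struct.pack("<5f", *pos, *uv) for pos, uv in vertices)

def fetch_attr(buf, vertex_id, offset, count):
    """What the driver-inserted shader code does: compute an address from
    the vertex index, the stride, and the attribute's byte offset, then
    load `count` raw floats from the buffer."""
    base = vertex_id * STRIDE + offset
    return struct.unpack_from(f"<{count}f", buf, base)

# The emulated fetch recovers exactly the attributes a fixed-function
# vertex fetcher would have delivered:
for vid, (pos, uv) in enumerate(vertices):
    assert fetch_attr(buf, vid, 0, 3) == pos    # position at offset 0
    assert fetch_attr(buf, vid, 12, 2) == uv    # uv at offset 12
```

The cost is a few extra integer and load instructions per attribute at the top of each vertex shader invocation, which is why the performance impact tends to be small.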



  • ezst036
    replied
    Originally posted by phoronix View Post
    For all the visible hardware features, it’s equally important to consider what hardware features are absent. Intriguingly, the GPU lacks some fixed-function graphics hardware ubiquitous among competitors. For example, I have not encountered hardware for reading vertex attributes or uniform buffer objects.
    What effect will these missing features have on potential gaming performance?



  • ThoreauHD
    replied
    Beyond the half-exposed APU, I don't know how they are going to get through the serialized, encrypted components of that motherboard. Even repair people are having a hard time with it.

