More Progress Is Made Understanding Apple's M1 GPU, Working Towards An Open Driver
-
I wouldn't be surprised if Apple designed the GPU as much as possible around its own patents.
-
Originally posted by Vistaus View Post
More ports? But everyone, even Apple fans, is always saying that Apple won't add any more ports to their products…
Until the advent of the M1, Apple's Mac line had been pretty stagnant, to the point that I had to wonder if they were trying to kill the entire line off. So the hope is that the M1 and the processors moving forward will lead to rapid advancement of the hardware offerings and, frankly, far more innovation. For example, why does an M1 computer have to be anything more substantial than a keyboard? Or why not make a Mini with a real discrete GPU that doesn't suck? The thermal envelope of these processors opens up a massive number of possibilities for new designs.
Dave
-
Originally posted by lucrus View Post
Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside; it's not just a casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keep the thing hard or impossible to hack (in the positive meaning of the term).
The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
Because that's the only way to keep the ecosystem closed enough.
Take for example old Windows-only GDI printers: that was a whole different story. In that case the goal was to make a printer cost less than its ink cartridge, so almost no RAM, almost no CPU, almost nothing onboard: just the bare minimum to talk to a Windows-only GDI driver. It took reverse engineering to get those paperweights somewhat working outside of Windows, but the hardware maker didn't get in the way on purpose; they just targeted a specific market that wasn't Linux users in order to cut production costs.
Apple is different. They think different, you know. They think how to keep their users from using anything non-Apple.
From my point of view you have many non-arguments here, or complete misunderstandings about Apple Silicon. The M1 isn't perfect and frankly has significant shortcomings for much of Apple's hardware needs (thus the low-end machines), but the hardware and software combo is far better than you are alluding to. In fact I'm rather stunned at just how well Apple has been able to pull off this platform change. The volume of software that is already ARM-native is surprising, and the stuff that isn't often runs very well on these machines. Objectively, it is hard not to be impressed by what Apple has achieved.
Now, would I prefer an ARM-based laptop running Linux with all native drivers? Most likely I would. In fact this M1-based Apple hardware has me salivating at the thought. If Apple drags the rest of the industry over to ARM, I will be very happy indeed. The reality is that I've never had an x86 platform with the required performance that has the battery life these systems have. Never! Nor do I see such a machine coming anytime soon. Though it has become a bit of a joke, x86 laptops only run cool when doing nothing; otherwise they are power hogs.
I can actually see Apple adding a strip of solar cells along the top of the keyboard and getting useful power relative to the rest of the system. They will not be able to eliminate batteries anytime soon, but they should be able to offer a real runtime extension in many situations. That is how good Apple Silicon is, and it will only get better moving forward. Even if that strip only offered one watt of power, it would positively impact battery runtime per charge. In a very real sense we have a new generation of hardware that can lead to a new way of looking at devices. I just want to see such capabilities in the Linux world, so if somebody hacking drivers for the M1 can help lead to that, I'm all for it. Would I prefer a more open ARM chip with better software support from the vendor? Certainly (I hope AMD is listening), but sometimes companies need a kick in the virtual pants to step outside their comfort zones. Who knows, maybe AMD's collaboration with Samsung (I think) will lead to an ARM-based SoC with open GPU hardware. Of course, we would still need to get a builder to put that chip in a laptop.
-
Originally posted by stingray454 View Post
Still great news. While I don't particularly care for Apple, I must admit that their M1 chips are impressive. With rumors about the M1X / M2 and so on, and computers with more ports and such, it's an interesting option. Sure, I'd love to see other ARM-based laptops with similar performance, especially for emulating x86 software, which we will probably need to do for a while longer. But as it stands, it seems to be 1-2 years before we even have something competitive on the market. I would consider an M1 if/when the hardware is supported, so it's great to see some progress.
-
Originally posted by GruenSein View Post
I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance, when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
Regarding the lacking hardware features: in the end, the goal of all GPUs is the same, so they certainly provide some efficient way to accomplish these tasks. The question is not whether or not the M1 can run the stuff people do with Vulkan; Vulkan is just a way of telling the GPU what to do, and the same goes for Metal. The question is how easy it is to translate the intention phrased in Vulkan into something the M1 understands.
As for Apple's ARM-based hardware, I don't see them breaking stuff on purpose; I see them breaking stuff to reach goals. The complete deletion of 32-bit support is a perfect example. This allowed them to achieve goals that likely included eliminating a lot of circuitry related to 32-bit support, with the end goal of much lower power usage. This shouldn't surprise anybody, because they had been telegraphing the elimination of 32-bit hardware for years at WWDC. In the same manner, they pushed developers really hard to adopt APIs knowing that the underlying hardware would change. They actually said so in at least a few WWDC videos. So anybody complaining about changing hardware from Apple simply hasn't been paying attention, or simply can't digest the information being fed to them.
Apple does a lot of disgusting things but frankly they haven't had a problem when it comes to hardware. They literally told developers that hardware will be changing.
-
Originally posted by lucrus View Post
Their "ecosystem" is their invention and they designed it the way it is exactly to keep others outside; it's not just a casual lack of support for this or that OSS. Their "ecosystem" includes many "features" that are at best useless to the end user, but they're there nevertheless because they help keep the thing hard or impossible to hack (in the positive meaning of the term).
The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
Because that's the only way to keep the ecosystem closed enough.
Apple is different. They think different, you know. They think how to keep their users from using anything non-Apple.
Last edited by GruenSein; 19 April 2021, 10:43 AM.
-
Originally posted by GruenSein View Post
I find it baffling that there are always some people who get really upset because they think that Apple intentionally breaks OSS support or cripples OpenGL/Vulkan performance, when the easiest explanation is that they simply do not care about OSS or whether something outside of their ecosystem breaks.
The M1 is not such a beast of performance or whatever when you consider its price (and when you run something on it other than just Apple benchmarks) and compare it with other offerings at the same pricetag, so why bother making your own hardware when there already are equivalent or better choices out there?
Because that's the only way to keep the ecosystem closed enough.
Take for example old Windows-only GDI printers: that was a whole different story. In that case the goal was to make a printer cost less than its ink cartridge, so almost no RAM, almost no CPU, almost nothing onboard: just the bare minimum to talk to a Windows-only GDI driver. It took reverse engineering to get those paperweights somewhat working outside of Windows, but the hardware maker didn't get in the way on purpose; they just targeted a specific market that wasn't Linux users in order to cut production costs.
Apple is different. They think different, you know. They think how to keep their users from using anything non-Apple.
-
Originally posted by ezst036 View Post
What effect will these missing features have on potential gaming performance?
-
Originally posted by phoronix View Post
For all the visible hardware features, it's equally important to consider what hardware features are absent. Intriguingly, the GPU lacks some fixed-function graphics hardware ubiquitous among competitors. For example, I have not encountered hardware for reading vertex attributes or uniform buffer objects.
-
Beyond the half-exposed APU, I don't know how they are going to get past the serialized, encrypted components on that motherboard. Even repair people are having a hard time with it.