Asahi Linux May Pursue Writing Apple Silicon GPU Driver In Rust


  • Asahi Linux May Pursue Writing Apple Silicon GPU Driver In Rust

    Phoronix: Asahi Linux May Pursue Writing Apple Silicon GPU Driver In Rust

    When it comes to Apple M1 and M2 support on Linux, one of the biggest obstacles to suitable daily use for end-users is the current lack of GPU acceleration. Reverse engineering of the Apple Silicon graphics processor has been underway, with early experiments carried out under macOS and Asahi's m1n1 environment, and the next step is to start writing a Direct Rendering Manager (DRM) kernel driver. To some surprise, the feasibility of writing this DRM kernel GPU driver in the Rust programming language is being explored...
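
    For context, an in-kernel Rust driver would hook into the kernel through the Rust-for-Linux support code. Below is a minimal, purely illustrative module skeleton in that style; the type name, module name, and strings are placeholders and not taken from the actual (not yet written) Asahi driver.

    Code:
    // Illustrative Rust-for-Linux style module skeleton (placeholder names).
    use kernel::prelude::*;

    module! {
        type: AsahiGpuStub,
        name: "asahi_gpu_stub",
        author: "Example Author",
        description: "Illustrative skeleton of a Rust kernel module",
        license: "GPL",
    }

    struct AsahiGpuStub;

    impl kernel::Module for AsahiGpuStub {
        fn init(_module: &'static ThisModule) -> Result<Self> {
            // A real DRM driver would register with the DRM subsystem here.
            pr_info!("asahi_gpu_stub: loaded\n");
            Ok(AsahiGpuStub)
        }
    }

    impl Drop for AsahiGpuStub {
        fn drop(&mut self) {
            pr_info!("asahi_gpu_stub: unloaded\n");
        }
    }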


  • #2
    Then someone will have to rewrite it in C, I guess...



    • #3
      Originally posted by marios View Post
      Then someone will have to rewrite it in C, I guess...
      To introduce crashes and security issues.



      • #4
        Originally posted by Alliancemd View Post

        To introduce crashes and security issues.
        It looks like some rustafarians take memes too seriously...



        • #5
          Since Apple's GPUs are based on Imagination's PowerVR, maybe they could take a look at that and see if adapting it would work for the Apple GPU?



          • #6
            Originally posted by marios View Post
            Then someone will have to rewrite it in C, I guess...
            (Note that the dev already needs to rewrite it from Python to Rust.)

            C: Not necessarily.

            One of the big arguments for preferring the classic C interface for device drivers over Rust is that GCC (and LLVM, now that it is another compiler able to build the kernel) is available for nearly every instruction set under the sun (in fact, even more than Linux has been ported to or could ever run on), whereas Rust is currently only available for a handful of architectures. The Linux kernel is therefore available on more architectures than Rust is.

            Under normal circumstances, one would want to avoid writing a driver for some PCI device (e.g. a GPU) in a language that only compiles on x86_64 and AArch64 but has no compiler for some other CPU arch (e.g. some OpenSPARC/LEON, some modern SH4, etc.) that also has a PCI bus, as this would render the driver useless for people on those architectures even though they can still connect the PCI device to their machine (and also for that weird guy with the special adaptor board for his Amiga 4000 that lets its 68040 access PCI graphics cards).

            BUT

            In the specific case of Asahi, the GPU is a core that is part of Apple's M1/M2 line of chips. This GPU is only ever found together with a CPU that runs AArch64 - for which there is a Rust compiler.
            Thus one will always be able to compile the driver for any hardware that actually has this GPU, even if the driver doesn't support the full zoo of CPU architectures that GCC and Linux run on.

            (well, save for some very extreme and weird package de-lidding experiments)

            (also, note that if Apple decides to switch architectures yet again in the future and joins the RISC-V bandwagon, that one is also supported by the Rust compiler).
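
            To make the point concrete, a Rust driver can even state that assumption at build time. A hypothetical sketch, not from the actual Asahi work:

            Code:
            // Hypothetical build-time guard: this GPU only ever ships next to an
            // AArch64 CPU, so refusing to build for anything else costs nothing.
            #[cfg(not(target_arch = "aarch64"))]
            compile_error!("this GPU only exists alongside AArch64 CPUs");

            fn main() {
                // On aarch64 this builds and runs; on any other target the
                // compile_error! above fires instead of producing a binary.
                println!("built for {}", std::env::consts::ARCH);
            }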



            • #7
              Originally posted by Dukenukemx View Post
              Since Apple's GPUs are based on Imagination's PowerVR, maybe they could take a look at that and see if adapting it will work on Apple GPU?
              The only open-source driver from Imagination is for their newest mobile GPU, and it isn't even ready yet. So basing this on something that doesn't work yet might be risky. It would also be interesting to know how much of Imagination's IP is actually used in Apple's chips, when the split between Apple and PowerVR occurred, and whether the two are even based on the same technology anymore. But yeah, unifying efforts would be really helpful.
              As for using Rust, I guess this is one of the best occasions to try a new approach, a driver written completely from scratch.



              • #8
                Originally posted by Alliancemd View Post

                To introduce crashes and security issues.
                Caused by bad code.



                • #9
                  Originally posted by marios View Post

                  It looks like some rustafarians take memes too seriously...
                  Actually, marios has spent 30+ pages on Phoronix arguing in favour of C, even though he has no real programming experience and most of his justifications boil down to "I don't like it".

                  So yeah, in this case it's not a meme; he is dead serious (and also dead wrong).

                  Originally posted by tildearrow View Post

                  Caused by bad code.
                  Created by bad programmers picking bad languages, which have a much higher chance of producing bad code.

                  Programmers willingly choosing a worse tool for the job is not a sign of competence.
                  Last edited by mdedetrich; 11 August 2022, 04:01 PM.



                  • #10
                    Originally posted by tildearrow View Post

                    Caused by bad code.
                    Which can be exacerbated by languages (like C) that not only give you the gun, but load it for you with magnum ammunition and encourage you to aim at your feet.

                    You can claim bad code all you want, but it is inevitable that humans will write flawed code. No one, no matter how big their ego is, ever writes software more complex than "Hello World!" without errors. It's debatable whether anyone could even write a hello-world program over and over indefinitely without a single error, short of copy/paste. The fewer footguns the language has built in, the less likely you are to shoot yourself in the foot. That's the premise of langsec, and it's borne out by decades of research, not to mention daily cybersecurity headlines.
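
                    A trivial, hypothetical illustration (not from any real driver): in C an out-of-range index silently reads whatever happens to sit next in memory, while in Rust the same access is bounds-checked and can be handled as an ordinary value:

                    Code:
                    fn main() {
                        let regs = [0u32; 4]; // pretend these are four device registers
                        let idx = 7;          // an index that is out of range

                        // regs[idx] would panic with a clear out-of-bounds message rather
                        // than silently reading adjacent memory like the equivalent C.
                        // The checked accessor turns the failure into a value to handle:
                        match regs.get(idx) {
                            Some(val) => println!("regs[{idx}] = {val}"),
                            None => println!("index {idx} is out of bounds, refusing to touch memory"),
                        }
                    }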

                    Arguing otherwise is literally arguing against science, much like anti-vaxxers latching onto any pseudo-science or feel-good argument that seems to support their otherwise unsupportable stance.

