Asahi Linux May Pursue Writing Apple Silicon GPU Driver In Rust

  • erniv2
    replied
    Isn't this topic getting "Rusty"?

  • andrei_me
    replied
    Originally posted by jaxa View Post

    You mean Hector Martin?
    Not in this case; the GPU side is spearheaded by Alyssa Rosenzweig (https://twitter.com/alyssarzg) and Asahi Lina (https://twitter.com/LinaAsahi).

    The interesting part is that Lina livestreams the development sessions, and they are recorded, so anyone can watch the 100+ hours of work spent on this and follow the train of thought, the decisions made along the way, etc.

  • mdedetrich
    replied
    Originally posted by ferry View Post

    What I mean is that if you are investing money into something and you are introducing two large risks, you are likely to lose your money. Now, as a hobbyist you don't care, as you put in mostly your spare time. That is not a bad thing. Now, indeed, GCC and Linux itself were at the time considered hobbyist projects.

    At that time there were commercial Unixes (HP-UX comes to mind) and their associated C compilers. Someone, at the beginning, took the risk of developing hardware, OS and compilers/assemblers at the same time. And succeeded, congrats. But that was at a time when you needed to punch in the machine code instructions to boot your PDP-8.

    The first versions of Unix were written in assembly and only later rewritten in C, all step by step. And even then, the work was done by labs (like Bell) and universities (like MIT).
    If that's your standard for being proven, then you shouldn't have any issues with Rust.

  • Developer12
    replied
    Originally posted by kylew77 View Post
    I don't like to learn new things, so I hate Rust, but I actually have a technical reason why Rust is the wrong choice: other operating systems working on the M1 or M2. OpenBSD has a port for the Mac M1 and I think NetBSD does too. They will never accept a Rust driver into their OS, so why not stick to C so everybody can benefit?
    BSD etc. will have to rewrite the kernel driver anyway, as usual. Still, the Linux implementation (and its copious documentation) will be educational for them.

  • Developer12
    replied
    Originally posted by Dukenukemx View Post
    This doesn't replace macOS on the internal SSD? Is that a choice, or does Apple prevent you from doing this?
    The prototype driver is a hack that requires two computers tied together over USB. The actual Asahi install has always been installable alongside macOS, just as you can install multiple versions of macOS at once.

    Edit: the ability to install multiple macOS/Linux/whatever OSes on the same SSD in the same system is something that Apple *explicitly* engineered. They spent considerable time and effort producing a system that enables parallel installs of "trusted" and "untrusted" OSes, and this is an officially supported feature (not to mention removing it would break backwards compatibility with previous versions of macOS).
    Last edited by Developer12; 12 August 2022, 12:07 PM.

  • ferry
    replied
    Originally posted by Vistaus View Post

    Hobby-ism, really? You do know that C and GCC weren't "proven" right from the get-go either, right?
    What I mean is that if you are investing money into something and you are introducing two large risks, you are likely to lose your money. Now, as a hobbyist you don't care, as you put in mostly your spare time. That is not a bad thing. Now, indeed, GCC and Linux itself were at the time considered hobbyist projects.

    At that time there were commercial Unixes (HP-UX comes to mind) and their associated C compilers. Someone, at the beginning, took the risk of developing hardware, OS and compilers/assemblers at the same time. And succeeded, congrats. But that was at a time when you needed to punch in the machine code instructions to boot your PDP-8.

    The first versions of Unix were written in assembly and only later rewritten in C, all step by step. And even then, the work was done by labs (like Bell) and universities (like MIT).

  • Vistaus
    replied
    Originally posted by ferry View Post

    The C language and GCC are proven tools for device driver development. Rust is not a proven tool for device driver development; in fact, there is no kernel code at all, except some very trivial examples.

    Writing a driver for undocumented hardware is challenging; doing so with an experimental compiler is ... hobby-ism?
    Hobby-ism, really? You do know that C and GCC weren't "proven" right from the get-go either, right?

  • Vistaus
    replied
    Originally posted by stormcrow View Post
    You can claim bad code all you want, but it is inevitable that humans will write flawed code. No one, no matter how big their ego is, ever writes software more complex than "Hello World!" without errors.
    Except for the people that write "Hello Wolrd!".

  • Vermilion
    replied
    You can always count on fellow Phoronix commenters to know better than the one doing the actual work...

  • pabloski
    replied
    Originally posted by mdedetrich View Post

    The unsafe part is that the driver is going to be addressing the necessary GPU registers/memory locations directly and/or directly making the underlying syscalls. This is always going to be unsafe.
    I want to add something on this point, considering that there is a bit of confusion about unsafety in Rust, something that always produces threads with hundreds of angry anti-Rust comments.

    The unsafe keyword in Rust means that the compiler cannot guarantee anything about the inner workings of the code or about the accesses to the state of the components manipulated by that code.

    It basically means that the programmer is on his own and must pay A LOT of attention to what his code does. But in engineering there is a practical principle stating that small systems are easy to reason about. This is why unsafe code MUST be small and do as few things as possible, and the programmer must spend the necessary time testing and reviewing the code and how it interacts with the "external" world, possibly defining clear protocols by which code components share state.

    And as soon as possible, encapsulate this code into safe interfaces, so that users of the code can use it safely.
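    As a concrete illustration (a minimal, hypothetical sketch, not the actual Asahi driver code), this is roughly what such encapsulation looks like for the GPU-register case mentioned above: the raw pointer arithmetic and volatile accesses live in one tiny type, and everything outside it only ever sees a bounds-checked, safe API.

        use core::ptr;

        /// Hypothetical block of 32-bit GPU registers mapped at `base`.
        /// All of the unsafe pointer work is confined to this one type.
        struct GpuRegs {
            base: *mut u32,
            len: usize, // number of registers in the mapping
        }

        impl GpuRegs {
            /// Safety: `base` must point to a valid mapping of at least `len`
            /// registers and must stay valid for as long as this value is used.
            unsafe fn new(base: *mut u32, len: usize) -> Self {
                GpuRegs { base, len }
            }

            /// Safe interface: the bounds check happens here, so callers can
            /// never poke outside the mapping. The volatile access itself is
            /// still unsafe, but it sits in one small, reviewable spot.
            fn write(&mut self, index: usize, value: u32) {
                assert!(index < self.len, "register index out of range");
                unsafe { ptr::write_volatile(self.base.add(index), value) }
            }

            fn read(&self, index: usize) -> u32 {
                assert!(index < self.len, "register index out of range");
                unsafe { ptr::read_volatile(self.base.add(index)) }
            }
        }

    Callers of read/write never type unsafe themselves; as long as the two invariants stated on new hold, the rest of the code is checked by the compiler as usual.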

    This is the greatest revolution Rust puts on the table. If we take strings in Rust, they are just vectors. They are safe from the "storage" point of view because they use Vec, which encapsulates the unsafeness of building and accessing dynamic-length sequences of elements and makes working with them safe.

    But strings in Rust are UTF-8 encoded too. And UTF-8 has specific rules about what constitutes valid codepoints and sequences of codepoints. So, internally, strings are unsafe from the point of view of what codepoints are stored inside them, meaning that the code that implements strings spares no effort to verify that what is stored in a string is legal UTF-8. And again, it encapsulates this unsafe behavior behind safe interfaces. This means that users of the String type can use it safely.
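    To make the String example concrete, here is a tiny, self-contained sketch (standard library only, nothing Asahi-specific) of that safe boundary: the bytes are validated once at construction, and invalid UTF-8 is rejected instead of silently producing a corrupt String.

        fn main() {
            // Valid UTF-8: these four bytes encode U+1F980 (the crab emoji),
            // so the conversion succeeds and we get an ordinary String.
            let ok = String::from_utf8(vec![0xF0, 0x9F, 0xA6, 0x80]).unwrap();
            println!("{ok}");

            // 0xFF can never appear in valid UTF-8, so the library's
            // (internally unsafe) validation rejects these bytes with an Err
            // instead of handing back a String with illegal contents.
            let bad = String::from_utf8(vec![0xFF, 0xFE, 0x61]);
            assert!(bad.is_err());
        }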

    Obviously everything can have bugs, even unsafe Rust code. But it is easier to find and squash bugs in small, well-defined portions of code. And as soon as you encapsulate them behind safe interfaces, you can leverage the Rust compiler to guarantee that the users of your "library code" can only use it safely.

    So unsafe doesn't mean that everything using unsafe code is unsafe (a common misconception a lot of Rust haters seem to believe). That would defeat the raison d'être of Rust.
    Last edited by pabloski; 12 August 2022, 09:42 AM.
