Rust For The Linux Kernel Sent Out For Review A Fourth Time


  • shmerl
    replied
    Originally posted by jacob View Post

    I think "zero cost abstractions" doesn't have the same meaning for C++ and Rust. In C++, it means that a language feature you don't use in your code should have no cost in the resulting binary. For example, if you don't use inheritance then your data structures won't have any vtables; if you use [] rather than at() to access your vectors, the resulting assembly code will be identical to what you'd get with plain C arrays, etc. Of course that's an ideal and there are many exceptions to it.

    In Rust "zero cost" is supposed to mean that when using the language's abstractions, the resulting assembly should ideally be as efficient and optimised as if you wrote low-level code and did everything manually. That's why you can create complex operations using iterator combinators and it will (ideally) compile to the same thing as a plain old for loop in C (without unnecessary bounds checks), your code will autovectorise to take advantage of SSE, AVX etc as if you wrote it by hand using intrinsics, and so forth. Again, this is an ideal and it doesn't always work out that way, but increasingly often it does. It's the goal and the compiler gets better and better at it as time goes by.
    Yeah, the terminology is a bit mixed up, but I mean it in Rust's sense. C++ doesn't attempt to do that.

    Rust also tries to avoid hidden performance overhead caused by syntax, which sometimes makes its expressions more verbose than C++'s (that's why Rust has no neat built-in array/vector initializers, for example, and prefers macros for that). That is also in line with its zero-cost abstraction approach.
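    A minimal sketch of that point (illustrative code, not from the thread): Rust's array literal allocates nothing, while the heap-allocated vector is spelled out through the vec! macro or an explicit constructor, so the allocation is visible in the source.

```rust
fn main() {
    let a = [1, 2, 3];            // plain array literal: fixed size, no allocation
    let v = vec![1, 2, 3];        // vec! macro: the heap allocation is explicit
    let w = Vec::from([1, 2, 3]); // the same vector built without the macro

    assert_eq!(v, w);
    assert_eq!(v, a);
    println!("{:?} {:?}", a, v);
}
```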
    Last edited by shmerl; 13 February 2022, 10:16 PM.

  • jacob
    replied
    Originally posted by shmerl View Post

    Type safety is probably the biggest one. But the lack of more powerful abstractions is also a major pain point. Rust at least attempts to provide what it calls "zero cost" abstractions, which in theory should be better than both C (not enough abstractions) and C++ (abstractions that aren't zero cost).

    The main premise of Rust is to allow producing code that is both performant and memory safe, which neither C nor C++ can do at the same time.
    I think "zero cost abstractions" doesn't have the same meaning for C++ and Rust. In C++, it means that a language feature you don't use in your code should have no cost in the resulting binary. For example, if you don't use inheritance then your data structures won't have any vtables; if you use [] rather than at() to access your vectors, the resulting assembly code will be identical to what you'd get with plain C arrays, etc. Of course that's an ideal and there are many exceptions to it.

    In Rust "zero cost" is supposed to mean that when using the language's abstractions, the resulting assembly should ideally be as efficient and optimised as if you wrote low-level code and did everything manually. That's why you can create complex operations using iterator combinators and it will (ideally) compile to the same thing as a plain old for loop in C (without unnecessary bounds checks), your code will autovectorise to take advantage of SSE, AVX etc as if you wrote it by hand using intrinsics, and so forth. Again, this is an ideal and it doesn't always work out that way, but increasingly often it does. It's the goal and the compiler gets better and better at it as time goes by.
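    That idea can be sketched in a few lines (names and numbers here are illustrative, not from the thread): the combinator pipeline and the hand-written loop compute the same thing, and the compiler's goal is to emit equally efficient code for both.

```rust
// Iterator-combinator version: filter, square, sum in one pipeline.
fn sum_of_even_squares_iter(data: &[i64]) -> i64 {
    data.iter()
        .filter(|&&x| x % 2 == 0)
        .map(|&x| x * x)
        .sum()
}

// Manual-loop version: what the pipeline should ideally compile down to.
fn sum_of_even_squares_loop(data: &[i64]) -> i64 {
    let mut total = 0;
    let mut i = 0;
    while i < data.len() {
        let x = data[i]; // a bounds check the optimiser can usually elide here
        if x % 2 == 0 {
            total += x * x;
        }
        i += 1;
    }
    total
}

fn main() {
    let v = [1, 2, 3, 4, 5, 6];
    assert_eq!(sum_of_even_squares_iter(&v), sum_of_even_squares_loop(&v));
    println!("{}", sum_of_even_squares_iter(&v)); // 4 + 16 + 36 = 56
}
```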

  • shmerl
    replied
    Originally posted by STiAT View Post

    While I agree, knowing some of your technical background I'd like to know what design problems come to your mind. For me, most are the use of outdated designs and the lack of type safety, which, I agree, are only a thing due to backwards compatibility. While you can use void* pointers for interfaces, you probably should not (a very simple example, but a real-life one I experienced with mathematicians designing interfaces).
    Type safety is probably the biggest one. But the lack of more powerful abstractions is also a major pain point. Rust at least attempts to provide what it calls "zero cost" abstractions, which in theory should be better than both C (not enough abstractions) and C++ (abstractions that aren't zero cost).

    The main premise of Rust is to allow producing code that is both performant and memory safe, which neither C nor C++ can do at the same time.
    Last edited by shmerl; 13 February 2022, 09:46 PM.

  • STiAT
    replied
    Originally posted by shmerl View Post

    Old, rather than mature. C is also riddled with dated design problems that can't be fixed due to the need for backwards compatibility. C is good, but not good enough in the sense of language design progress. Rust is better.
    While I agree, knowing some of your technical background I'd like to know what design problems come to your mind. For me, most are the use of outdated designs and the lack of type safety, which, I agree, are only a thing due to backwards compatibility. While you can use void* pointers for interfaces, you probably should not (a very simple example, but a real-life one I experienced with mathematicians designing interfaces).
    Last edited by STiAT; 13 February 2022, 09:37 PM.

  • shmerl
    replied
    Originally posted by jacob View Post
    Those are just a few examples for those who believe that C resulted from some deep philosophical epiphany about programming languages. It didn't, it was a quick and dirty hack to write *some* compiler that could work on and produce code for the PDP11.
    Some major versions of C also introduce subtle differences that can be considered breaking, but still, they don't fundamentally change the core concepts that cause problems.

  • jacob
    replied
    Originally posted by shmerl View Post

    Old, rather than mature. C is also riddled with dated design problems that can't be fixed due to the need for backwards compatibility. C is good, but not good enough in the sense of language design progress. Rust is better.
    Interestingly, C in fact had such design problems that they had no choice but to break backwards compatibility. In the initial version of C, the +=, -= etc. operators were written backwards, e.g. =+, =- etc. Question: what does a=-b mean?

    They changed it and broke existing code because they finally admitted that the language was so confusing that it was almost unusable.

    There is more, of course. The inane, non-type-safe function parameter declaration syntax from the pre-ANSI ("K&R" style) versions: there goes another breaking change. The pointer declaration semantics that are too clever by half: many people don't realise this, but declaring int *a doesn't actually mean that a is a pointer to an int...

    Those are just a few examples for those who believe that C resulted from some deep philosophical epiphany about programming languages. It didn't, it was a quick and dirty hack to write *some* compiler that could work on and produce code for the PDP11.

  • STiAT
    replied
    Originally posted by jacob View Post

    You are fighting a losing battle my friend. Those are people who think that before the C language and Unix there was Nothingness(tm), and that those two are also forever the last word in programming language and OS design, respectively. Anything that exists or has existed outside of them shall not be named, and any attempt to evolve beyond them is strictly Verboten.
    I actually still developed on System 360/370 and in PL/X. Since few people still know those, they're highly requested, both to maintain the stuff and to port it. I personally don't see a bright future for those, but they still exist, and companies (especially banks) pay high salaries for people with skills in this field who can maintain them and help to port them. I even thought Cobol/BS2000 were out of the picture until I realized they still exist (sure, Cobol has little to do with OS development, but I still wonder how they still exist, and how the *** they can still maintain BS2000 systems; the hardware should be long gone).

    For those not in the know, BS2000 was released in 1975 and even has a current release as of 2021.
    Last edited by STiAT; 13 February 2022, 09:15 PM.

  • shmerl
    replied
    Originally posted by betty567 View Post

    OS kernels are a low level endeavor. C is an extremely mature and successful low level language.
    Old, rather than mature. C is also riddled with dated design problems that can't be fixed due to the need for backwards compatibility. C is good, but not good enough in the sense of language design progress. Rust is better.

  • jacob
    replied
    Originally posted by CommunityMember View Post

    Actually, the most successful OSs (and their predecessors) were written in System 360/370/ESA/Z assembly language and (in some cases) PL/X. z/OS, z/VM, z/TPF, z/VSE are the current variants. While you may never have heard of those OS's, they do run a significant part of the underlying infrastructure and would be considered advanced and successful.
    You are fighting a losing battle my friend. Those are people who think that before the C language and Unix there was Nothingness(tm), and that those two are also forever the last word in programming language and OS design, respectively. Anything that exists or has existed outside of them shall not be named, and any attempt to evolve beyond them is strictly Verboten.

  • STiAT
    replied
    I have written some device drivers in C, and I do not think Rust will help me a lot with the implementation itself, but on the side of memory and thread safety... yeah. It happens that you just overlook the obvious buffer overflow or race. It happens, no matter how experienced you are. It should not, but it does.

    For me, time was hardly spent on issues with C as a language, but on unpredictable hardware and bad documentation, where the hardware just does stuff it's not supposed to do. Rust will not change that.

    I'd think about using Rust too, but for me Rust in past years meant waiting for features and fixes to land in stable. And my guess is these patches will have the same issue (I didn't even screen them, so it's just a guess).

    When there is a version of Rust they deem fit for the task... sure, I'd consider it. There are better and more knowledgeable people than me to decide where it makes sense and where it doesn't, and they know where their safety issues primarily come from. I don't have the full picture, only the perspective of somebody who has developed device drivers, and for me it would make sense: no extra work, and it kills the issue of me and colleagues making stupid mistakes.

    And to the haters of Redox: I like what they're doing. They are nowhere close, as of the time of writing, to becoming competitive, but it could happen one day in certain markets. Linux has a long history; Redox is young. For me it's a nice proof of concept. Whether it will have any mainstream real-life application in industry or private use remains to be seen, and with Linux around it will certainly be hard, just by the manpower and industry focus behind the Linux kernel today. But Linus himself said he's waiting for the next big thing to replace Linux. It may not be Redox, but something will come up some day. I may not live to see it, though.
    Last edited by STiAT; 13 February 2022, 08:40 PM.
