Latest Patches Sent Out For Adding Rust Support To The Linux Kernel


  • oleid
    replied
    Originally posted by ultimA View Post

    If there is a "bottleneck on the time of skilled bug-finders doing the looking", that means fewer bugs will be found, because basically there is a shortage of human resources. So if fewer bugs are being found due to a human-resource problem, and moreover in software with fewer bugs to begin with (because it now has Rust parts), that should result in an even more significant decrease in the number of CVEs, not in the numbers staying constant as you conclude. So I think you have a logic error in your reasoning, and thus this is not the explanation. Unless I misinterpreted something in what you said, in which case please do correct me.
    Nah, I don't think it works that way.
    Bugs are looked for when there is a bug report. Assuming we have the same number of bug hunters as before adding Rust to Firefox (which was the original premise), you argue bugs are harder to find because there are fewer of them.

    One could also argue: bugs are easier to find because it is more obvious where to look for them (unsafe blocks, or the C++/Rust language interface), or because the compiler is more helpful.

    But let's face it: anything is possible, and somebody should write a research paper about it.



  • ultimA
    replied
    Originally posted by ssokolow View Post

    How about this:

    CVEs don't correlate with the number of bugs present. They correlate with the number of bugs discovered. That means that, if the number of bugs discovered is heavily bottlenecked on the time of skilled bug-finders doing the looking, you may not see a change in the number of CVEs after a partial rewrite, since supply of bugs to be found still greatly outstrips the supply of bug-seekers.
    If there is a "bottleneck on the time of skilled bug-finders doing the looking", that means fewer bugs will be found, because basically there is a shortage of human resources. So if fewer bugs are being found due to a human-resource problem, and moreover in software with fewer bugs to begin with (because it now has Rust parts), that should result in an even more significant decrease in the number of CVEs, not in the numbers staying constant as you conclude. So I think you have a logic error in your reasoning, and thus this is not the explanation. Unless I misinterpreted something in what you said, in which case please do correct me.



  • ssokolow
    replied
    Originally posted by ultimA View Post
    And none of you has provided an explanation why this shouldn't be the case, only just blank claims that "it isn't and that's it".
    How about this:

    CVEs don't correlate with the number of bugs present. They correlate with the number of bugs discovered. That means that, if the number of bugs discovered is heavily bottlenecked on the time of skilled bug-finders doing the looking, you may not see a change in the number of CVEs after a partial rewrite, since supply of bugs to be found still greatly outstrips the supply of bug-seekers.
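    The bottleneck argument can be sketched as a toy model (all numbers are made up, purely illustrative, not real CVE data): as long as the pool of latent bugs greatly exceeds what bug-seekers can audit per release cycle, shrinking the pool barely changes the number of bugs discovered, and hence the CVE count stays flat.

    ```rust
    // Toy model: discovery is capped by seeker capacity, not by the
    // number of bugs that exist. The figures below are invented for
    // illustration only.
    fn bugs_found(bug_pool: u64, seeker_capacity: u64) -> u64 {
        // Seekers can only report as many bugs as they have time to find.
        bug_pool.min(seeker_capacity)
    }

    fn main() {
        let capacity = 100; // bugs that seekers can find per cycle
        let before = bugs_found(10_000, capacity); // pre-rewrite bug pool
        let after = bugs_found(5_000, capacity);   // pool halved by a rewrite
        // The pool halved, yet the discovery count (the CVE proxy) is
        // unchanged, because supply still outstrips seeker capacity.
        println!("before={} after={}", before, after); // before=100 after=100
    }
    ```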



  • mdedetrich
    replied
    Originally posted by ultimA View Post
    mdedetrich You come and accuse people of being deceptive, shoddy, not having even the faintest understanding etc. but the fact is it is perfectly logical: If you replace a significant percentage of a codebase with code that in theory eliminates a large category of errors, then it is perfectly sane logic to expect that the reduction of errors should be visible in statistics after the rewrite.
    No, it's not, because it's also entirely possible that the replaced code didn't happen to have many bugs left, since most were ironed out over 10-15 years, with plenty of bug reports/security breaches/CVEs along the way (we are talking about Firefox here). The difference is that it took Firefox 10-15 years to discover and get rid of those bugs, whereas many of them wouldn't even have compiled in Rust.

    Also, in some cases the Rust code replacing the Firefox code wasn't a like-for-like replacement: the multithreaded/GPU CSS renderer in Quantum replaced a basic single-threaded/CPU CSS renderer (which had stayed single-threaded because of how terrible C/C++ is for multithreaded code, a convenient thing you ignored when claiming that C++ is as good as Rust; it's not). The C++ code in Firefox was so bad that people were scared to touch it for major features because of the high chance of introducing security vulnerabilities/bugs, which, funnily enough, is the reason those same Firefox devs came up with Rust in the first place: that's how painful dealing with C++ was (and Firefox is one of the biggest C++ codebases out there, aside from Chromium).

    Great job demonstrating that both of you are more concerned with proving a point than with being accurate; cue the golf clap.



  • ultimA
    replied
    mdedetrich You come and accuse people of being deceptive, shoddy, not having even the faintest understanding etc. but the fact is it is perfectly logical: If you replace a significant percentage of a codebase with code that in theory eliminates a large category of errors, then it is perfectly sane logic to expect that the reduction of errors should be visible in statistics after the rewrite. Yet it isn't. And none of you has provided an explanation why this shouldn't be the case, only just blank claims that "it isn't and that's it". That's real bias, but on your part! Give a logical explanation; until then, you seem to be the one being deceptive and shoddy (to use your own words). Of course the conclusion wasn't "shot down the drain" right at the beginning, owing to the fact that nobody has yet provided any other reasonable explanation. I'm not saying there isn't one, but if there is, it is obviously counterintuitive and non-obvious, and since you yourself haven't yet been able to explain it, despite actively taking part in the discussion for some time, go and shove your personal insults "down the drain".

    The statistics are noisy or wrong? They don't truly represent the proportion of bugs? Possibly, but then how come you use the same statistics to point out how buggy C++ software is? The answer, of course, is that the statistics are correct when they prove your point, but become magically wrong otherwise. Again, stop with the personal accusations and explain the phenomenon instead of being a stuck-up know-it-all who must be believed.



  • aspen
    replied
    Originally posted by Redfoxmoon View Post
    I'll get me popcorn.
    It's like a curse: almost any article related to Rust on this site will have a shitshow in the comments (and sometimes even someone complaining about LLVM for no good reason!)



  • mdedetrich
    replied
    Originally posted by moltonel View Post

    That's a uselessly weak proof. You have one bit of information (amount of rust code in Firefox) on a complex project and a very noisy bit of data (number of Firefox CVEs of all kinds) and deduce a causal link and even a simplistic explanation for the surprising data ("if it's not Rust memory bugs it must be Rust logic bugs"). This is a naive view on a complex subject, that fell prey to availability bias and motivated reasoning.

    FWIW, I find it easier to model my logic in Rust than in C++, suggesting fewer logic bugs in my Rust code, thanks to Rust enums (akin to C++ tagged unions, but better and deeply idiomatic), Result types instead of exceptions, more explicit mutability, traits instead of inheritance, a stronger community push towards foolproof APIs, etc.
    I would argue it's downright deceptive/misleading, and anyone with even a faint objective understanding of how statistics works would have shot such a shoddy conclusion down the drain.



  • moltonel
    replied
    Originally posted by ferry View Post
    The statistics suggest memory bugs are reduced at the expense of logic bugs, resulting in approximately the same number of CVEs.
    That's a uselessly weak proof. You have one bit of information (amount of rust code in Firefox) on a complex project and a very noisy bit of data (number of Firefox CVEs of all kinds) and deduce a causal link and even a simplistic explanation for the surprising data ("if it's not Rust memory bugs it must be Rust logic bugs"). This is a naive view on a complex subject, that fell prey to availability bias and motivated reasoning.

    FWIW, I find it easier to model my logic in Rust than in C++, suggesting fewer logic bugs in my Rust code, thanks to Rust enums (akin to C++ tagged unions, but better and deeply idiomatic), Result types instead of exceptions, more explicit mutability, traits instead of inheritance, a stronger community push towards foolproof APIs, etc.
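    A minimal sketch of two of the features listed above, enums and Result (the `Connection` type here is hypothetical, chosen purely for illustration): the enum makes invalid states unrepresentable, `match` must cover every variant or the code doesn't compile, and `Result` puts the error path in the function signature, where a C++ exception would be invisible to the caller.

    ```rust
    // Hypothetical connection state modeled as a Rust enum. Adding a
    // new variant later makes every non-exhaustive `match` a compile
    // error, instead of a latent logic bug.
    #[derive(Debug, PartialEq)]
    enum Connection {
        Closed,
        Connecting { retries: u32 },
        Open { session_id: u64 },
    }

    // Result instead of exceptions: the error path is part of the type,
    // so callers cannot silently ignore it.
    fn session_id(conn: &Connection) -> Result<u64, String> {
        match conn {
            Connection::Open { session_id } => Ok(*session_id),
            Connection::Connecting { retries } => {
                Err(format!("still connecting (retry {})", retries))
            }
            Connection::Closed => Err("connection closed".to_string()),
        }
    }

    fn main() {
        let conn = Connection::Open { session_id: 42 };
        assert_eq!(session_id(&conn), Ok(42));
        assert!(session_id(&Connection::Closed).is_err());
    }
    ```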



  • ssokolow
    replied
    Originally posted by moltonel View Post
    Rust usually has tighter APIs, enabling devs to reason more locally without having to check foreign code or manually uphold invariants as much as C++.
    This phrase needs to be emphasized more, because it's one of the big, important design goals of Rust.
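    One common way this local reasoning shows up is the newtype pattern; the `Percent` type below is a hypothetical sketch, not anything from Firefox. The invariant is checked once, in the constructor, so every function that receives the value can rely on it without re-validating or auditing its callers.

    ```rust
    // Hypothetical illustration of a "tight API": a percentage that is
    // guaranteed to be in range by construction.
    #[derive(Debug, Clone, Copy, PartialEq)]
    struct Percent(u8);

    impl Percent {
        // The only way to obtain a Percent is through this check.
        fn new(value: u8) -> Result<Percent, String> {
            if value <= 100 {
                Ok(Percent(value))
            } else {
                Err(format!("{} is not a valid percentage", value))
            }
        }

        fn get(self) -> u8 {
            self.0
        }
    }

    // No bounds check needed here: the type already upholds the
    // invariant, so this function can reason purely locally.
    fn remaining(p: Percent) -> u8 {
        100 - p.get()
    }

    fn main() {
        let p = Percent::new(30).unwrap();
        assert_eq!(remaining(p), 70);
        assert!(Percent::new(130).is_err());
    }
    ```

    The equivalent C++ API would typically take a raw integer and either document the precondition or re-check it at every call site; here the compiler enforces it once.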



  • mdedetrich
    replied
    Originally posted by ferry View Post

    The statistics suggest memory bugs are reduced at the expense of logic bugs, resulting in approximately the same number of CVEs.
    Uh what?

