Even Apple Is Interested In Migrating Their C Code To Rust


  • kpedersen
    replied
    Originally posted by cynical View Post

    One of the main benefits of Java is not having to interact directly with hardware. If you are having to use JNI, then yes, you are now forgoing memory safety and entering the dangerous land of C, in which case an experienced and disciplined C developer is definitely going to do better than you.
    Well, that is kind of the problem. Almost every Java project I have taken part in has involved the JNI: OpenGL, high-performance streaming middleware, Android JNI to access hardware. Unless you treat Java like a non-extensible scripting language (like Awk), you will need the JNI.

    Yes, you may not be interacting with the hardware directly, but someone has had to. Too many developers overlook this JNI usage in almost every project because they just grab a binding dependency. However, as soon as they need to maintain it themselves (i.e. the binding has become obsolete), they completely lack the skill and manpower to do so, and the project collapses.

    Based on my experience, I am personally of the opinion that, in the long run, writing bindings to connect Java to native libraries is harder and more error-prone than just using a native language for the whole project. This is not a common belief among Java developers because they have yet to experience the joys of maintaining a binding. They usually just whine when the binding breaks and get someone else to do the hard work.

    (Obviously by Java I also include VB.NET and C#.NET in this since they are the same technology).
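
    To make that maintenance burden concrete, here is a rough sketch of the native half of such a binding (the class and method names are invented for illustration). The C function name has to mirror the Java package, class and method exactly, and every argument is marshalled by hand; that is precisely the part that silently rots when either side changes.

        /* Hypothetical native half of a binding for an imaginary class
         * com.example.NativeBuffer declaring: native int fill(byte[] dst);
         * The exported name must encode the Java side exactly. */
        #include <jni.h>
        #include <string.h>

        JNIEXPORT jint JNICALL
        Java_com_example_NativeBuffer_fill(JNIEnv *env, jobject self, jbyteArray dst)
        {
            (void)self;                              /* unused */

            if (dst == NULL)
                return -1;

            jsize len = (*env)->GetArrayLength(env, dst);

            /* Get a pointer the GC will not invalidate while we hold it. */
            jbyte *buf = (*env)->GetByteArrayElements(env, dst, NULL);
            if (buf == NULL)
                return -1;                           /* OutOfMemoryError pending */

            memset(buf, 0, (size_t)len);             /* stand-in for the real native work */

            /* Copy back and release; forgetting this leaks the elements. */
            (*env)->ReleaseByteArrayElements(env, dst, buf, 0);
            return len;
        }

    None of those calls are checked by the Java compiler, which is why picking up an abandoned binding is much harder than it looks.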

    Leave a comment:


  • ssokolow
    replied
    Originally posted by kpedersen View Post
    Using the term "error free" is naive. However, I am confident that disciplined C developers adhering to good standards like MISRA will develop software which has fewer memory errors than Java or C# when dealing with something like OpenGL or device drivers.
    Of course, that assumes you actually get programmers to follow them:

    Toyota Unintended Acceleration and the Big Bowl of “Spaghetti” Code

    Koopman was highly critical of Toyota’s computer engineering process. The accepted, albeit voluntary, industry coding standards were first set by Motor Industry Software Reliability Association (MISRA) in 1995. Accompanying these rules is an industry metric, which equates broken rules with the introduction of a number of software bugs: For every 30 rule violations, you can expect on average three minor bugs and one major bug. Toyota made a critical mistake in declining to follow those standards, he said.

    When NASA software engineers evaluated parts of Toyota’s source code during their NHTSA contracted review in 2010, they checked 35 of the MISRA-C rules against the parts of the Toyota source to which they had access and found 7,134 violations. Barr checked the source code against MISRA’s 2004 edition and found 81,514 violations.

    Toyota substituted its own process, which had little overlap with the industry standard. Even so, Toyota’s programmers often broke their own rules. And they failed to keep adequate track of their departures from those rules – and the justification for doing so, which is also standard practice. Koopman testified that if safety is not baked into the recipe in the process of creating the product, it cannot be added later.
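
    For anyone who has not run into MISRA-style rules, here is an illustrative fragment (invented for this post, not from any real ECU) showing the kind of pattern they flag, next to a conforming version:

        /* Illustrative only. This compiles, and without the right warning
         * flags nothing complains: a missing break falls through, and a
         * missing default silently ignores an enum value. */
        #include <stdio.h>

        enum pedal_state { PEDAL_IDLE, PEDAL_PRESSED, PEDAL_FAULT };

        static void handle(enum pedal_state s)
        {
            switch (s) {
            case PEDAL_PRESSED:
                printf("accelerate\n");
                /* missing break: falls through into the next case */
            case PEDAL_IDLE:
                printf("hold\n");
                break;
            /* missing default: PEDAL_FAULT is silently ignored */
            }
        }

        /* A conforming version makes every path explicit. */
        static void handle_checked(enum pedal_state s)
        {
            switch (s) {
            case PEDAL_PRESSED:
                printf("accelerate\n");
                break;
            case PEDAL_IDLE:
                printf("hold\n");
                break;
            default:
                printf("fault: fail safe\n");
                break;
            }
        }

        int main(void)
        {
            handle(PEDAL_PRESSED);      /* prints both lines because of the fallthrough */
            handle_checked(PEDAL_FAULT);
            return 0;
        }

    Static checkers enforce rules like these mechanically; the point of the article is that Toyota neither followed the industry rule set nor kept track of its deviations from its own.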

    Leave a comment:


  • duby229
    replied
    Originally posted by cynical View Post

    It's automatic memory management.
    Automatic memory management (which I think is an asinine term; it's not actually true) is more than a garbage collector.

    Leave a comment:


  • cynical
    replied
    Originally posted by duby229 View Post

    GC is -NOT- memory management...
    It's automatic memory management.

    Leave a comment:


  • cynical
    replied
    Originally posted by kpedersen View Post

    Using the term "error free" is naive. However, I am confident that disciplined C developers adhering to good standards like MISRA will develop software which has fewer memory errors than Java or C# when dealing with something like OpenGL or device drivers.

    Remember that in Java / .NET, when accessing hardware, you still have to use raw memory correctly. You also have to "pin" the GC memory so that the GC doesn't move or collect it out from under you. You still have to manually free memory that is no longer used and that the GC does not know about.

    But of course the typical managed developer will just grab a (probably obsolete) binding and blame someone else when their project turns to mush. Been there, done that; I have worked with Unity / C# developers before and they are the worst XD.

    C and C++ also have third-party garbage collectors, plus debugging tools like ASan and Valgrind. Tools for debugging raw memory in Java or .NET are very primitive because that is not a common use case for those technologies.
    One of the main benefits of Java is not having to interact directly with hardware. If you are having to use JNI, then yes, you are now forgoing memory safety and entering the dangerous land of C, in which case an experienced and disciplined C developer is definitely going to do better than you. My point is not to bash C, as Java is not suitable for the areas where C is popular. I'm just saying that people need to be aware of their own fallibility as humans and have some respect for C as the sharp tool that it is. If the most skilled C/C++ developers (the JVM guy I was talking about is Cliff Click, and he definitely knows what he's doing) make mistakes, then it can happen to anyone.
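
    On the quoted point about C and C++ having third-party garbage collectors: that much is true, and the usual example is the Boehm-Demers-Weiser collector, which bolts onto ordinary C. A minimal sketch, assuming libgc is installed (build with something like cc demo.c -lgc):

        #include <gc.h>
        #include <stdio.h>

        int main(void)
        {
            GC_INIT();

            for (int i = 0; i < 1000000; i++) {
                /* Allocated through the collector and never freed by hand;
                 * unreachable blocks are reclaimed on later allocations. */
                char *p = GC_MALLOC(64);
                snprintf(p, 64, "block %d", i);
            }

            printf("GC heap size: %lu bytes\n",
                   (unsigned long) GC_get_heap_size());
            return 0;
        }

    ASan and Valgrind, by contrast, are debugging tools rather than collectors, but either way the raw-memory tooling on the C side is much richer than what the managed runtimes offer.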

    Leave a comment:


  • duby229
    replied
    Originally posted by kravemir View Post

    Yep, Rust is awesome regarding performance. It is probably the fastest language that provides a safe memory access guarantee (no dangling pointers), plus, as an extra, a no-data-race guarantee. And Rust also has great syntax and semantics.

    Yet Rust doesn't replace C#/Java, which have fully automatic memory management requiring (almost) zero assistance from the programmer. In that genre of languages, golang (when/if/after it gets generics) is the future.
    GC is -NOT- memory management...

    Leave a comment:


  • Luke_Wolf
    replied
    Originally posted by jabl View Post
    Someone much more experienced with Rust than me once wrote that Rust development is, for him, maybe 20-30% slower than Ruby, which he also was fairly familiar with. Rust certainly has a learning curve, and more boilerplate than a high-level dynamic language like Ruby, but the thing is, the compiler helps catch so many mistakes you'd otherwise spend time chasing down manually.
    So admittedly my experience with Rust is mostly hobby projects, but personally I don't really feel slower writing Rust than any of the other big-iron languages. Actually, I feel faster writing Rust than C++ (partly because CMake, and the build system story for C and C++ in general, is a pain in the ass), and at least as fast as writing Java. A bit slower than C#, but especially with the advent of rust-analyzer in place of the Rust Language Server I don't feel like I'm losing out on much in comparison. I wouldn't be surprised if, at a larger scale than I currently write at, Rust turned out to be faster to write than even C# because of the guarantees it makes about the code.

    Leave a comment:


  • jabl
    replied
    Originally posted by kravemir View Post

    Rust answers, and solves, issues and bugs arising from data races.
    Not only data races, but also memory safety, which matters for all programs, not only multi-threaded ones (see the sketch at the end of this post).

    Though it might be overkill, it is currently the most fitting answer for mission-critical infrastructure at a trillion-dollar company. Better to make development five times slower than to risk a tiny, hard-to-see human error with catastrophic effects later.
    Someone much more experienced with Rust than me once wrote that Rust development is, for him, maybe 20-30% slower than Ruby, which he also was fairly familiar with. Rust certainly has a learning curve, and more boilerplate than a high-level dynamic language like Ruby, but the thing is, the compiler helps catch so many mistakes you'd otherwise spend time chasing down manually.
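
    To make the memory-safety half of that concrete (the sketch promised above): here is the kind of bug in question in plain C, a heap use-after-free that a typical compiler accepts without complaint and that Rust's borrow checker refuses to compile.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        int main(void)
        {
            char *msg = malloc(32);
            if (msg == NULL)
                return 1;
            strcpy(msg, "hello");

            free(msg);

            /* Use after free: undefined behaviour, yet it "works" often
             * enough to slip past casual testing. */
            printf("%s\n", msg);
            return 0;
        }

    ASan or Valgrind will catch this at runtime if a test happens to exercise it; Rust moves the check to compile time, no threads involved.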

    Leave a comment:


  • Guest
    Guest replied
    Originally posted by wizard69 View Post

    I'm certain many feel this way, but the real question is: is Rust the right answer? I just don't know, and honestly Rust never had the magnetism for me that others seem to see in it.
    Rust answers, and solves, issues and bugs arising from data races. Though it might be overkill, it is currently the most fitting answer for mission-critical infrastructure at a trillion-dollar company. Better to make development five times slower than to risk a tiny, hard-to-see human error with catastrophic effects later.

    Rust has (currently) zero magnetism for me, either. But that doesn't invalidate the fact that Rust is currently the best tool for the specified job/area.

    Leave a comment:


  • johanb
    replied
    Originally posted by kpedersen View Post
    If Rust could directly consume C headers and link against C libraries, it would be a game changer.
    Try ziglang

    Leave a comment:
