Making The Case For Using Rust At Low Levels On Linux Systems


  • #51
    Originally posted by cl333r View Post

    I moved from Gtk3 to Qt5 after dealing with it for about two years and finding out it's crap and, at the time, poorly supported on Windows. The shittiest part is that it's written in C, which is too low level. Python is a hog because it's interpreted; that's one of the reasons Canonical couldn't keep up its Software Center, since Python, like any interpreted language, only scales so far. And the C++ bindings to Gtk3 are crappy because they're still a wrapper around C logic with a bit of OO design. Anyone who tells me C++ isn't good for graphical apps, or any variation of that idea, is only telling me he's not worth my time.
    C++ isn't good for anything. There's a good reason why everyone is moving away from C++ to other languages, and why both Google and Mozilla have tried to ditch C++ for Go and Rust respectively. C++ isn't good for software libraries either, because it has no stable exportable interface. That's why everyone writes their software libraries in C, and why every language implements a C ABI as its foreign interface of choice. Rust, likewise, can export libraries with a C interface, and many languages are now starting to take advantage of that, including Python and Ruby.
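
    To make that concrete, here's a minimal sketch of exporting a C-callable function from Rust; the function name is made up for the example, and the crate would be built as a cdylib or staticlib:

    Code:
        // lib.rs: exposes an unmangled symbol with the C calling convention
        #[no_mangle]
        pub extern "C" fn add_numbers(a: i32, b: i32) -> i32 {
            // callable from C, or from Python via ctypes, Ruby via its ffi gem, etc.
            a + b
        }

    Python, for example, can open the resulting shared library with ctypes and call add_numbers like any ordinary C function.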

    • #52
      Originally posted by totoz View Post
      The verbosity of Ada is intentional: since the language was aimed at safety-critical systems, code readability and clarity were very important, even for newcomers or people not familiar with the language (but with some IT background). When I see modern C++, it's just Klingon for non-C++ developers. Even for me, although I programmed in C++ for years a decade ago.

      About tools, there's an IDE (GNAT Programming Studio) and an Eclipse plugin that you can get for free (GPL). And of course, Emacs ;-)
      Although I do agree that C++ becomes needlessly unreadable once templates come into play, I'd say that «verbosity» has little to do with code readability. Moreover, C++ with templates is verbose, too verbose!

      The most readable language of the ones I know is, in my opinion, Haskell (the so-called ML syntax), and it is by no means verbose; the code is very short. For about a year I've had in mind the idea of writing a parser for a text editor that would translate parts of C, C++, C#, or whatever code on the fly into a Haskell-like look, i.e. the real code remains unchanged, it's just that the reader sees it that way (though the idea is nowhere near realization, for a number of different reasons).

      I'm not saying anything against Ada though, as I don't know this language.

      • #53
        Originally posted by Hi-Angel View Post
        Although I do agree that C++ becomes needlessly unreadable once templates come into play, I'd say that «verbosity» has little to do with code readability. Moreover, C++ with templates is verbose, too verbose!

        The most readable language of the ones I know is, in my opinion, Haskell (the so-called ML syntax), and it is by no means verbose; the code is very short. For about a year I've had in mind the idea of writing a parser for a text editor that would translate parts of C, C++, C#, or whatever code on the fly into a Haskell-like look, i.e. the real code remains unchanged, it's just that the reader sees it that way (though the idea is nowhere near realization, for a number of different reasons).

        I'm not saying anything against Ada though, as I don't know this language.
        In the case of Ada, it pushes "readability" to the point that, just by reading the code out loud, even a non-Ada developer can understand the algorithm.

        • #54
          Originally posted by atomsymbol View Post

          Immutable data structures are an efficient computational model of just a restricted subset of phenomena of the world.
          I sympathize with folks' potential frustration over the widespread misunderstanding of functional programming (primarily Haskell) and purity (though personally I find laziness more annoying). Still, I find the above tautological and perhaps a bit too reductionist: sure, efficient solutions are efficient solutions, but what is the class of problems that can be solved efficiently with persistent data structures? Prior to Okasaki's work I'm sure most of us thought far fewer problems could be tackled efficiently than actually can be.

          That being said, I take the sentiment. Working with the prototypical non-inductive data structures like graphs and matrices is a huge pain without mutation.
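
          For anyone wondering what a persistent data structure buys you, here's a minimal Rust sketch (names invented for the example): a cons list behind Rc, where prepending builds a new head that shares the old tail instead of copying or mutating it.

          Code:
              use std::rc::Rc;

              // A persistent singly linked list: prepending never touches the old list.
              enum List<T> {
                  Nil,
                  Cons(T, Rc<List<T>>),
              }

              fn prepend<T>(head: T, tail: &Rc<List<T>>) -> Rc<List<T>> {
                  // the old list is shared, not copied or mutated
                  Rc::new(List::Cons(head, Rc::clone(tail)))
              }

              fn main() {
                  let empty: Rc<List<i32>> = Rc::new(List::Nil);
                  let xs = prepend(1, &empty); // [1]
                  let ys = prepend(2, &xs);    // [2, 1], shares xs
                  let zs = prepend(3, &xs);    // [3, 1], also shares xs
                  // all three versions coexist; nothing was invalidated
                  let _ = (ys, zs);
              }

          Inductive types like lists and trees decompose into a new head plus a shared tail, which is exactly what graphs and matrices refuse to do, hence the pain.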
          Last edited by codensity; 06-14-2016, 05:08 AM. Reason: spelling

          • #55
            Originally posted by codensity View Post
            I sympathize with folks' potential frustration over the widespread misunderstanding of functional programming (primarily Haskell) and purity (though personally I find laziness more annoying). Still, I find the above tautological and perhaps a bit too reductionist: sure, efficient solutions are efficient solutions, but what is the class of problems that can be solved efficiently with persistent data structures? Prior to Okasaki's work I'm sure most of us thought far fewer problems could be tackled efficiently than actually can be.

            That being said, I take the sentiment. Working with the prototypical non-inductive data structures like graphs and matrices is a huge pain without mutation.
            In a sense, mutability is "native" to functional programming languages as well, because it is possible to implement a Turing machine in a functional programming language.

            A problem to solve is how to create a compiler that can recognize the presence of Turing machines in functional programs and thus generate efficient assembly code for such cases.
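
            As a toy illustration of the idea (in Rust rather than a functional language, with made-up names): state can be threaded through a pure step function instead of being mutated, and a compiler that notices each old state is dead after the next step is free to lower this to in-place updates.

            Code:
                // A tiny "machine": the step function is pure and never mutates its input.
                #[derive(Clone, Copy, Debug)]
                struct State {
                    counter: u64,
                }

                fn step(s: State) -> State {
                    // produces a brand-new state; the old one is untouched
                    State { counter: s.counter + 1 }
                }

                fn run(initial: State, steps: u32) -> State {
                    // functional-style iteration: fold the step function over the range
                    (0..steps).fold(initial, |s, _| step(s))
                }

                fn main() {
                    let final_state = run(State { counter: 0 }, 1_000_000);
                    println!("{:?}", final_state);
                }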

            • #56
              Originally posted by mmstick View Post

              C++ isn't good for anything. There's a good reason why everyone is moving away from C++ to other languages, and why both Google and Mozilla have tried to ditch C++ for Go and Rust respectively. C++ isn't good for software libraries either, because it has no stable exportable interface. That's why everyone writes their software libraries in C, and why every language implements a C ABI as its foreign interface of choice. Rust, likewise, can export libraries with a C interface, and many languages are now starting to take advantage of that, including Python and Ruby.
              I think this is too strongly worded. I really like Rust so far (I'm an avid FPer), but faulting C++ for its lack of library support on one hand and praising Rust's FFI on the other is inconsistent. Rust largely copied the C++ approach to exporting functions and data; after all, neither has a stable ABI (though unlike Rust, C++ has proposals).

              My biggest complaint with Rust right now is that library support (Rust to Rust) is bad and, as far as I can tell, not considered a problem (see https://github.com/rust-lang/rust/issues/16402 and the "non-goals" in https://internals.rust-lang.org/t/pe...-the-plan/2767). Fetching the world and statically linking everything is such a boring solution. Imagine if libc were written in a way that required the machine to be rebuilt on update.

              Cargo is a particularly badly designed, hyper-coupled tool that requires a Python script (which appears to reimplement much of Cargo: https://github.com/dhuseby/cargo-bootstrap) to bootstrap. As a packager myself I'm not quite sure how to cope with Rust software; some suggestions strike me as crazy, e.g.
              In Debian's packaging solution for Rust they are not planning on installing Rust libraries globally, only Rust applications, so they will not hit this problem. They will be packaging Rust libraries as source in a custom system location. When their packaged Rust applications are built, they will redirect Cargo to look in that location for the source code of their dependencies, which will then be built and statically linked into the application. But the libraries will never be installed anywhere that could cause such a conflict.
              Not even the Rust compiler itself can be built if it's already installed system-wide.

              In any case, if you want to look deeper into the packaging situation, at least there's work being done to address some of the issues (though personally it doesn't satisfy me): https://internals.rust-lang.org/t/pe...-the-plan/2767

              I understand that to get the really high-performance specialization seen in C++ header libraries you're effectively statically linking, but I feel like the C++ community at large still gives much thought to libraries and ABI (e.g. pimpl).
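
              For what it's worth, the nearest Rust equivalent to pimpl-style ABI hygiene today is the usual opaque-handle-over-a-C-ABI pattern; a rough sketch, with type and function names invented for the example:

              Code:
                  // The concrete type stays private to the library; callers only ever
                  // hold an opaque pointer, so the layout can change without breaking ABI.
                  pub struct Counter {
                      value: u64,
                  }

                  #[no_mangle]
                  pub extern "C" fn counter_new() -> *mut Counter {
                      Box::into_raw(Box::new(Counter { value: 0 }))
                  }

                  #[no_mangle]
                  pub extern "C" fn counter_increment(c: *mut Counter) -> u64 {
                      // the caller is trusted to pass a pointer obtained from counter_new
                      let c = unsafe { &mut *c };
                      c.value += 1;
                      c.value
                  }

                  #[no_mangle]
                  pub extern "C" fn counter_free(c: *mut Counter) {
                      if !c.is_null() {
                          unsafe { drop(Box::from_raw(c)) };
                      }
                  }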

              • #57
                Originally posted by atomsymbol View Post

                In a sense, mutability is "native" to functional programming languages as well, because it is possible to implement a Turing machine in a functional programming language.

                A problem to solve is how to create a compiler that can recognize the presence of Turing machines in functional programs and thus generate efficient assembly code for such cases.
                Sure, the notion of what it means to be purely functional is widely debated (hence Conal's often-cited description of C as purely functional). Sometimes mutability and pointers are valuable for expressing an algorithm, in which case having something functionally equivalent is fine; more often, though, I find they're valuable for making something efficient, and in that case I want a pointer into RAM. I agree that it would be nice if we didn't need ST in Haskell, or if we could somehow make all operations on immutable arrays efficient on von Neumann machines (perhaps something like a linearity analysis is a good start). If that were the case, however, could we not argue that immutable data structures are an efficient computational model of all phenomena of the world [with respect to mutable data structures]?
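
                To make the "pointer into RAM" point concrete, here's a purely illustrative Rust sketch of the two styles: the pure version copies the whole array to change one element, the in-place version touches one word, and the job of ST, or of a linearity analysis, is to let you write the former and pay for the latter.

                Code:
                    // Pure-style update: build a new array and leave the old one intact.
                    // Cost is O(n) for a single-element change.
                    fn set_pure(v: &[u64], i: usize, x: u64) -> Vec<u64> {
                        let mut out = v.to_vec(); // copies everything
                        out[i] = x;
                        out
                    }

                    // In-place update: one store through a pointer into RAM, O(1).
                    fn set_in_place(v: &mut [u64], i: usize, x: u64) {
                        v[i] = x;
                    }

                    fn main() {
                        let a = vec![0u64; 1024];
                        let b = set_pure(&a, 3, 42);  // a is still all zeros
                        let mut c = a.clone();
                        set_in_place(&mut c, 3, 42);  // c was modified directly
                        assert_eq!(b[3], c[3]);
                    }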

                • #58
                  Originally posted by liam View Post

                  This is a terrific account of possible reasons for the apparent success of Rust.
                  You mention the CoC (and, I'd add, its actual enforcement), but I think that in at least one case its attempt to fit everyone into a square slot (I realize civility shouldn't be viewed as such, but it's an imperfect world) led to the loss of a really brilliant developer/contributor. For people like that, imho, you should provide an alternate* slot.

                  *obviously, even for them, there are limits, but rudeness should be tolerated to the degree that the person contributes
                  Are you suggesting we grant safespaces to people opposed to safespaces?

                  • #59
                    Originally posted by codensity View Post
                    Sure, the notion of what it means to be purely functional is widely debated (hence Conal's often-cited description of C as purely functional). Sometimes mutability and pointers are valuable for expressing an algorithm, in which case having something functionally equivalent is fine; more often, though, I find they're valuable for making something efficient, and in that case I want a pointer into RAM. I agree that it would be nice if we didn't need ST in Haskell, or if we could somehow make all operations on immutable arrays efficient on von Neumann machines (perhaps something like a linearity analysis is a good start). If that were the case, however, could we not argue that immutable data structures are an efficient computational model of all phenomena of the world [with respect to mutable data structures]?
                    I think the answer depends on how long it would take for the compiler to convert the functional source code to imperative code. If that were negligible, say 10% of total compile time, then yes, we would consider immutable data to be as efficient as mutable data for representing world phenomena. But this is just speculation.

                    On Linux, most software isn't implemented in a functional programming language. One might argue that this proves it is generally harder for people to express an idea in the functional paradigm than to express an equivalent idea in the imperative paradigm. If there existed, for example, a major web browser with 90% of its code implemented in Haskell, then it would be quite easy to decide whether efficient functional programs take longer to write.

                    • #60
                      Originally posted by unixfan2001 View Post

                      Are you suggesting we grant safespaces to people opposed to safespaces?
                      In a limited sense, yes.
