Rust GCC Code Generator "rustc_codegen_gcc" Can Now Bootstrap Rustc


  • Originally posted by arQon View Post
    If you want everybody *else* to change their code to produce Rust/etc headers *for you*, though, that takes you squarely back to the impractical and untenable position of demanding that other people do work on your behalf. There is no sane world in which that works, so your next best option is to demand that other people do *less* work on your behalf by generating this XML (or whatever) mess instead, which in turn demands that GCC do more work on your behalf by adding support for XML headers rather than C ones, and I don't see that as a viable plan either, so...
    It was never about "everyone else" producing "Rust headers". It's about "now that a language other than C or C++ is getting more uptake in the same niche, people are starting to realize what a pain it is to point at all the quirks of whatever the platform's default C compiler is and say 'C headers are the IDL' and 'The ABI is whatever the C compiler does' when you have two or more languages that aren't C that want to communicate with each other".

    It's not a "do work for me". Gankra's post is more of a "Hey, I'm doing this work and I'm discovering how deep the rabbit hole goes (eg. the part of the post about writing abi-checker to find mistakes in rustc and discovering that things like GCC and LLVM don't match in places like 128-bit integers where they say they should) and, at the same time, this C guy is making worrying noises about 'To save C, we must change ABI things like the size of intmax_t' . Is there anyone else out there in a similar position who wants to collaborate with me on an interoperable alternative with compliance-oriented features that have been planned rather than grown?"
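
    (For concreteness, here is a minimal Rust sketch of the kind of cross-language signature an ABI checker has to exercise; the function name and struct are hypothetical, not from Gankra's post. Depending on the toolchain version, rustc's `improper_ctypes` lint may flag the `i128` as not FFI-safe.)

```rust
// Hypothetical declarations illustrating where two compilers must agree:
// struct layout, and how a 128-bit integer is passed and returned.
#[repr(C)]
pub struct Pair {
    pub lo: u64,
    pub hi: u64,
}

extern "C" {
    // Whether `i128` goes in registers or indirectly, and with what
    // alignment, is exactly the kind of detail the post says GCC and
    // LLVM were found to disagree on.
    fn consume_wide(x: i128, p: Pair) -> i128;
}
```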



    • Originally posted by NobodyXu View Post
      Yes, that's true, but sometimes it's still reasonable to use another PL.


      GCC is not one programming language. Under GCC's design, C and C++ are the languages you are mandated to be able to bootstrap with. It has long been allowed to build libraries and other things in another programming language once you have reached the bootstrap level. So GCC with an mrustc-level compiler could then go on to build something like a Rust plugin to add more functionality.

      Originally posted by NobodyXu View Post
      That depends on what they are sharing, usually they are vastly different.

      If what they are sharing is basic types in a compiler, then it should be put in a shared library, like libLLVM.so
      If what they are sharing is some parallel programming utilities, then that should be put in another library.

      Not everything should be put into one (static/shared) library, code with different purposes should be built into separate libraries then link together, with static or dynamic linking.
      GCC is already a collection of libraries.

      Originally posted by NobodyXu View Post
      Thanks for pointing that out, but my point stands: C/C++ macros are vastly different from Rust's sanitized `macro_rules!`.
      This difference is not as clear as it first seems.
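
      (For readers who haven't used both macro systems, a minimal sketch of the surface-level difference under discussion; the examples are hypothetical, not from either post.)

```rust
// C does textual substitution:   #define DOUBLE(x) x + x
// so DOUBLE(a) * 2 expands to    a + a * 2, and precedence leaks in.
//
// Rust's macro_rules! matches token trees and expands to a single
// expression node, so precedence does not leak:
macro_rules! double {
    ($x:expr) => {
        $x + $x
    };
}

fn main() {
    let a = 3;
    // The expansion behaves like (a + a) * 2, not a + a * 2.
    assert_eq!(double!(a) * 2, 12);
}
```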

      Originally posted by NobodyXu View Post
      Yes, it's not a new thing, but it does not change anything.
      C++ uses a duck-typed template system (even though they have now added concept checking to improve error messages) and it also has an OOP inheritance model.
      Rust unifies both and has a single trait system.

      This is not something new, but Rust is still different from C++, thus their internal logic is vastly different.
      Except there are a few problems here in your logic. You are only looking at the C/C++ standards. You have not looked at static analysis tools for C and C++.
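
      (To make the "single trait system" point in the quote concrete, here is a minimal Rust sketch with made-up types: one trait covers both the compile-time role of C++ templates/concepts and the runtime role of virtual inheritance.)

```rust
// One trait serves both roles that C++ splits between templates/concepts
// and virtual base classes. The trait and types are hypothetical.
trait Area {
    fn area(&self) -> f64;
}

struct Square(f64);
struct Circle(f64);

impl Area for Square {
    fn area(&self) -> f64 { self.0 * self.0 }
}

impl Area for Circle {
    fn area(&self) -> f64 { std::f64::consts::PI * self.0 * self.0 }
}

// Static dispatch: monomorphized like a template, bound-checked like a concept.
fn total_static<T: Area>(items: &[T]) -> f64 {
    items.iter().map(Area::area).sum()
}

// Dynamic dispatch: vtable calls, like going through a virtual base class.
fn total_dyn(items: &[&dyn Area]) -> f64 {
    items.iter().map(|i| i.area()).sum()
}

fn main() {
    let squares = [Square(1.0), Square(2.0)];
    let s = Square(2.0);
    let c = Circle(1.0);
    let mixed: Vec<&dyn Area> = vec![&s, &c];
    println!("{} {}", total_static(&squares), total_dyn(&mixed));
}
```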

      Originally posted by NobodyXu View Post
      They are indeed incompatible.

      C++ is using duck typing + OOP; they cannot just switch to a trait-based system without breaking all existing code.
      https://en.wikipedia.org/wiki/Sparse Let's say you run the Linux kernel's sparse against a normal C program: it is going to error out because the sparse annotations are missing.
      Static analysis tools around C and C++ are allowed to break existing C or C++ programs. Different static analysis tools for C and C++ already add trait systems very much like the Rust one. Yes, checking something like Rust traits on a C or C++ program would have to be behind an optional flag, because programs without the extra __attribute__() items in the code base would be required to error out.

      Something not normally considered is how Linux sparse and other static analysis tools are able to add extra metadata and still build with a normal C/C++ compiler. It's simple: you can add extra metadata/trait information that will be ignored if the compiler or static analysis tool does not support it.

      Of course there is a fun, horrible issue here: different static analysis tools for C and C++ add their own trait system to C and C++, with no uniform standard for it. Yes, Rust could have decided to have a means to put Rust trait information into C/C++ source and header files.


      Originally posted by NobodyXu View Post
      Turns out that gfortran and gcc are two separate programs.


      gfortran does appear as its own program on disk, but it comes from the GCC source tree and uses the GCC core libraries. It's part of GCC's monolithic development: when they build and run the automated testsuite, what gfortran produces is checked to make sure it matches up.

      Originally posted by NobodyXu View Post
      Even if they are written in different PLs, they can share the same library, by providing a set of C/C++ APIs.
      There is a problem here. With Rust's current way of interfacing with C/C++, the interface comes through as unsafe. The result is that if you want the core of your program to be safe Rust, you have to have an extra layer of wrappers around the C/C++ to declare the extra information needed to end up with a safe interface.

      Writing in different programming languages and using current C/C++ as-is results in problems where you are either not using the other language correctly or are having to maintain wrapper code. Of course, C and C++ static analysis tools normally don't run through the other programming languages.

      Providing a set of C/C++ APIs leads you back to the FFI problem, which is not solved correctly.
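
      (A minimal sketch of the wrapper layer being described, with a hypothetical C function; the slice type on the Rust side carries the pointer/length invariant that the raw C declaration cannot express.)

```rust
// Hypothetical C API:  size_t buf_fill(uint8_t *dst, size_t len);
extern "C" {
    fn buf_fill(dst: *mut u8, len: usize) -> usize;
}

/// Safe wrapper: callers of `fill` stay in safe Rust; the unsafe
/// knowledge about the C side is concentrated here.
pub fn fill(buf: &mut [u8]) -> usize {
    // SAFETY: `buf` is valid for writes of `buf.len()` bytes for the
    // duration of the call, which is what buf_fill is assumed to need.
    unsafe { buf_fill(buf.as_mut_ptr(), buf.len()) }
}
```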

      Originally posted by NobodyXu View Post
      What does that have to do with de-coupling and modularity?
      You can still share the function/code while being modular and de-coupled.

      Besides, not everything can be shared.
      GCC is modular, but its development is not decoupled. Decoupled development has its fair share of issues around sharing functional code. The problem with decoupled development and shared code is what happens when a shared function changes. Also, how does a developer know how a function is being used? The monolithic/coupled route has some advantages here. Working out what code should be shared is also simpler if you don't decouple.

      You are correct that not everything should be shared. The problem here is that you do want to share as much as you can and as much as you should.

      The reality here is that most languages that have appeared attempting to be alternatives to C and C++ make the same mistake of treating the boundary between C/C++ and their own language as a solid divide. They don't see that, due to the C and C++ language rules, nothing forbids putting extra information on the C/C++ side in the C/C++ headers and source files, and, even more, nothing forbids the new language from applying its own static analysis to C or C++.

      Think about this: there is no reason why there could not be a safe C for Rust programs. Yes, a safe C mandating extra metadata added to the C source files.

      Rust has been taking a decoupled approach to its interactions with C/C++. If Rust development instead took the coupled route, where Rust had a safe C/C++ that had to carry the extra trait data so that Rust's static analysis could be applied to it, that could possibly allow getting a program with more of its code equal to safe Rust code without recoding as much.

      There is a serious need to look at whether this area of development should be coupled. The reality is that decoupled and coupled development both have their advantages. One of the problems with totally decoupled development is that it becomes really easy to miss where you should be sharing code/coupling up. It's also harder to see, in totally decoupled development, where the shades of grey should be. Putting Rust trait information into C/C++ source and header files would be a shade-of-grey area, where you are doing something that is not 100 percent decoupled or coupled.



      • Time is getting tighter by the day. I'm sure you'll appreciate the resultant brevity, but it does mean I won't have time to get into the more interesting side of things.

        Originally posted by ssokolow View Post
        It's about "'The ABI is whatever the C compiler does' when you have two or more languages that aren't C that want to communicate with each other".
        Well, despite having finally fought my way past the nomenclature problem, I *still* don't really see this as one, and that's without even making jokes about how you'd need a second non-C language that actually matters in the first place. :P

        Seriously though, this entire premise is simply not even "real". There have already been periods where numerous languages all held "first class citizen" positions: there was a time when BASIC and Pascal were in far more common use than C was, and BASIC even remained so for 20 years after Windows took over the world.
        I understand education can't make up for experience, but - like VMs, containers, coroutines, and far too many other things - the ignorance of anything that happened more than 10 years ago in this industry is just stunning at times. No other field consistently discards knowledge like this one does, only to proudly "invent" it again decades later. But I digress.

        "Oh no, we have to parse C headers too" isn't outright a pretense, but yeah: you do. Because every kernel in the world is written in C/++, and 99.999% of libraries in the world are also written in C/++. Calling out the 0.001% edge case as the main problem here, is entirely backwards.

        That aside, there's still nothing stopping anyone from interfacing ("communicating" is absolutely the wrong term to use in this context) any two languages that they're capable of writing compilers for in any way they see fit - other than the sort of anxiety-driven paralysis that people tend to trap themselves in with infinite "but what if..." questions whenever they try to do something new. Ask for advice, no question. Ask for collaborators, to further advise and lighten the implementation load. Look around, and do some research - but there comes a point where the only way forward is to actually *do* something. Even if you get it wrong the first time, which you probably will.

        Short term, the best approach is probably something along the lines of extern __gcc12 etc. If the GCC crew are already hostile to the idea, or at best apathetic, that protects you against *source*-level breakage further down the road, which is far more important than ABI breakage is.

        The only real surprise in any of this is that it's arguably one of the few cases where XML actually *is* pretty close to being the right tool for the job.



        • Originally posted by arQon View Post
          "Oh no, we have to parse C headers too" isn't outright a pretense, but yeah: you do. Because every kernel in the world is written in C/++, and 99.999% of libraries in the world are also written in C/++. Calling out the 0.001% edge case as the main problem here, is entirely backwards.


          "Every" is a dangerous word. There are OS kernels not written in C or C++. Those that are still growing market share do have C interface support of some form. Yes, Redox's kernel is written in Rust, there are others written 100 percent in assembly, and of course Cosmos is in a form of .NET.


          Originally posted by arQon View Post
          Seriously though, this entire premise is simply not even "real". There have already been periods where numerous languages all held "first class citizen" positions: there was a time when BASIC and Pascal were in far more common use than C was, and BASIC even remained so for 20 years after Windows took over the world.
          There is marketing and reality.


          Only 3 languages have ever held the number 1 position on the TIOBE index: C, Java and Python. BASIC at its peak got to number 3 and is currently at 6 in its Visual Basic form. Pascal is another one that peaked at 3 but has basically fallen into oblivion in usage.

          The reality here is that BASIC and Pascal have never been more common in usage than C. BASIC has never got above C++ in usage either.

          Python and Java do have claims to being more common than C at some points in their history. C is at its lowest point in almost its complete history now. Of course, I don't see anyone writing operating systems in Python any time soon.



          • Originally posted by oiaohm View Post
            There is marketing and reality.

            [...]
            I think they were referring to how, in the 8-bit micro era, the BASIC line editor was the OS and, on early Macintoshes, Pascal served the role C does on UNIX.



            • Originally posted by ssokolow View Post
              I think they were referring to how, in the 8-bit micro era, the BASIC line editor was the OS and, on early Macintoshes, Pascal served the role C does on UNIX.
              The TIOBE index covers the time frame of the 8-bit micro era and the early Macintoshes.


              C was around on the 8-bit micros a little bit more than most think; Data Becker, as noted here, on the C64. There are a lot of old games for the 8-bit micros that turn out to be coded in C.
              https://macgui.com/news/article.php?t=537 On the early Macintoshes, C was there very early as well.

              In that 1980s time frame, BASIC and Pascal were in people's faces a lot. Yes, the 1980s into the early 1990s is when Pascal and BASIC had their highest market shares, and a lot of people incorrectly presume that in that time frame Pascal and BASIC got higher than C in market share; in reality that never happened.

              It's a surprise to a lot of people that, for how well known BASIC and Pascal were at their peak usage, neither ended up more used than C. Even more scary: if you add BASIC's and Pascal's peak market shares up as a single number in that time frame, that single number still does not pass C's market share in the 1980s/1990s.

              ssokolow, this is one of the fun things about the history. It is the computer world's equivalent of the silent majority. Basically, C was a very dominant programming language with very little advertisement. Yes, a large number of applications for the 8-bit micros, Macintosh computers and others were written in C, so we have a true case of a silent majority, and it is documented in history by the numbers TIOBE has collected and by people who have gone through documenting things like which programming language all the archived C64 programs were written in, and so on.

              C has been a very dominantly used programming language for a very long time. A lot of the force behind C, even in the 1980s, was existing code bases that developers could use to avoid having to redo everything. Notice that Java and Python, which have both been able to beat C at different times, have had the same advantage.


              • Originally posted by oiaohm View Post
                The TIOBE index covers the time frame of the 8-bit micro era and the early Macintoshes.


                [...]
                I never intended to make an argument about BASIC and Pascal outnumbering C... just an attempt to clarify what arQon probably meant.

                It's a fact that, in the early days of the mac, Pascal served the role C does on POSIX... there just weren't very many macs compared to other mainstream platforms due to the market segment they priced themselves into.



                • Originally posted by ssokolow View Post

                  I never intended to make an argument about BASIC and Pascal outnumbering C... just an attempt to clarify what arQon probably meant.

                  It's a fact that, in the early days of the mac, Pascal served the role C does on POSIX... there just weren't very many macs compared to other mainstream platforms due to the market segment they priced themselves into.


                  ssokolow, in the early days of the Macintosh, C was there. Yes, that write-up details the history. The early C on Macs had to be Pascal compatible because Pascal was the platform language in quite a few places. In the Apple Macintosh's development stage, as in 1983, before it was released to the public, Apple's third-party developers had already started making C compilers. So by the January 1984 release of the Apple Macintosh there were already C compilers on the Apple Macintosh.

                  Even on the Apple Macintosh, Pascal usage did not end up outnumbering C usage.

                  C compilers had appeared on the Apple II, yes, in 1977; that is 2 years before Apple Pascal in 1979. C got a 2-year head start over Pascal on the Apple platforms. This results in the existing-code-base problem.

                  There is only one Apple computer that did not have C as a development option at its release: the Apple I. Yes, the Apple I was assembly language at release, not BASIC, and you had to assemble the machine yourself.

                  The reality here is that even if you restrict yourself to only early Apple Macintosh software, C is still more dominantly used than Pascal. Interesting point: the C standard today contains features that were required for Pascal compatibility on the Apple Macintosh.



                  • Originally posted by oiaohm View Post
                    The early C on macs had to be Pascal compatible because Pascal was the platform language in quite a few places.
                    And there you have it. You just confirmed that you acknowledged the truth of the point I was making.



                    • Originally posted by ssokolow View Post
                      just an attempt to clarify what arQon probably meant.
                      I didn't realize "first class citizen" needed clarifying at all - and it didn't, until oiaohm did what oiaohm does, which is to take a piece of someone's comment and twist it into something that was never said so that they can argue with themselves about how the newly-fabricated statement was wrong. I generally find it's best to just leave them to it.

