NVIDIA & Co Continue Working On LLVM Fortran "Flang" Compiler



    Phoronix: NVIDIA & Co Continue Working On LLVM Fortran "Flang" Compiler

    Earlier this year NVIDIA posted their work on "Flang", an LLVM-based Fortran compiler, to GitHub, and now they have issued a formal announcement and status update...

    http://www.phoronix.com/scan.php?pag...-Update-Issued

  • #2
    Good to hear. Maybe they'll implement the aliasing optimizations that LLVM core currently lacks, which would benefit both Fortran and Rust but not C/C++.
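    To illustrate the aliasing point: Fortran forbids dummy arguments from aliasing each other, so a Fortran compiler can optimize such loops aggressively by default, whereas C only gets the same guarantee when the programmer spells it out with `restrict`. A minimal sketch (hypothetical function names, any C99 compiler):

```c
/* Without restrict, a C compiler must assume a and b might overlap,
   which blocks some vectorization and load/store reordering. */
void scale_may_alias(float *a, const float *b, int n) {
    for (int i = 0; i < n; i++)
        a[i] = 2.0f * b[i];
}

/* With restrict, the caller promises the arrays do not overlap:
   the guarantee Fortran dummy array arguments carry by default. */
void scale_no_alias(float *restrict a, const float *restrict b, int n) {
    for (int i = 0; i < n; i++)
        a[i] = 2.0f * b[i];
}
```

    Both loops compute the same result; the difference is only in the freedom the optimizer has. Better alias metadata in LLVM core would hand that freedom to Fortran (and Rust) frontends without the source-level annotations C needs.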



    • #3
      Is there really that much demand for Fortran in the LLVM world?

      I ask in all seriousness because generally the LLVM community is forward looking.



      • #4
        Originally posted by wizard69 View Post
        Is there really that much demand for Fortran in the LLVM world?

        I ask in all seriousness because generally the LLVM community is forward looking.
        The latest C and C++ versions are nothing but small evolutionary steps for the BCPL class of languages. It really starts to annoy me how people refuse to see the truth behind the names. The first version of C was very similar to B or BCPL: it didn't even enforce type signatures for function arguments, and it wasn't 64-bit. It had hardly anything a modern C11 coder takes for granted. Likewise, the latest Fortran is nowhere near FORTRAN (1957). Did you know Fortran has object-oriented features, parallel programming features, generics, etc.? How is it less evolved than C, which has none of these?



        • #5
          Originally posted by wizard69 View Post
          Is there really that much demand for Fortran in the LLVM world?

          I ask in all seriousness because generally the LLVM community is forward looking.
          Fortran is still big in number crunching; it's not going anywhere. You can be both forward looking and Fortran supporting, since not everything that's old is also obsolete.
          See: https://arstechnica.com/science/2014...950s-behemoth/



          • #6
            Originally posted by bug77 View Post

            Fortran is still big in number crunching, it's not going anywhere. You can be both forward looking and Fortran supporting, not everything that's old is also obsolete.
            See: https://arstechnica.com/science/2014...950s-behemoth/
            Apparently we should also abandon mathematics and electronics as the basis of computer science, since those sciences are so old. I wonder why everyone on this forum discusses things in English when Lojban is a much more recent attempt at language design?



            • #7
              Originally posted by caligula View Post

              Apparently we should also abandon mathematics and electronics as the basis of computer science, since those sciences are so old. I wonder why everyone on this forum discusses things in English when Lojban is a much more recent attempt at language design?
              Well, maths is rather cute, I wouldn't throw it away.
              What needs to be done is do away with theorems that are over 100 years old and make room for innovation. Can you say Facebook/Instagram Maths?



              • #8
                Originally posted by wizard69 View Post
                Is there really that much demand for Fortran in the LLVM world?

                I ask in all seriousness because generally the LLVM community is forward looking.
                Fortran is mostly aimed at running complex calculations, and in that role it easily pwns C and C++. It's kind of a niche language, but it's not really endangered or obsolete.



                • #9
                  Originally posted by wizard69 View Post
                  Is there really that much demand for Fortran in the LLVM world?

                  I ask in all seriousness because generally the LLVM community is forward looking.
                  Fortran is an interesting language for number crunching (as others have mentioned), but for NVIDIA this is about being able to compile Fortran to run on their GPUs. Imagine Fortran code designed to run on a massive cluster of multi-core, multi-socket machines instead running on a single box with four high-end GeForce cards (which would probably be massively more parallel than that room full of racks). I think you can understand the appeal to scientists of not having to rewrite code, just recompile it, to run on dramatically smaller and cheaper hardware.



                  • #10
                    Originally posted by bug77 View Post

                    Well, maths is rather cute, I wouldn't throw it away.
                    What needs to be done is do away with theorems that are over 100 years old and make room for innovation. Can you say Facebook/Instagram Maths?
                    Innovation doesn't come out of nowhere. I don't know what you mean by "say something maths", but Facebook/Instagram relies heavily on maths.
