
FLANG: NVIDIA Brings Fortran To LLVM


  • FLANG: NVIDIA Brings Fortran To LLVM

    Phoronix: FLANG: NVIDIA Brings Fortran To LLVM

    Flang is to Fortran as Clang is to C/C++...

    http://www.phoronix.com/scan.php?pag...-Fortran-Flang

  • #2
    Fortran? Really? The three people who still use that are enough to justify this?


    • #3
      As far as I'm aware, most people who still use Fortran are old farts who work in university physics departments (I've personally run into a few of them, and I hear there are plenty of them all over the world). Physics is one of the major users of high-performance compute, so it's only natural that this would be added at some point. Nvidia's version of GCC, the "imaginatively" named NVCC, has had support for Fortran for as long as I've known about it, so I'd say the only thing resembling a surprise here is that Nvidia would be supporting a competing compiler.
      "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."


      • #4
        Originally posted by droidhacker View Post
        Fortran? Really? The three people who still use that are enough to justify this?
        Kudos for the demonstration of ignorance.

        Fortran is still very big. Especially in areas of science where codebases are used that have been in continuous development since the 1960s.


        • #5
          Originally posted by droidhacker View Post
          Fortran? Really? The three people who still use that are enough to justify this?
          It seems that you don't realize how important this language is in HPC and science (earth science, climate, astronomy and many others). It was designed quite a long time ago, but like the wheels on your car it's damn good at what it is used for: computation. That's why Intel, AMD and Nvidia are spending resources on this language.

          Back to the news: this is a good step from Nvidia. PGI's compilers are pretty messy, unlike Intel's.


          • #6
            Originally posted by droidhacker View Post
            Fortran? Really? The three people who still use that are enough to justify this?
            Fortran is extremely common in HPC. The language regularly receives updates, and for array programming is one of the very best options available. Intel, IBM, and Nvidia all continue to put significant resources into their fast Fortran compilers.


            • #7
              Having worked with some "old farts" at universities, the only real argument FOR Fortran that I have come across is the usual one for any legacy product: they don't have the time or motivation to port their code to C (or something else). The argument that Fortran is inherently better for this kind of scientific application doesn't really convince me, as I have seen no real example that demonstrates it.
              It is nice to see any kind of open-source effort on this front. But in the long run, I think it would be better if the applications were ported to a language that doesn't discourage new developers from contributing.


              • #8
                Originally posted by droidhacker View Post
                Fortran? Really? The three people who still use that are enough to justify this?
                Such deep ignorance is only paralleled by Pawlerson.


                • #9
                  I once worked (2012-2014) on a military project that used Fortran heavily. Some of the code dated back to the '80s, but it's still maintained today. We had ~50 people working with the code, and they were of all ages (20s-60s).


                  • #10
                    A more concrete example of Fortran still being useful is Python's scipy.integrate.odeint, a wrapper around the Fortran-written ODEPACK library. If you've ever needed to solve an ODE in Python, you've probably used it.
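
                    For anyone curious, a minimal sketch of what that looks like (assuming NumPy and SciPy are installed; the decay example is mine, not from the thread). The Python callback is handed down to ODEPACK's Fortran LSODA solver under the hood:

```python
import numpy as np
from scipy.integrate import odeint

def decay(y, t):
    # Right-hand side of dy/dt = -y; evaluated by the Fortran solver.
    return -y

t = np.linspace(0.0, 5.0, 50)      # time grid to report the solution on
sol = odeint(decay, 1.0, t)        # y(0) = 1.0; returns shape (50, 1)
# sol[i, 0] approximates the exact solution exp(-t[i])
```

                    All the heavy lifting (step-size control, stiffness switching) happens inside the ODEPACK Fortran code; Python only supplies the right-hand-side function.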
