FLANG: NVIDIA Brings Fortran To LLVM


  • #21
    Originally posted by speculatrix View Post
    I think Nvidia are trying to win the $55,000 prize in the NASA Fun3D challenge:

    https://www.nasa.gov/aero/nasa-issue...rcomputer-code
    If they do it, they are doing it for prestige alone. 55k dollars is a rounding error in NVIDIA's revenue.

    Comment


    • #22
      Originally posted by starshipeleven View Post
      If they do it, they are doing it for prestige alone. 55k dollars is a rounding error in NVIDIA's revenue.
      Can you write a C++ (and extra libraries) code to prove your revenue claim?
      Last edited by toguro123; 18 May 2017, 07:44 PM.

      Comment


      • #23
        Originally posted by toguro123 View Post
        Can you write a C++ (and extra libraries) to prove your claim?
        Assuming you wanted to quote the other post, since as it stands here it makes no sense.

        I can write the wrapper/helper libraries to avoid the issues you mentioned, anyone with basic programming knowledge can. I'm not good enough to write high-performance math libraries, or anything high-performance in general.

        But you didn't ask for high performance, so I think my claim is proven.


        EDIT: NVIDIA's revenue is around 1-2 billion dollars per quarter https://ycharts.com/companies/NVDA/revenues so I'm pretty confident that 55k dollars is within rounding error for them.
        Last edited by starshipeleven; 18 May 2017, 07:51 PM.

        Comment


        • #24
          Originally posted by starshipeleven View Post
          Assuming you wanted to quote the other post as here it makes no sense.

          I can write the wrapper/helper libraries to avoid the issues you mentioned, anyone with basic programming knowledge can. I'm not good enough to write high-performance math libraries, or anything high-performance in general.

          But you didn't ask for high performance, so I think my claim is proven.


          EDIT: NVIDIA's revenue is around 1-2 billion dollars per quarter https://ycharts.com/companies/NVDA/revenues so I'm pretty confident that 55k dollars is within rounding error for them.
          Just messing with you. Chill.

          Comment


          • #25
            Originally posted by droidhacker View Post
            Fortran? Really? The three people who still use that are enough to justify this?
            Fortran is still unchallenged as far as runtime performance goes. Also, modern Fortran is not your old goto-prone Fortran 77; it's a nice, readable, feature-rich language.

            Comment


            • #26
              Originally posted by lucasbekker View Post
              Having worked with some "old farts" at universities, the only real argument FOR Fortran that I have come across is the usual one for any legacy product: they don't have the time or motivation to port their code to C (or something else). The argument that Fortran is inherently better for these kinds of scientific applications doesn't really convince me, as I have seen no real example that demonstrates it.
              It is nice to see any kind of open-source effort on this front. But in the long run, I think it would be better if the applications were ported to a language that doesn't discourage new developers from contributing.
              And why should new developers feel discouraged by Fortran? My impression is that Fortran is considered obsolete by older developers who don't know it and don't want to learn it. M.Sc. and Ph.D. students, who need it simply as a tool for their work, learn it as they would any other programming language, if necessary.

              Comment


              • #27
                Originally posted by lucasbekker View Post
                Having worked with some "old farts" at universities, the only real argument FOR Fortran that I have come across is the usual one for any legacy product: they don't have the time or motivation to port their code to C (or something else). The argument that Fortran is inherently better for these kinds of scientific applications doesn't really convince me, as I have seen no real example that demonstrates it.
                It is nice to see any kind of open-source effort on this front. But in the long run, I think it would be better if the applications were ported to a language that doesn't discourage new developers from contributing.
                The advantages of FORTRAN for optimization are obvious. The language does not expose pointers, and so does not allow the sorts of memory aliasing that bedevil C optimization. Or, to put it differently, it's a lot easier to understand the consequences of deeply nested loops and thus restructure them to optimize for cache locality, or to split them across different CPUs.
                It's not all perfect, of course. That very lack of pointers means that more sophisticated data structures (and the algorithms that use them) are difficult or impossible to implement. But that doesn't change the fact that, for the class of computations where it works well, it DOES work well.

                Yes, sure, in theory one could write C or C++ so disciplined that it doesn't have these aliasing problems. But most programmers are idiots and, given facilities that can be abused, they will abuse them (and congratulate themselves on how smart they are for outwitting the language and the compiler...). We have endless historical evidence to this effect. Hell, read the endless wailing any time GCC or LLVM tries to tighten up some technical aspect of the language, only to learn that
                - apparently some vital piece of Linux misuses this technicality and would break if the compiler actually enforced the rules, and
                - no one is interested in actually fixing that broken code.

                Comment
