FLANG: NVIDIA Brings Fortran To LLVM
Originally posted by toguro123
Can you write it in C++ (and extra libraries) to prove your claim?
I can write the wrapper/helper libraries to avoid the issues you mentioned; anyone with basic programming knowledge can. I'm not good enough to write high-performance math libraries, or anything high-performance in general.
But you didn't ask for high performance, so I think my claim is proven.
EDIT: NVIDIA's revenue is around 1-2 billion dollars per quarter (https://ycharts.com/companies/NVDA/revenues), so I'm pretty confident that 55k dollars is within rounding error for them.
Last edited by starshipeleven; 18 May 2017, 07:51 PM.
Originally posted by droidhacker
Fortran? Really? The three people who still use that are enough to justify this?
Originally posted by lucasbekker
Having worked with some "old farts" at universities, the only real argument FOR Fortran that I have come across is the usual one for any legacy product: they don't have the time or motivation to port their code to C (or something else). The argument that Fortran is inherently better for these kinds of scientific applications doesn't really convince me, as I have seen no real example that demonstrates it.
It is nice to see any kind of open-source effort on this front. But in the long run, I think it would be better if the applications were ported to a language that doesn't discourage new developers from contributing.
Originally posted by lucasbekker
Having worked with some "old farts" at universities, the only real argument FOR Fortran that I have come across is the usual one for any legacy product: they don't have the time or motivation to port their code to C (or something else). The argument that Fortran is inherently better for these kinds of scientific applications doesn't really convince me, as I have seen no real example that demonstrates it.
It is nice to see any kind of open-source effort on this front. But in the long run, I think it would be better if the applications were ported to a language that doesn't discourage new developers from contributing.
It's not all perfect, of course. That very lack of pointers means that more sophisticated data structures (and the algorithms that use them) are hard or impossible to implement. But that doesn't change the fact that, for the class of computations where it works well, it DOES work well.
Yes, sure, in theory one could write C or C++ disciplined enough to avoid these aliasing problems. But most programmers are idiots and, given facilities that can be abused, they will abuse them (and congratulate themselves on how smart they are for outwitting the language and the compiler...). We have endless historical evidence to this effect. Hell, read the endless wailing any time GCC or LLVM tries to tighten up some technical aspect of the language, only to learn that
- apparently some vital piece of Linux misuses this technicality and would break if the compiler actually enforced the rules, and
- no one is interested in actually fixing that broken code.