AMD Developing Next-Gen Fortran Compiler Based On Flang, Optimized For AMD GPUs


  • Old Grouch
    replied
    Originally posted by rickst29 View Post

    We agree that Python, run as interpreted code and without support libraries (such as NumPy), will be vastly slower at number crunching. But 'HPC' projects should always be built to use those libraries, and they should also be compiled after debugging. I'll stand by my guess that Python will become more widely used over time.
    Python as 'glue' holding together the libraries written in Fortran is what happens now. It appears to be a reasonably good model. Python obviously won't replace compiled Fortran, but allows simple 'front-end' application development. As I said earlier, the value in Fortran is the battle-tested libraries for things like numerical simulations. There's little point in re-inventing a perfectly good wheel in a new language, badly - unless the new language offers significant benefits over and above Fortran for the things that Fortran is already very, very good at.
    I'm unlikely to write a GUI in Fortran, but I do know of business applications written in Fortran calling screen-handling (80x24, not bitmapped) libraries that have been in use in industry for over 30 years. I would not recommend that approach now, but it was, at the very least, workable. Oddly enough, several attempts were made to replace those applications with more 'modern' approaches, all of which failed because they required far more machine resources to achieve the same tasks.

    I certainly don't think that Fortran is the best tool for all programming jobs. Far from it. But in cases where Fortran is doing what it is good at (outlined in other people's postings above) it is hard to beat. There is a huge amount of work that has gone into the scientific and technical libraries, and replicating even part of that legacy would take a lot of time and money, so it is difficult to make a case for replacing them that generates the necessary payback. It is interesting to speculate on what might offer such a payback.



  • rickst29
    replied
    Originally posted by Svyatko View Post
    AMD's repo: [embedded repository link]
    [Embedded Stack Overflow question: "I get a 512^3 array representing a Temperature distribution from a simulation (written in Fortran). The array is stored in a binary file that's about 1/2G in size. I need to know the minimum, maxim..."]
    Pure Python can be 10x-100x slower than Fortran/C/C++.
    We agree that Python, run as interpreted code and without support libraries (such as NumPy), will be vastly slower at number crunching. But 'HPC' projects should always be built to use those libraries, and they should also be compiled after debugging. I'll stand by my guess that Python will become more widely used over time.



  • Svyatko
    replied
    AMD's repo: [embedded repository link]

    Originally posted by rickst29 View Post

    Yes! Python (compiled) can be pretty fast when the mathematical work is done within well-optimized libraries (such as numpy).
    [Embedded Stack Overflow question: "I get a 512^3 array representing a Temperature distribution from a simulation (written in Fortran). The array is stored in a binary file that's about 1/2G in size. I need to know the minimum, maxim..."]

    NumPy is faster there because you wrote much more efficient code in Python (and much of the NumPy backend is written in optimized Fortran and C) and terribly inefficient code in Fortran.

    Pure Python can be 10x-100x slower than Fortran/C/C++.
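    For reference, the whole job in that linked question fits in a few lines of straightforward Fortran. A minimal sketch, where the file name, raw-stream layout and single-precision kind are assumptions rather than details taken from the question:

        program temp_stats
           implicit none
           integer, parameter :: n = 512
           ! Single precision assumed: 512**3 values * 4 bytes is ~0.5 GB,
           ! matching the file size mentioned in the question.
           real, allocatable :: t(:,:,:)
           integer :: ios

           allocate(t(n,n,n))
           ! 'temperature.bin' and the unformatted stream layout are assumptions.
           open(10, file='temperature.bin', access='stream', &
                form='unformatted', status='old', iostat=ios)
           if (ios /= 0) stop 'cannot open file'
           read(10) t
           close(10)

           print *, 'min  =', minval(t)
           print *, 'max  =', maxval(t)
           print *, 'mean =', sum(t) / size(t)
        end program temp_stats

    Written this way, the intrinsics do the work, and the comparison with NumPy comes down to which runtime reads and scans half a gigabyte faster.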



  • Svyatko
    replied
    Originally posted by cl333r View Post

    But everything that has a beginning has also an end. Isn't Fortran slowly fading away? Are these corporations and scientists planning to run old Fortran code for the next 150 years?
    Fortran holds #8 in the TIOBE Index for November 2024: [embedded link]

    A great rise over the last two years.
    Take care of your future - learn Fortran!
    Last edited by Svyatko; 16 November 2024, 02:54 PM.



  • rickst29
    replied
    Python (compiled) can be pretty fast when the mathematical work is done within well-optimized libraries (such as numpy).

    But Fortran has a few inherent "advantages" in working with large arrays. In applications such as weather forecasting, these are typically multi-dimensional arrays. Fortran, by default, assumes those arrays do not alias each other, and the syntax supports very complex math involving multiple arrays at once (e.g. e = a/b + c/d, where ALL of the names refer to N-dimensional arrays). A smart compiler, typically used with a few "hints" provided by the programmer in specially formatted comments, can utilize all kinds of hardware features for super-fast pipelining of complex math on large arrays.
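    A minimal sketch of that style, with names and shapes invented for illustration:

        module array_math
        contains
           subroutine combine(a, b, c, d, e)
              real, intent(in)  :: a(:,:,:), b(:,:,:), c(:,:,:), d(:,:,:)
              real, intent(out) :: e(:,:,:)
              ! One whole-array statement, applied element-wise to every entry.
              ! The standard forbids the caller from aliasing e with the inputs,
              ! so the compiler may vectorize without the runtime overlap checks
              ! that equivalent C code would need.
              e = a/b + c/d
           end subroutine combine
        end module array_math

        program demo
           use array_math
           implicit none
           real :: a(64,64,64), b(64,64,64), c(64,64,64), d(64,64,64), e(64,64,64)
           call random_number(a); call random_number(c)
           b = 2.0; d = 4.0
           call combine(a, b, c, d, e)
           print *, e(1,1,1)
        end program demo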

    In my much younger days (I have been retired for many years) I worked for Cray Research, specifically on testing and verification of their compilers. The first compiler at Cray Research was for Fortran (not C), and it was built to generate object code for the 'vectorized' hardware of their early "Super Computers". (It did utilize "hinting comments", in which the programmer could promise that "bounds checking will not be needed here", or "this loop iterator will never go out of bounds", and so on.)
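    Those hinting comments have direct modern descendants. A sketch using the standard OpenMP simd directive (the loop body is invented for illustration):

        subroutine scatter_add(n, idx, x, y)
           implicit none
           integer, intent(in) :: n, idx(n)
           real, intent(in)    :: x(n)
           real, intent(inout) :: y(n)
           integer :: i
           ! The compiler cannot prove idx never repeats an index, so on its
           ! own it must assume the iterations depend on each other. The
           ! directive is the programmer's promise that they are independent,
           ! the same bargain the old Cray hints offered.
           !$omp simd
           do i = 1, n
              y(idx(i)) = y(idx(i)) + x(i)
           end do
        end subroutine scatter_add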

    Python is easier to learn and easier to write for general purposes. Fortran error handling is awkward, its I/O routines are less friendly, and it has no model to isolate "objects" and manage memory for their instances. But complex math on non-aliased arrays is tougher to accomplish in Python, and a Python-based runtime will almost always be much slower than the same "job" optimized by a smart Fortran compiler and its compatible libraries.

    I expect Python to dominate most new projects, but Fortran will remain useful where the speed of the math processing is critical. In some cases, the difference in execution speed within the mathematical processing can still reach 10-100x in favor of Fortran-compiled code.

    Python has a great future in heavy-duty computing, but existing Fortran code can be further enhanced by people willing to learn the language, and the code base is large: weather forecasting models, simulated "mining" of oil fields and other minerals, and the behavior of modeled "objects" (such as aircraft and cars) within a fluid, along with the fluid dynamics themselves, are all things I worked with in the distant past (even before 1990). Many newer areas of biology and chemistry are also using Fortran to build models and simulations.
    Last edited by rickst29; 16 November 2024, 03:19 PM.



  • Old Grouch
    replied
    Originally posted by L_A_G View Post
    Never had to write any myself, but I've worked with Matlab/Octave, which are largely written in it and have many of the same... "Characteristics"... Differences like row- vs column-major order, against every single other programming and scripting language I've ever used, genuinely made me feel like throwing a CRT monitor through a window.
    I have great sympathy for your difficulties, but from the other side of the fence. It is 'just' a convention, but when you are used to it, trying to use it in the opposite sense generates bugs and mental contortions one could do without.



  • Old Grouch
    replied
    Originally posted by LtdJorge View Post

    The Python libraries are written in C, C++, Fortran or, lately, Rust underneath, though.
    As I said "The standardised, validated, battle-tested library routines are what have value". Other commentators have pointed out why Fortran can still be a valid choice in certain applications. It's not ideal for everything, but in particular niches, it probably cannot easily be beaten.



  • LtdJorge
    replied
    Originally posted by Old Grouch View Post

    To answer your first question, almost certainly, yes. Python has, for the most part, taken over general-purpose scientific/technical computing. COBOL is also slowly fading away - I don't think anyone would specify COBOL for a new application these days. Fortran is still a valid choice for big numerical simulations - read a data file, do some number crunching, write a results file. The standardised, validated, battle-tested library routines are what have value, and something better will come along eventually.
    Will Fortran last for 150 years? No idea, but it would not surprise me if it were still in use. I'd be less sure about the staying power of Java, for example.
    The Python libraries are written in C, C++, Fortran or, lately, Rust underneath, though.



  • onlyLinuxLuvUBack
    replied
    Originally posted by pokeballs View Post
    They should stop wasting time and focus on making ROCm actually enticing.
    CUDA works and is officially supported on every single NVIDIA GPU. It offers lots of built-in goodies to make your life easier.
    Until AMD realizes ROCm is inferior to every other solution, they'll continue with their HIP to "help you port your code"...
    But why would I ever want to port my code to this joke?
    add systemd and you'll be loving it: "systemd_fortran_amd"



  • AdrianBc
    replied
    Originally posted by cl333r View Post

    But everything that has a beginning has also an end. Isn't Fortran slowly fading away? Are these corporations and scientists planning to run old Fortran code for the next 150 years?


    C and all languages derived from it, not only C++ but also those that have evolved further apart, like Java or Rust, have much poorer support for arrays than even the first proposal for Fortran from 1954, i.e. 70 years ago.

    Modern Fortran includes most features that can be found in any other modern programming language, while having much better support for operations that handle arrays.
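    For instance, array sections, masked assignment and whole-array reductions are part of the language itself, with no library required (a small invented example):

        program modern_arrays
           implicit none
           real :: grid(8,8)
           call random_number(grid)

           grid(2:7, 2:7) = 0.5 * grid(2:7, 2:7)   ! operate on an array section
           where (grid < 0.1) grid = 0.0           ! masked assignment
           print *, 'nonzero cells:', count(grid > 0.0)
           print *, 'column sums  :', sum(grid, dim=1)
        end program modern_arrays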

    In theory, any programming language that has means for defining abstract data types and for overloading operators and functions, for instance C++, could be used to write libraries that would eventually provide the same convenience for scientific/technical computing as Fortran.

    In practice, I am not aware of any open source C++ libraries that would be good enough.

    Other language extensions, like Python with NumPy, are much uglier and less convenient than Fortran or than what could be implemented in C++, besides being extremely slow for anything that does not consist mostly of invoking library functions written in languages other than Python.

    For writing programs for GPUs, Fortran remains an excellent choice. Used wisely, for programs where most of the work consists of array operations, Fortran allows writing programs that are simpler, smaller and easier to read than any of the modern alternatives.
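    As one illustration, with a Flang-based compiler that supports OpenMP target offload (the family AMD's announced compiler belongs to), a loop can be sent to the GPU with a single directive. The build flags in the comment are a guess and will vary by toolchain:

        program saxpy_gpu
           implicit none
           integer, parameter :: n = 1024*1024
           real, allocatable :: x(:), y(:)
           real :: a
           integer :: i

           allocate(x(n), y(n))
           a = 2.0
           call random_number(x)
           y = 0.0

           ! Offload the loop via OpenMP target offload. A plausible build
           ! line (an assumption, not a documented invocation) would be:
           !   flang -fopenmp --offload-arch=gfx90a saxpy.f90
           !$omp target teams distribute parallel do map(to: x) map(tofrom: y)
           do i = 1, n
              y(i) = y(i) + a * x(i)
           end do

           print *, 'y(1) =', y(1)
        end program saxpy_gpu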

    So no, Fortran is not at all obsolete, because it has no good replacement. Obviously, it is good only for a limited domain of applications and would be completely unsuitable for others, like writing a text editor or a Web browser.
