Fedora Stakeholders Debate Statically Linking Python For Better Performance


  • #71
    Originally posted by Vistaus View Post
    But does a mature language need a lot of new code?
    I'm not sure what you mean here. People still write a ton of C (or C-like code) even in 2019, and C is very mature.

    If we are talking about research, then for each experiment, model, or test they do, they write new code to make the system do different calculations (different expressions, matrices, whatever). Or at least it is much more likely that they will need to change something for the next test or experiment.

    It's different from business (the classic example is banks), where the core logic does not really change in decades because they keep doing the same thing over and over and over, so you still have ancient codebases in COBOL.



    • #72
      Originally posted by endrebjorsvik View Post

      Idiomatic Go does not even depend on libc. Go encourages you to not use libc, and linking against it is disabled by default when you cross-compile.
      You are correct regarding how other dependencies work in Go. All dependencies are statically linked into one fat binary together with the supplied Go runtime. Another thing to note about dependencies in Go projects is that they very often end up pulling in a plethora of random libraries from east and west on GitHub. As an example, consider the dependency list for Kubernetes. This is also partly encouraged by the Go ecosystem and toolchain.
      ...but, when that "abstract away the platform so thoroughly that we're not even depending on libc" approach leaks, it leaks hard, resulting in blog posts like this:

      https://marcan.st/2017/12/debugging-...o-runtime-bug/

      Originally posted by wizard69 View Post

      This is one of Python's strong points; I think everybody understands that! The problem with that attitude is that new technologies get ignored and struggle to build that infrastructure. It is pretty easy to see the advantages of new languages like Swift, Julia and others, but if they aren't used and don't have Python's infrastructure, they will never grow into Python replacements.

      Frankly, I do believe a Python replacement is needed. The lack of any potential for performance out of arbitrary code is a big factor that will never be overcome. Other issues, like a sound distribution model (maybe static linking will help here), are also holding it back.

      In any event, the point here is that if new technologies aren't explored, then we will never gain new solutions.
      Point taken. In hindsight, I was assuming perfect knowledge of the alternatives when imputing a motivation for choosing Python.



      • #73
        Originally posted by coder View Post
        Yes, in decades past.

        These days, fortran seems to be a direct dependency only of older libraries, with not much new code being written in it.
        Mostly it is "buried" somewhere on your system as a linear algebra or math library like BLAS, ARPACK, FFTW, LAPACK, ACML, MKL, whatever. But every modern "research-friendly" language or environment uses it.
        This stuff is definitely not dead - have a look at e.g. OpenBLAS, GraphBLAS or NVBLAS. It is still under development and being tuned to the latest CPU architectures, "ported" as GPU offload, or reimplemented with new algorithms.
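        A quick way to see that "buried" layer from Python (a minimal sketch, assuming a stock NumPy built against whichever BLAS/LAPACK the distro ships - OpenBLAS, MKL, reference netlib, ...):

        Code:
        # Ask NumPy which BLAS/LAPACK it was built against; the output
        # depends entirely on the distro/build (OpenBLAS, MKL, netlib, ...).
        import numpy as np

        np.show_config()

        # No user-written Fortran anywhere, but the solver below still lands
        # in LAPACK's *gesv routines under the hood.
        a = np.random.rand(500, 500)
        b = np.random.rand(500)
        x = np.linalg.solve(a, b)
        print(np.allclose(a @ x, b))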



        • #74
          Originally posted by Vistaus View Post
          But does a mature language need a lot of new code?
          Well, if you figure the old code is continually getting replaced (probably according to some sort of half-life function), then a language that's no longer used for new code will eventually die off. An enterprising individual could plot the number of fortran dependencies in some distro with a periodic release cycle. It'd be interesting to see if it indeed followed an exponential decay.

          As for why it tends to get replaced, consider that the type of numerical code for which Fortran-based libraries are still used is typically worth optimizing to use SIMD instruction set extensions, heavy multi-threading, or even GPGPU. While you can do all of those things in Fortran (see OpenMP, OpenACC), there are sometimes big benefits to be had from hand-crafting an implementation for a given technology. Otherwise, why aren't most people still using the original BLAS or Fortran-based FFT implementations?
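          To put rough numbers on that, here is a minimal sketch (plain CPython plus NumPy; the absolute timings depend entirely on the machine and on which BLAS NumPy is linked against) comparing a naive triple-loop multiply with the BLAS-backed one:

          Code:
          # Naive triple-loop matrix multiply vs. NumPy dispatching to a
          # SIMD/multi-threaded BLAS dgemm. Illustrative only.
          import time
          import numpy as np

          n = 200
          a = np.random.rand(n, n)
          b = np.random.rand(n, n)

          def naive_matmul(a, b):
              rows, inner, cols = a.shape[0], a.shape[1], b.shape[1]
              c = np.zeros((rows, cols))
              for i in range(rows):
                  for j in range(cols):
                      s = 0.0
                      for k in range(inner):
                          s += a[i, k] * b[k, j]
                      c[i, j] = s
              return c

          t0 = time.perf_counter()
          c1 = naive_matmul(a, b)
          t1 = time.perf_counter()
          c2 = a @ b                      # BLAS dgemm under the hood
          t2 = time.perf_counter()

          print(f"naive: {t1 - t0:.3f} s, BLAS: {t2 - t1:.6f} s")
          print("results match:", np.allclose(c1, c2))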



          • #75
            Originally posted by CochainComplex View Post
            Mostly it is "buried" somewhere on your system as a linear algebra or math library like BLAS, ARPACK, FFTW, LAPACK, ACML, MKL, whatever. But every modern "research-friendly" language or environment uses it.
            FFTW does not use it. ACML and MKL aren't even in my distro. Things like LAPACK tend to get replaced with more modern & language-native libraries, like Eigen.

            And, as I said in the above post, implementations based on new technology do not use it, such as cuBLAS and MKL-DNN.

            Originally posted by CochainComplex View Post
            This stuff is definitely not dead - have a look at e.g. OpenBLAS, GraphBLAS or NVBLAS. It is still under development and being tuned to the latest CPU architectures, "ported" as GPU offload, or reimplemented with new algorithms.
            You're just citing the long tail of the curve. Until every single one of those stragglers dies off (which they won't, as long as virtually anything still depends on them), people like you will keep claiming it's not dead.

            Well, friend, the parrot might not be dead, but he's far from lively.
            Last edited by coder; 16 November 2019, 04:10 PM.



            • #76
              Originally posted by CochainComplex View Post
              Speaking from the scientific community: Python is becoming more and more of an open-source alternative to MATLAB or Mathematica.
              The obvious open source alternative to Matlab is Octave. Is it not good enough? (I've never used either.)

              This group has worked on optimizing MATLAB programs; I don't know the current state of their work: http://www.sable.mcgill.ca/mclab/projects/

              A previous reply mentioned Julia. I imagine that would be a good alternative if you don't mind recoding.

              (The least expensive way I know of for getting a Mathematica license is by buying a Raspberry Pi.)



              • #77
                Originally posted by ssokolow View Post
                ...but, when that "abstract away the platform so thoroughly that we're not even depending on libc" approach leaks, it leaks hard, resulting in blog posts like this:

                https://marcan.st/2017/12/debugging-...o-runtime-bug/
                Wow. I'll bet that guy could've saved like a week of debugging, if he'd just run it in valgrind.

                Aside from that, it was an impressive effort. Reminds me of a compiler bug I once tracked down, on an embedded platform devoid of such memory checking tools.



                • #78
                  Originally posted by coder View Post
                  FFTW does not use it. ACML and MKL aren't even in my distro. Things like LAPACK tend to get replaced with more modern & language-native libraries, like Eigen.

                  And, as I said in the above post, implementations based on new technology do not use it, such as cuBLAS and MKL-DNN.


                  You're just citing the long tail of the curve. Until every single one of those stragglers dies off (which they won't, as long as virtually anything still depends on them), people like you will keep claiming it's not dead.

                  Well, friend, the parrot might not be dead, but he's far from lively.
                  First, I have to admit you are right about FFTW.


                  I don't know what distribution you are using, but the coder might have a look at this:

                  Since you're not using Octave:


                  There you will find all the mentioned Fortran-containing libraries and also ACML (lamd)... according to your name, I'm assuming you will be able to find the dependencies...

                  MKL can be used to speed up MATLAB and also COMSOL Multiphysics - but it is proprietary (edit: it became freeware recently). So that's why it is not in your distro, mate.
                  You might have a look here: https://software.intel.com/en-us/one...apack-routines

                  Try to build OpenBLAS without Fortran.
                  LAPACK is part of OpenBLAS; try to exclude the lapack-netlib folder and it will not build.
                  Just recently I stumbled across a build error when march=skylake is set: it tried to build the Fortran parts with AVX-512, even though Skylake does not imply AVX-512 per se...

                  So if Fortran is dead, why is Intel still selling updated versions of its Fortran compiler?

                  If it is used, it is simply not dead...
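                  That is easy to demonstrate without writing a single line of Fortran yourself (a minimal sketch, assuming a stock SciPy install; scipy.linalg.lapack exposes f2py-generated wrappers around the Fortran LAPACK routines):

                  Code:
                  # Call the Fortran-born LAPACK routine DGESV through SciPy's wrappers.
                  import numpy as np
                  from scipy.linalg import lapack

                  a = np.array([[3.0, 1.0],
                                [1.0, 2.0]])
                  b = np.array([[9.0],
                                [8.0]])

                  # Solves a @ x = b via LU factorization; the symbol ultimately comes
                  # from the Fortran LAPACK that SciPy's backend was built from.
                  lu, piv, x, info = lapack.dgesv(a, b)
                  print("info =", info)               # 0 means success
                  print("x    =", x.ravel())
                  print("check:", np.allclose(a @ x, b))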
                  Last edited by CochainComplex; 17 November 2019, 11:23 AM.



                  • #79
                    Originally posted by Hugh View Post
                    The obvious open source alternative to Matlab is Octave. Is it not good enough? (I've never used either.)

                    This group has worked on optimizing MATLAB programs; I don't know the current state of their work: http://www.sable.mcgill.ca/mclab/projects/

                    A previous reply mentioned Julia. I imagine that would be a good alternative if you don't mind recoding.

                    (The least expensive way I know of for getting a Mathematica license is by buying a Raspberry Pi.)
                    At the moment I'm using Octave - but in certain situations it is much slower. The Octave devs tried to implement some JIT functionality, but there hasn't been a lot of development recently, and it also only works with some specific LLVM versions.

                    At my institute I have the possibility to use Mathematica. But at least in a research facility funded by public money, FOSS should be preferred over proprietary software.



                    • #80
                      Originally posted by CochainComplex View Post
                      I don't know what distribution you are using, but the coder might have a look at this:

                      Since you're not using Octave:


                      There you will find all the mentioned Fortran-containing libraries and also ACML (lamd)... according to your name, I'm assuming you will be able to find the dependencies...

                      MKL can be used to speed up MATLAB and also COMSOL Multiphysics - but it is proprietary (edit: it became freeware recently). So that's why it is not in your distro, mate.
                      You might have a look here: https://software.intel.com/en-us/one...apack-routines
                      First, I'll tip my hat to your detailed post. I appreciate the inclusion of references & links.

                      Originally posted by CochainComplex View Post
                      Try to build OpenBLAS without Fortran. LAPACK is part of OpenBLAS; try to exclude the lapack-netlib folder and it will not build.
                      I didn't say anything about OpenBLAS, and neither had you.

                      Originally posted by CochainComplex View Post
                      Just recently I stumbled across a build error when march=skylake is set: it tried to build the Fortran parts with AVX-512, even though Skylake does not imply AVX-512 per se...
                      Um, it sounds like a bug in the code, which is mistakenly trying to enable AVX-512 for all Skylake archs, even though it's only Skylake-SP (or whatever they call the server cores) that has any of it.

                      But, I guess your point was that you hit an error in some fortran code - thus, you're now aware that it used fortran. Okay, but that point was never in dispute.
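                      For what it's worth, checking whether a given box actually advertises AVX-512 takes a couple of lines (a Linux-only sketch reading /proc/cpuinfo; the flag names are whatever the kernel reports):

                      Code:
                      # -march=skylake (client Skylake) does not imply AVX-512; only the
                      # Skylake-SP/X server parts have it. Check what the kernel reports.
                      flags = set()
                      with open("/proc/cpuinfo") as f:
                          for line in f:
                              if line.startswith("flags"):
                                  flags.update(line.split(":", 1)[1].split())
                                  break

                      print("avx2   :", "avx2" in flags)
                      print("avx512f:", "avx512f" in flags)   # the AVX-512 foundation subset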

                      Originally posted by CochainComplex View Post
                      So if Fortran is dead, why is Intel still selling updated versions of its Fortran compiler?

                      If it is used, it is simply not dead...
                      This is a silly argument, because I think we both basically agree on the facts. People are still using stuff that was written in Fortran, even if they're not using it for new code.

                      As for why Intel still cares about it - that's because a handful of venerable, commonly used libraries are written in it, including some used by high-profile HPC benchmarks. That doesn't change the fact that probably at least 99.9% of new scientific code is not written in Fortran.

