Crash with mplayer compilation


  • Crash with mplayer compilation

    Hi, I have a problem with the universe benchmark: it crashes during the mplayer compilation.
    mplayer compiles fine without the Phoronix Test Suite.
    I'm running openSUSE 10.3, 64-bit.
    Thank you.

  • #2
    It does not crash on compilation here, also using openSUSE 10.3 x64.



    • #3
      Originally posted by deanjo:
      It does not crash on compilation here, also using openSUSE 10.3 x64.
      I don't understand it, because the kernel and many other things compile fine.
      My PC is stable.



      • #4
        It would be nice if the compilation output could be observed rather than suppressed, to make troubleshooting compilation problems easier.



        • #5
          I had similar problems, which I tracked down: mplayer probably failed due to RAM usage. I first ran this on a system with 2GB of RAM and about 1GB of swap (not modern: a Compaq SP750 with dual P3-866s; an AMD Phenom on the site is exactly 5x faster per core, and 10x faster overall due to having twice the cores, but the SP750 still seems snappy). No problems there. I then tried running it on my home systems, which are somewhat newer Athlon XPs, and they essentially locked up. These have 512MB of RAM and 1.5GB of swap; on one of them I had top running, and it showed all 20 or so processes as "cc1" before the machine started thrashing too hard to even update top. I'd guess the mplayer benchmark died for you because you have somewhere in the 512MB-2GB range of RAM, and instead of swapping to death it simply ran out of memory and killed some copies of cc1.

          I found that the phoronix-test-suite uses "NUM_CPU_JOBS" throughout, but the variable is never set. The builds run "make -j $NUM_CPU_JOBS foo..". Make's "-j" option sets how many jobs (usually copies of gcc) to run at once, and "-j" by itself means an unlimited number. So the mplayer build fires off something like 50 gcc processes at once and the system runs out of RAM.
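          The expansion problem described above is easy to see in a shell. This is a minimal sketch, assuming a POSIX shell; "echo" stands in for the test suite's actual make invocation:

          ```shell
          # With NUM_CPU_JOBS unset, "make -j $NUM_CPU_JOBS" expands to a
          # bare "make -j", which tells make to run unlimited parallel jobs.
          unset NUM_CPU_JOBS
          echo make -j $NUM_CPU_JOBS    # prints: make -j

          # Exporting a job count caps the parallelism and the memory use.
          export NUM_CPU_JOBS=2
          echo make -j $NUM_CPU_JOBS    # prints: make -j 2
          ```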

          I ran "export NUM_CPU_JOBS=2" before running the Phoronix Test Suite and the compiling looks much more under control. (Adjust NUM_CPU_JOBS to match your number of cores and so on; I used 3 on my hyperthreaded machine.)
          Last edited by hwertz; 04-19-2008, 01:12 AM.



          • #6
            Very interesting; I did not know that -j alone means unlimited. I could have found that too. Well, I just added a bit more swap.



            • #7
              Hi, I ran the command before starting the test and it just works. Thank you.



              • #8
                Oops, this will be fixed in Git. I didn't realize I had changed the exported variable from NUM_CPU_JOBS to SYS_CPU_JOBS. I'll change it back to NUM_CPU_JOBS so that all the scripts work properly again.
                Michael Larabel
                http://www.michaellarabel.com/
