GCC, GNU Toolchain Finally Working To Establish CI/CD For Better Reliability

  • GCC, GNU Toolchain Finally Working To Establish CI/CD For Better Reliability

    Phoronix: GCC, GNU Toolchain Finally Working To Establish CI/CD For Better Reliability

    For a project as large and complex as the GNU Compiler Collection (GCC), one would reasonably have assumed that it would have set up continuous integration / continuous delivery support years ago to help ensure the reliability of this widely-used open-source compiler and the GNU Toolchain at large. But that's actually only happening now in 2021...

  • #2
    GNU is so sad. They contradict every popular opinion about open source. "Open-source codebases have higher code quality"? Not GNU! "Open-source software is better tested"? What's testing? I'm so glad the world is moving away from them and on to better projects like Clang.

    • #3
      Originally posted by Ironmask View Post
      GNU is so sad. They contradict every popular opinion about open source. "Open-source codebases have higher code quality"? Not GNU! "Open-source software is better tested"? What's testing? I'm so glad the world is moving away from them and on to better projects like Clang.
      I am not glad the world is moving to Clang.
      If it weren't for GNU, Linux and the open-source movement would never have existed.

      • #4
        There's actually always been CI-style testing going on in the background. What was missing is one centralized authority with all the System/360s and HP/PAs and *BSDs and embedded SoCs and m68ks and so on that you'd need for full CI, including all the cross-build and Canadian-cross build environments. It's a good thing there's a rando tweet from someone offering all those resources, including the time commitment, system administration, and integration with upstream feedback. Good luck to them keeping this all going.
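
        A Canadian cross, for reference, is the case where all three of configure's machine triplets differ: the compiler is built on one system, will run on a second, and emits code for a third. A minimal sketch of what driving such a configure step looks like, with triplets that are illustrative assumptions rather than anything from the article:

            # Hypothetical sketch: a Canadian-cross configure of GCC driven
            # from Python. The three triplets are illustrative examples only.
            import subprocess

            subprocess.run(
                [
                    "./configure",
                    "--build=x86_64-pc-linux-gnu",  # the machine compiling GCC right now
                    "--host=x86_64-w64-mingw32",    # the machine the built GCC will run on
                    "--target=m68k-elf",            # the machine the built GCC emits code for
                ],
                check=True,  # raise if configure fails
            )

        When only --target differs from the other two you have an ordinary cross compiler; a CI farm has to cover both kinds of combination, which is exactly the resource problem described above.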

        • #5
          Originally posted by phoronix View Post
          Phoronix: GCC, GNU Toolchain Finally Working To Establish CI/CD For Better Reliability

          For a project as large and complex as the GNU Compiler Collection (GCC), one would reasonably have assumed that it would have set up continuous integration / continuous delivery support years ago to help ensure the reliability of this widely-used open-source compiler and the GNU Toolchain at large. But that's actually only happening now in 2021...

          https://www.phoronix.com/scan.php?pa...oolchain-CI-CD
          The fact that it is moving to... Buildbot. That is what you used in 2010 when you had to build on weird platforms, and it was far behind in features/capabilities even then.

          I was trying to work out why, then it dawned on me: GitLab is open core rather than GPL, Jenkins is MIT-licensed (open source), CruiseControl is BSD (open source), and as you go through the list, Buildbot is the only GPL-licensed CI tool that exists.

          I get the ideology, but at the end of the day, choosing something like Jenkins, which has the majority of the community, and getting automatic build verification in place is more important for GCC. If using open source instead of "free" software is upsetting, use the spare time created by having a proper pull-request workflow to start building a decent solution. Although, judging by the number of dead, half-finished GNU SCM/CI projects I found while writing this post...
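
          For context on what that choice looks like in practice: Buildbot is configured in Python rather than through a web UI. A minimal sketch of a master.cfg that watches GCC's git repository and runs the test suite might look like the following; the worker name, password, poll interval, and build flags are all illustrative assumptions, not the project's actual configuration:

              # master.cfg -- a minimal, illustrative Buildbot configuration.
              # Worker name, password, and build flags are assumptions for this
              # sketch; they are not taken from the actual GCC/Sourceware setup.
              from buildbot.plugins import changes, schedulers, steps, util, worker

              c = BuildmasterConfig = {}

              # One persistent worker; a real farm would register one per platform.
              c["workers"] = [worker.Worker("x86_64-linux-worker", "example-password")]
              c["protocols"] = {"pb": {"port": 9989}}

              # Poll the upstream GCC repository for new commits every five minutes.
              c["change_source"] = [changes.GitPoller(
                  "https://gcc.gnu.org/git/gcc.git", branches=["master"], pollInterval=300)]

              # Check out, configure, build, and run the testsuite.
              factory = util.BuildFactory()
              factory.addStep(steps.Git(repourl="https://gcc.gnu.org/git/gcc.git", mode="incremental"))
              factory.addStep(steps.Configure(command=["./configure", "--disable-bootstrap"]))
              factory.addStep(steps.Compile(command=["make", "-j8"]))
              factory.addStep(steps.ShellCommand(command=["make", "-k", "check"], name="testsuite"))

              c["builders"] = [util.BuilderConfig(
                  name="gcc-trunk", workernames=["x86_64-linux-worker"], factory=factory)]

              # Trigger a build once master has been quiet for five minutes.
              c["schedulers"] = [schedulers.SingleBranchScheduler(
                  name="trunk",
                  change_filter=util.ChangeFilter(branch="master"),
                  treeStableTimer=300,
                  builderNames=["gcc-trunk"])]

          The point is just that everything, including scheduling policy, is plain Python; whether that is a strength or, as above, a sign of being behind the curve is the debate in this thread.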

          • #6
            Originally posted by bregma View Post
            There's actually always been CI-style testing going on in the background. What was missing is one centralized authority with all the System/360s and HP/PAs and *BSDs and embedded SoCs and m68ks and so on that you'd need for full CI, including all the cross-build and Canadian-cross build environments. It's a good thing there's a rando tweet from someone offering all those resources, including the time commitment, system administration, and integration with upstream feedback. Good luck to them keeping this all going.
            This shows a misunderstanding of CI. Your build agents should be 'cattle', created and destroyed by the CI. Since we need specific hosts as part of the build process, those hosts should contain only a minimum required installation (e.g. OpenSSH so the CI can reach them, and a runtime for a build agent to run in). When a job is created, all configuration files and runtimes should be deployed onto the build agent and the job constructed there, then torn down when the job ends. Your build system is thus immutable, and you eliminate the developer's "it works on my machine" problem. As part of this, making builds reproducible (as in: running a build twice should produce the same cryptographic hash of the build artifacts) becomes much easier.

            The effort here is developing the CI plugins to deploy the necessary artifacts onto each agent, but that should be a case of one plugin that deploys a different version of the library depending on the spec of the build agent, and something like Jenkins probably already has one.

            Even if you skip that step, you should be running something like Ansible to manage each build agent and keep them in step.

            This is a time-consuming problem, not a hard problem.
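
            To make the reproducibility claim concrete, here is a minimal sketch of the check described above: run the same build twice in fresh throwaway directories and compare a cryptographic hash of the artifacts. The make invocation and paths are hypothetical stand-ins for a real build job:

                # Hypothetical sketch: verify a build is reproducible by building
                # twice in fresh directories and comparing SHA-256 digests.
                import hashlib
                import subprocess
                import tempfile
                from pathlib import Path

                def build_and_hash(srcdir: str) -> str:
                    """Run one clean build into a throwaway directory and hash every artifact."""
                    with tempfile.TemporaryDirectory() as destdir:  # 'cattle': created, used, destroyed
                        subprocess.run(
                            ["make", "-C", srcdir, f"DESTDIR={destdir}", "install"],
                            check=True)
                        digest = hashlib.sha256()
                        for path in sorted(Path(destdir).rglob("*")):  # sort for a stable order
                            if path.is_file():
                                digest.update(path.read_bytes())
                        return digest.hexdigest()

                if __name__ == "__main__":
                    first = build_and_hash(".")
                    second = build_and_hash(".")
                    print("reproducible" if first == second else "NOT reproducible")

            Real-world reproducibility also means controlling timestamps, build paths, and locales, which is where the immutable, minimal agents described above pay off.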

            • #7
              Originally posted by bregma View Post
              There's actually always been CI-style testing going on in the background. What was missing is one centralized authority with all the System/360s and HP/PAs and *BSDs and embedded SoCs and m68ks and so on that you'd need for full CI, including all the cross-build and Canadian-cross build environments. It's a good thing there's a rando tweet from someone offering all those resources, including the time commitment, system administration, and integration with upstream feedback. Good luck to them keeping this all going.
              As mentioned in the article, the "rando tweet" is talking about work Red Hat is already doing to get all of this going. Whatever the infrastructure, getting more engineering resources to GCC is a good thing. I hope this allows them to iterate faster, especially on non-core projects like the lightning JIT.

              • #8
                Oh wow, DJ Delorie is involved, the founder of the DJGPP project. I cut my teeth on C/C++ during the mid-'90s using DJGPP's port of GCC to DOS, before I got into Linux.

                • #9
                  Originally posted by stevecrox View Post
                  If using open source instead of "free" software is upsetting
                  Well, MIT and BSD are free software licenses. But they do not have copyleft.

                  • #10
                    Originally posted by Ironmask View Post
                    GNU is so sad. They contradict every popular opinion about open source. "Open-source codebases have higher code quality"? Not GNU! "Open-source software is better tested"? What's testing? I'm so glad the world is moving away from them and on to better projects like Clang.
                    Jesus, Google stuff a bit before posting...
