Formalizing The LLVMLinux Project: Clang'ing Kernels


  • Formalizing The LLVMLinux Project: Clang'ing Kernels

    Phoronix: Formalizing The LLVMLinux Project: Clang'ing Kernels

    Interest in building the Linux kernel through the LLVM/Clang compiler rather than GCC continues to grow. The consolidated LLVMLinux project was announced last week...


  • #2
    Question is.... WHY?



    • #3
      Originally posted by droidhacker View Post
      Question is.... WHY?
      *burp* WHY NOT!?



      • #4
        Originally posted by droidhacker View Post
        Question is.... WHY?
        A few reasons come to mind:

        1) "Scratch your own itch" nature of open source.

        2) Clang has better diagnostics (I know GCC 4.7 has much-improved ones, but while they were working on fixing theirs, I'm sure Clang made its own even better), which means that build errors are more easily spotted. Also, Clang (and LLVM) compilation is done in stages, so if someone needs to jump into the middle of the compilation process they can do it with LLVM; with GCC you give it the code, hit compile, and can't intervene in any way until you have the binary.

        3) LLVM can be integrated into IDEs more easily thanks to its modularity. Think IntelliSense in Visual Studio.

        4) GCC has a lot of extensions that aren't part of the mainline C standard. That's not necessarily bad, but it restricts who can build the Linux kernel. If tomorrow a patch hits the LLVM mailing list that RADICALLY improves performance of binaries across the board, to the point of being better than GCC... but the kernel relies on GCC-specific extensions, then we're stuck with a slower binary. By being able to build with multiple compilers the kernel devs are ensuring that if a better compiler comes out tomorrow we'll be able to just jump onto it. (There is a sketch of one such extension at the end of this post.)

        5) Competition. GCC has a monopoly on Linux development, and I don't mean just the kernel, I mean overall. Thankfully this monopoly isn't like the corporate world where everyone gets screwed, but now that GCC has a real competitor, even if only in a few ways, the GCC developers will ideally work harder to make sure GCC stays top dog, while LLVM plays catch-up to improve performance, debugging, and modularity, and in the end both compilers become better for it.



        *) Side note: LLVM is very modular. One complaint I hear a lot about the GCC codebase is that it's a giant static ball of spaghetti, and rewriting it to be modular (like they admit they should have done from the beginning) isn't worth the time at this point. So there's a very real chance that one day the developers will look at the codebase and go "We can either abandon it... or rewrite it completely from scratch." Some will say abandon, some will say rewrite; who knows which one will win out. But if the choice is "abandon", then it'd be nice to have LLVM in a good, competitive state already, so that the community isn't waiting around while a new compiler is written from scratch and can instead just jump to LLVM/Clang.
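
        As a footnote to point 4, here is a minimal sketch (hand-written for this post and assuming gcc or clang; it is not copied from the kernel headers) of the kind of GCC extension the kernel leans on: a statement expression combined with typeof(), neither of which is part of ISO C.

        /* min() written with GCC extensions: a ({ ... }) statement
         * expression plus typeof(). gcc and clang both accept this,
         * but a strictly standard C compiler does not have to. */
        #include <stdio.h>

        #define min(x, y) ({            \
                typeof(x) _x = (x);     \
                typeof(y) _y = (y);     \
                _x < _y ? _x : _y;      \
        })

        int main(void)
        {
                printf("%d\n", min(3, 7));   /* prints 3 */
                return 0;
        }

        A compiler that only implements the standard has no obligation to accept code like this, which is exactly how the choice of compilers gets narrowed.
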
        All opinions are my own not those of my employer if you know who they are.



        • #5
          Benchmarks

          Can we get some benchmarks?
          I feel like a Clang-built Linux would perform better -- all bugs considered.



          • #6
            Originally posted by coder543 View Post
            Can we get some benchmarks?
            I feel like a Clang-built Linux would perform better -- all bugs considered.
            Doubtful at this point, but who cares? Benchmarks rarely tell you anything about how a general system will feel, and saving developer time is a lot more important than shaving a few milliseconds off users' runtime in most cases. And in the other cases, GCC/ICC are still around until LLVM catches up fully.



            • #7
              There is value in compiling with different compilers in itself, even if no one uses the binaries for anything other than tests. In the process of making it work, you remove non-standard stuff and things that only work with the first compiler. Once the code works with two different compilers, fixing things for a third is easier, and there is even less to change for a fourth.

              Of course, this is provided that the compilers are fairly good (that is, decent support for the language, reasonable __builtins, etc.) and that you take the task somewhat seriously and don't just hack around all the differences with ifdefs.

              On top of that, as mentioned above, you get the benefit of different static analysis tools.



              • #8
                Originally posted by Ericg View Post
                A few reasons come to mind:
                3) LLVM can be integrated into IDEs more easily thanks to its modularity. Think IntelliSense in Visual Studio.
                Well, AFAIK you can do the same thing using GCC's plugin architecture.

                Originally posted by Ericg View Post
                but the kernel relies on GCC-specific extensions, then we're stuck with a slower binary. By being able to build with multiple compilers the kernel devs are ensuring that if a better compiler comes out tomorrow we'll be able to just jump onto it.
                Well, the extensions the kernel uses are mainly there for improved performance; also, the kernel devs not only chose to use those extensions but in many cases proposed them to begin with. That said, I think it's great if the kernel can be compiled with more toolchains, but I think Clang/LLVM will have to support those extensions for that to happen, as I really doubt the kernel devs will be willing to give up extensions (particularly those they requested themselves) just so that it can be compiled with Clang/LLVM. Either that, or continue with the not-so-attractive alternative of maintaining patches.

                Also, the situation is not comparable to that of FreeBSD, which was stuck shipping GCC 4.2 (due to licence politics). That was not a good situation for them, and when Clang/LLVM showed up, licensed under a BSD-compatible licence, it was worth it for them to embark on the long journey of making FreeBSD compile under Clang/LLVM. Linux, on the other hand, has no problem using GCC, and as mentioned, GCC supplies lots of extensions which the Linux kernel devs depend upon. So I can see why there is no effort from the core Linux devs in this respect.

                Originally posted by Ericg View Post
                work harder to make sure GCC stays top dog, while LLVM plays catch-up to improve performance, debugging, and modularity, and in the end both compilers become better for it.
                Exactly, this is why it's great to have these two strong open source toolchains. The only better thing I can think of would be even more competition; I was hoping that Pathscale would be a third strong option, but I haven't heard anything concrete since the open source announcement.



                • #9
                  Originally posted by XorEaxEax View Post
                  Well, AFAIK you can do the same thing using GCC's plugin architecture.
                  Yes, GCC supports plugins, but LLVM offers that kind of access even to its core features. So if you want to stop the compilation process right in the middle so that you can tweak something, or see what it has done so far, you can do it.


                  Originally posted by XorEaxEax View Post
                  Well, the extensions the kernel uses are mainly there for improved performance; also, the kernel devs not only chose to use those extensions but in many cases proposed them to begin with. That said, I think it's great if the kernel can be compiled with more toolchains, but I think Clang/LLVM will have to support those extensions for that to happen, as I really doubt the kernel devs will be willing to give up extensions (particularly those they requested themselves) just so that it can be compiled with Clang/LLVM. Either that, or continue with the not-so-attractive alternative of maintaining patches.
                  I'm aware that a lot of the extensions were created at the kernel developers' request, and yes, LLVM will have to support them initially just so the kernel builds correctly. But I was thinking more along the lines of the next update to the C standard (isn't there one in the works? C++ just got updated for 2011, and I heard something about an updated C standard): if the extensions are truly good ideas, then they should be integrated into the standard, which Clang would then support anyway. And if they are just good side projects, then the pieces of kernel code that rely upon those "side project" extensions should be looked at to make sure they can't be done in a way that is in line with the C standard.
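
                  Roughly what "done in a way that is in line with the C standard" could look like, as a sketch (the min_int() helper is made up for illustration, not taken from the kernel): the type-generic statement-expression macro from earlier in the thread becomes a plain C99 static inline function, at the cost of losing typeof()'s genericity.

                  /* Standard-C rewrite of the GCC-extension min() macro:
                   * a static inline function, valid C99, accepted by any
                   * conforming compiler. */
                  #include <stdio.h>

                  static inline int min_int(int x, int y)
                  {
                          return x < y ? x : y;
                  }

                  int main(void)
                  {
                          printf("%d\n", min_int(3, 7));   /* prints 3 */
                          return 0;
                  }
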

                  Honestly, if the ONLY thing that comes out of cross-building the kernel with GCC and LLVM is that the kernel source gets a good hard look-over and things are double-checked to make sure they are written the way they SHOULD be written, then I consider this whole thing a win. We can't let the kernel code just sit there without being occasionally looked at; that's how things stagnate and become forgotten about. I'm sure there are parts of the stack that Linus could legitimately say he hasn't looked at since he threw the original source code up on the FTP server back in the '90s. And that's just not a good place to be in, because I can promise that one day someone will look at that code and go "Why didn't this get rewritten like... 10 years ago?" And by then, who knows what relies on that piece of the stack being that specific way.



                  Originally posted by XorEaxEax View Post
                  Exactly, this is why it's great to have these two strong open source toolchains. The only better thing I can think of would be even more competition; I was hoping that Pathscale would be a third strong option, but I haven't heard anything concrete since the open source announcement.
                  Yeah, I was looking forward to where Pathscale would go, but it seems to have just fallen off the deep end after the initial announcement.
                  All opinions are my own not those of my employer if you know who they are.



                  • #10
                    Originally posted by Qaz` View Post
                    Of course, this is provided that the compilers are fairly good (that is, decent support for the language, reasonable __builtins, etc.) and that you take the task somewhat seriously and don't just hack around all the differences with ifdefs.
                    We all know that in the end it will be hacked around with ifdefs.
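
                    Something like this, roughly (a stand-alone sketch; the macro names mirror the kernel's likely()/unlikely(), but the code is illustrative, not kernel source):

                    /* The usual shape of "hack around it with ifdefs": use
                     * __builtin_expect() where the compiler provides it
                     * (gcc and clang both do) and degrade to a no-op elsewhere. */
                    #include <stdio.h>

                    #if defined(__GNUC__) || defined(__clang__)
                    # define likely(x)   __builtin_expect(!!(x), 1)
                    # define unlikely(x) __builtin_expect(!!(x), 0)
                    #else
                    # define likely(x)   (x)
                    # define unlikely(x) (x)
                    #endif

                    int main(void)
                    {
                            int err = 0;

                            if (unlikely(err))
                                    printf("error path\n");
                            else
                                    printf("fast path\n");
                            return 0;
                    }

                    Both gcc and clang take the first branch of the #if, so the fallback only matters for a hypothetical third compiler.
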

