
Android NDK r16: Developers Should Start Using LLVM's libc++ With GCC On The Way Out


  • #11
    Originally posted by Luke_Wolf View Post

    You do realize that Google is one of the primary developers of LLVM/Clang right? Right?
    In fact, their contribution share is fairly small, around 4%.

    Originally posted by davidbepo View Post
    why all this hate to clang and llvm, they are a very good compiler and infrastructure
    There is nothing wrong with clang or llvm; the problem is Google pulling the rug out from under existing projects that rely on gcc, or frameworks that will be delayed and potentially broken by the forced transition to a completely different toolchain.



    • #12
      How does this change affect other projects/languages that target Android? Like apps built with Rust?



      • #13
        From what I can tell, libc++ is only officially supported on Mac and FreeBSD, and is limited to C++11 on Linux.



        • #14
          While regrettable, and something that would definitely annoy me if I were writing Android native apps, I can understand why they did it.

          GCC's rather slow adoption of newer C++ additions is something I've personally had to deal with, but I don't think it's the main reason for phasing it out. No, my feeling on the matter is that Google considers the extra work of implementing new features and then backporting them to GCC to make feature development unnecessarily time- and resource-consuming. Not only do you essentially have to do the same work twice, GCC's less modular structure makes one of those two efforts considerably more work.

          As annoying as this is to people who have built their development environments and software around GCC and its features, you can't expect Google to support what are essentially multiple implementations of the same thing and never make decisions like this on grounds of resource prioritization.



          • #15
            Originally posted by L_A_G View Post
            I don't think it's the main reason for phasing it out.
            Ack! Did you see my earlier post (the first one)? No need to guess why they did it, when you can read the (semi) official statement!



            • #16
              Originally posted by trivialfis
              I prefer cmake over autotools, it's easier to work with and I don't like m4.
              But reinventing the compiler doesn't seem logical to me. I mean, yeah, GCC is not modularized like LLVM so features like LSP or integrating in OpenCL can not be easily achieved. But I think refactoring it is much easier than building a whole new compiler, right?
              It is good to challenge the ways we are used to doing things. We may find better ones, and that makes it worthwhile. That said, I have few problems with the autotools, and the best I can say about cmake is that it needs more work. With a configure script I can just use "--help" to get a quick view of the options, but with cmake I first need to find the documentation and the part about which -DaVariable:ofSomeType=getsTheValue I need, and that's tedious. Watching endless "[12%] Building C object in/some/directory/by/the/name/of/fileA.c.o" isn't an improvement either. The good old make may produce more output, but at least one can see the entire command line and catch errors much better, which is simply more valuable to me. Nor is cmake much faster at testing for functions than a configure script, yet it produces two lines of output for each check (i.e., one line with "-- Looking for vfork" and a second with "-- Looking for vfork - found"). That, too, needs to be done better, and maybe this is what we are going to see.

              I don't think the old tool chain is the best we can do. CPUs are only getting more cores, and perhaps we can put some parallelization directly into the tools. cmake and configure should not just test for the presence of functions one by one; there is room to make this a lot faster. Even within a compiler I can see room for parallelization, e.g. by building a multi-threaded pipeline where preprocessing, lexical analysis, parsing and translation (into IR) are done in one thread, optimization in a second, and output is handled by a third. This decouples I/O from the compute-heavy parts of the compiler, and it would also allow the assembler and linker to be folded in. I don't believe keeping these parts of the build chain in large, separate processes is the final answer.
              Last edited by sdack; 07 September 2017, 03:49 AM.



              • #17
                Originally posted by coder View Post
                Ack! Did you see my earlier post (the first one)?
                No I didn't... I can see you think very highly of yourself with the way you expect everyone to read your posts before commenting, but you can't assume that. Most people will just read the article and then post their comment rather than go through pages and pages of comments first.



                • #18
                  Originally posted by nadro View Post
                  gnustl is in a really poor state, e.g. problems with locale support or missing trivial C++11 features like 'std::to_string'. I also prefer clang++ over g++ and have used this compiler in all my projects since 2009 (only to compile stuff for Windows do I still use the MS compiler).
                  That's because Google stopped updating it at an old version (4.9) while we are at 7.2, so of course it misses features. Recent versions of GCC have actually been faster than Clang to gain some C++ features.



                  • #19
                    Originally posted by sdack View Post
                    There are a few, but I'm not sure if these are all real problems or only artificial ones.

                    There is the complexity of gcc and the fact that it's written in C. Some dislike this and feel more at home with a leaner compiler and with more code in C++. Some associate "new" with "better" and "starting from scratch" with "good", or hate having to work with older generations of developers. Some also feel they have no choice even when the one available is a long-standing open source compiler. I don't believe there are real technical problems behind it, but these are problems nevertheless.

                    There is currently a trend to reinvent the entire tool chain, not just gcc. The GNU linker now gets competition from gold and LLVM's linker, lld. People also dislike autoconf and make, and prefer tools such as cmake, meson, ninja, and others. Some use them in combination, others as replacements, but there is a growing movement trying to change the way we build C/C++ software on Linux.

                    We will have to see what finds followers, what gets copied and what remains standing, and what gets abandoned. There are many opinions floating around, most of them seem to be personal preferences based on what people started out with.
                    There's only a single reason, and it has nothing to do with your list. Companies (even supposedly open-source-friendly ones like Google) are afraid of the GPL, as it doesn't allow them to keep code secret. This was the reason Sony chose FreeBSD and Clang for the PS4, and it is the reason many companies prefer LLVM as the foundation for their custom products.



                    • #20
                      First of all, there is the licence. Some people prefer completely free licences (like MIT, BSD, etc.) over ones that put more restrictions on your work (like the GPL). Which is more free is surely something people can debate intensely for a long time, but I always feel the GPL is less free than MIT.

                      And on the point of projects no longer being able to compile with clang: sorry, but I have always felt that using or relying on language extensions, especially GNU extensions, is a big error in itself. It is great that we finally have a second good and popular open source compiler, so people get pointed at bugs in their code (including cases that e.g. GCC erroneously accepts as valid even though the standard says the code is wrong).
                      So yes, it is a pain for big and old projects, but being forced to make them compile on Clang should yield much better, more standards-compliant code.

