
C++11 & The Long-Term Viability Of GCC Is Questioned


  • C++11 & The Long-Term Viability Of GCC Is Questioned

    Phoronix: C++11 & The Long-Term Viability Of GCC Is Questioned

    Back on Tuesday a basic e-mail from a developer was volleyed onto the GCC mailing list, and it has since sparked dozens of responses and a rather interesting conversation about the future of the GNU Compiler Collection and its ultimate path and viability moving forward. The initial e-mail was simply an inquiry asking about an estimated time-frame for full support of the ISO C++11 specification. Diego Novillo, a well-known GCC developer and Google employee, has even expressed fear that GCC may be past the tipping point and could die out naturally...


  • #2
    From the article:

    I do see, however, a few areas where Clang/LLVM have gone that I do not think GCC is currently thinking of entering: "toolability" (for the lack of a better term). Clang's design follows a different path than g++. It's not just a code generating parser, it is a pure parser that also generates code. The difference makes it suitable for any kind of tool that needs to understand C++: static analyzers, code re-formatters, syntax highlighters, and other similar tools. Additionally, it is designed as a library, so it can be embedded into applications.
    ^Their. Own. Fault.

    Look up the history of GCC: the Free Software Foundation, and RMS specifically, explicitly wanted GCC to be as monolithic and singular as possible so that closed-source programs couldn't rip out individual pieces and incorporate them. I'm pro-FOSS and pro-OSS, but the FSF's and RMS's short-sighted zeal is now catching up to them and has shoved them to this point: adapt or die.
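
    To make the "designed as a library" point concrete, here is a rough sketch (my own illustration, not from the article; the file name and output format are made up) of the kind of tool libclang, Clang's C API, lets you write in a few dozen lines: a little walker that prints every declaration it sees in a translation unit.

    Code:
    /* list_decls.c -- hypothetical example, not part of GCC or this thread.
       Build (roughly): cc list_decls.c -lclang -o list_decls */
    #include <stdio.h>
    #include <clang-c/Index.h>

    /* Called for every AST node: print its kind and name, then keep recursing. */
    static enum CXChildVisitResult visit(CXCursor cursor, CXCursor parent,
                                         CXClientData data)
    {
        (void)parent;
        (void)data;
        CXString kind = clang_getCursorKindSpelling(clang_getCursorKind(cursor));
        CXString name = clang_getCursorSpelling(cursor);
        printf("%s: %s\n", clang_getCString(kind), clang_getCString(name));
        clang_disposeString(kind);
        clang_disposeString(name);
        return CXChildVisit_Recurse;
    }

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s file.cpp\n", argv[0]);
            return 1;
        }
        CXIndex index = clang_createIndex(0, /* displayDiagnostics = */ 1);
        CXTranslationUnit tu = clang_parseTranslationUnit(
            index, argv[1], NULL, 0, NULL, 0, CXTranslationUnit_None);
        if (!tu) {
            fprintf(stderr, "failed to parse %s\n", argv[1]);
            return 1;
        }
        clang_visitChildren(clang_getTranslationUnitCursor(tu), visit, NULL);
        clang_disposeTranslationUnit(tu);
        clang_disposeIndex(index);
        return 0;
    }

    The same parser Clang uses to generate code is doing the work there, which is exactly what makes static analyzers, re-formatters and highlighters cheap to build on top of it. GCC's internals were never exposed that way.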
    All opinions are my own not those of my employer if you know who they are.



    • #3
      When reading things like this, I'm often curious: is this a case of hindsight being 20/20, where even the majority of GCC developers believe the technical design of LLVM has more long-term potential, being newer and having learned from the systems that came before it? Gecko vs. WebKit is another example. Does it become clearly obvious to all the developers involved over time that one design has simply evolved better?



      • #4
        Originally posted by hubick
        When reading things like this, I'm often curious: is this a case of hindsight being 20/20, where even the majority of GCC developers believe the technical design of LLVM has more long-term potential, being newer and having learned from the systems that came before it? Gecko vs. WebKit is another example. Does it become clearly obvious to all the developers involved over time that one design has simply evolved better?
        I've been reading the mailing list thread since Michael posted this article and... pretty much, yeah. Read my post above; that was the initial mistake they made. They've since made attempts to modularize GCC, and have been successful to a degree. But GCC was never designed around integration: when GCC was conceived, the "IDE" of choice was vi(m) or Emacs, and those are text editors, not full-blown IDEs. Now we have ACTUAL IDEs like Xcode, Visual Studio, Code::Blocks, Qt Creator or KDevelop, which have all come up with their own solutions for things like code analysis, because GCC didn't provide it. Fun fact: LLVM + Clang does provide a lot of those things that are nice for IDEs.
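
        As a rough illustration of the sort of thing I mean (my own sketch, with a made-up file name and made-up file/line/column arguments, nothing from the thread), an editor plugin can ask libclang for completion candidates at a cursor position:

        Code:
        /* complete_at.c -- hypothetical sketch of IDE-style code completion
           via libclang. Build (roughly): cc complete_at.c -lclang -o complete_at */
        #include <stdio.h>
        #include <stdlib.h>
        #include <clang-c/Index.h>

        int main(int argc, char **argv)
        {
            if (argc < 4) {
                fprintf(stderr, "usage: %s file.cpp line col\n", argv[0]);
                return 1;
            }
            CXIndex idx = clang_createIndex(0, 0);
            CXTranslationUnit tu = clang_parseTranslationUnit(
                idx, argv[1], NULL, 0, NULL, 0,
                clang_defaultEditingTranslationUnitOptions());
            if (!tu)
                return 1;

            /* Ask for completion candidates at file:line:col, as an editor would. */
            CXCodeCompleteResults *res = clang_codeCompleteAt(
                tu, argv[1], (unsigned)atoi(argv[2]), (unsigned)atoi(argv[3]),
                NULL, 0, clang_defaultCodeCompleteOptions());
            if (res) {
                printf("%u completion candidates\n", res->NumResults);
                clang_disposeCodeCompleteResults(res);
            }
            clang_disposeTranslationUnit(tu);
            clang_disposeIndex(idx);
            return 0;
        }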
        All opinions are my own not those of my employer if you know who they are.



        • #5
          Originally posted by Ericg
          From the article:



          ^Their. Own. Fault.

          Look up the history of GCC: the Free Software Foundation, and RMS specifically, explicitly wanted GCC to be as monolithic and singular as possible so that closed-source programs couldn't rip out individual pieces and incorporate them. I'm pro-FOSS and pro-OSS, but the FSF's and RMS's short-sighted zeal is now catching up to them and has shoved them to this point: adapt or die.
          You contradict yourself with your RMS hate nonsense.

          The real reason it started out that way is that it was appropriate for its time. There was nothing short-sighted about it, and nothing is catching up. Times simply change, and nowadays people want features no one even dreamed about back in the day.

          As the article says, maybe it is just time to put GCC into maintenance mode and start from scratch with something that merges the best sides of GCC and LLVM.
          Last edited by varikonniemi; 26 January 2013, 03:01 PM.



          • #6
            Originally posted by Ericg
            From the article:



            ^Their. Own. Fault.

            Look up the history of GCC: the Free Software Foundation, and RMS specifically, explicitly wanted GCC to be as monolithic and singular as possible so that closed-source programs couldn't rip out individual pieces and incorporate them. I'm pro-FOSS and pro-OSS, but the FSF's and RMS's short-sighted zeal is now catching up to them and has shoved them to this point: adapt or die.
            So you think it's a conspiracy that a project started in *1987* doesn't use modern design patterns?



            • #7
              Originally posted by dalingrin
              So you think it's a conspiracy that a project started in *1987* doesn't use modern design patterns?
              Not a conspiracy >_> look it up. And I'm not RMS- and FSF-hating; they have done a lot of good in the world of software. BUT their original design decision for GCC was based completely out of zeal and WAS short-sighted. The FSF was scared to death that companies were going to come by and find a way to pick out individual pieces of GCC to be used in closed-source programs and bypass the GPL. There are actual messages, and they've been posted on Phoronix before with links, where GCC developers specifically asked for GCC to be as inter-dependent as possible so that the ONLY way to use it was to use the whole thing as a giant blob.

              Btw, my call-out to "modern design patterns" was in response to them trying to modularize GCC. They did what they could, taking it from a monolithic blob to something more modular, but they couldn't take it all the way, as far as LLVM + Clang has, without tossing out much of the codebase. And as the mailing list points out, you can't garage GCC for two years while the entire codebase is refactored to fit modern design patterns; unfortunately, that's probably exactly what needs to be done. BUT they can't do it that way: any changes have to be made while GCC is still being mainlined and developed.

              If they manage to pull this off, it will probably be a lot like what happened with the X server's rework for multiple GPUs... One developer is going to grab the tree and designate a source branch as the "rework" branch. Anyone wanting to help refactor GCC will submit patches to THAT developer; after every release of GCC, that developer will rebase off mainline to stay current and keep working. And then one day a giant patch bomb will hit the mailing list that refactors the entire stack.

              ^Seriously, when Michael was doing the articles about the X server rework for multiple GPUs (I forget who did it... was it Keith or David?), if you followed the links to the mailing list, I think the one patch series was something like 90 patches that just got dumped to the mailing list all at once, because that's the way it had to be done.

              GCC can do this, and I HOPE they do, because competition is good. But it's going to be a lot of work, a lot of bugs, and a lot of headaches. If they can pull it off, they'll be better for it.
              All opinions are my own not those of my employer if you know who they are.



              • #8
                GCC is wildly successful. Read that again and repeat after me. It is wildly successful beyond the wildest dreams of its creators. It is a success story of epic proportions. It runs on every operating system on the planet, is an essential piece of most of them, and produces some of the most optimised code for an unparalleled number of architectures. Everything from toasters to the most powerful supercomputers is powered by GCC. Remember that and repeat it a few times.

                Now, 25 years later, things have changed and people expect additional things, things that were never design goals for GCC. GCC will have to adapt to some of that, and it will have to simply refuse to do some of it. GCC doesn't HAVE to be used as a stand-alone lexical analyser. As long as it is a good compiler, there will be good uses for it.

                Perhaps a changing of the guard is inevitable as people expect tighter integration with their IDEs. Perhaps LLVM will become more popular as a result, if GCC does not adapt. But being ripped apart and shoved into Microsoft's and Apple's IDEs was never a design goal for GCC. And this would still not take away from the incredible, outstanding success story that has been GCC.



                • #9
                  Originally posted by dalingrin
                  So you think it's a conspiracy that a project started in *1987* doesn't use modern design patterns?
                  Structured design and the concepts of modularity, encapsulation, loose coupling, etc., were already well-known by the early eighties.



                  • #10
                    Originally posted by Ericg
                    If they manage to pull this off, it will probably be a lot like what happened with the X server's rework for multiple GPUs... One developer is going to grab the tree and designate a source branch as the "rework" branch. Anyone wanting to help refactor GCC will submit patches to THAT developer; after every release of GCC, that developer will rebase off mainline to stay current and keep working. And then one day a giant patch bomb will hit the mailing list that refactors the entire stack.
                    EGCS all over again. It can be done. The question is whether there are enough developers capable of doing this.

                    If a big Linux player decides that this is necessary (Novell, Red Hat, Intel, etc.), then it will happen.

