Intel Ivy Bridge: GCC 4.8 vs. LLVM/Clang 3.2 SVN

  • #16
    If I'm not mistaken, GCC's focus was never primarily performance. They've always said that a fully cross-platform, full-featured compiler was their primary goal.



    • #17
      Originally posted by allquixotic:
      Unless llvm literally copy and pastes much of the micro optimization stuff from gcc, there is no reason to think that they will implement those expensive optimizations in any sort of reasonable timeframe.
      4 years ago, MSVC's compiler was generally considered to produce _much_ better optimized code than GCC, to the point that folks on proprietary OSes wondered what the hell people on Linux were thinking in using such a crappy compiler. Yet here GCC is today, producing _much_ better code in general than cl.exe does. Shock and surprise! Compilers can add optimizations! Stop the presses!

      Optimizations aren't waiting for some super secret magic voodoo that only one guy working on one compiler knows about. They're just waiting for someone to spend the time to implement them, after chewing through the long list of other things to do.

      I can perhaps understand Google working with GCC's internals. But why in God's name would an application development company like Facebook have to even look at the GCC source code?
      Facebook employs an entire compiler division, who currently all work on Clang. Quite simply, Facebook writes craploads of C++ code, has to maintain codebases and performance levels similar to Google's, and -- again -- developer tools are among the most important things to invest in if you're a company that pays tons of engineers to develop code. Investing in compilers saves Facebook money. Hence, they invest in compilers. And after researching, they invested in Clang, not GCC.

      Facebook also invests in tons of other things that you apparently don't think they have any right investing in. I suppose you think all they do is run PHP, MySQL, and some RHEL Linux machines to run their big ol' website, just like your personal Wordpress site runs on a LAMP machine in your bedroom closet. Same thing, right?

      Stick with well-supported constructs and paradigms (design patterns) and you can type out hundreds of lines of code at a time with an extremely small warning/error rate, without even using an IDE. It's not rocket science.
      The difference in efficiency is massive. Yes, you _can_ write code without good tools or diagnostics. However, you will write more and better code with better errors and tools.

      God knows I thought like you do for years, having let myself become convinced that Vim and GDB were 10000x better than any IDE could ever be. Heck, I _hated_ Visual Studio the first time I was forced to use it; it didn't have any cool text manipulation keybindings at all, and its regular expression search-and-replace was borderline useless! Of course, the difference in efficiency I find in myself now when I switch between Vim on Linux servers and VS on Windows desktop is ridiculously huge, which is why I pay attention to Clang and the integration it's gaining in tools like Vim. Sure, I still want VS to improve more than it ever will... which is why Clang allowing some new and improved IDEs to start taking over where VS currently reigns is amazing.

      If you don't believe me, fine. It's not my time or money you're wasting, so no skin off my back.

      Anyway, the whole point of a compiler is that it is a tool; it's not the end product in itself. Having complex internals vs. well-documented and elegant internals does not add any sort of value to the end product. Having a compiler that produces fast code (or small code, depending on your needs) is a value-added attribute. Having a compiler that is simply well-designed is not, by itself, a value-added attribute. If forced to choose between a compiler that has more/better value-added attributes versus one that does not, it should be a no-brainer for anybody who has ever taken a business class, or even anyone whose goal is to deliver high quality products to whomever their customers are.
      The point of the internals is that it _enables_ value-add. It enables analysis tools, IDE integration, refactoring tools, experimentation and research, project-specific plugins, C++ transformation tools, etc. etc. etc.

      I'm a big fan of developer tools and anything that enhances productivity. But I've seen some pretty incredible stuff with Eclipse CDT's gcc integration, and even more incredible stuff with Visual Studio's integration with the Microsoft C++ compiler.
      Neither of them does what it does by integrating with the compiler. CDT has its own C++ "parser," as does MSVC. Neither of them actually parses C++ correctly or completely, either, which is why they both do only very basic things, get stuff wrong a lot, and get confused and give up a lot. The tools you're talking about are also very basic: simple code-completion popups and a small handful of refactoring tools, nothing close to what the Java/C# tools offer.

      You'd think that the CDT folks wouldn't be interested in Clang as much as they are if your opinion of their tech was shared by the people who actually develop it.

      In fact, just off the top of my head I know that CodeBlocks, Eclipse CDT, and Qt Creator are all already working on Clang integration. Must be a conspiracy!

      Meanwhile our "C++" will be almost as slow as Java because we aren't using a compiler that has fully explored runtime performance optimization.
      That is sensationalist nonsense. The performance difference is not that large, and again -- the performance will improve. LLVM is very obviously not at its peak of performance. There is still a lot of low-hanging fruit that simply hasn't been picked yet because -- as you mentioned -- the project was basically at 0% just a few short years ago.


      You may not be able to notice the difference between Clang and gcc for Firefox or GNOME, but everything from video processing to scientific applications is extremely performance-sensitive, and slow-performing code can impact schedules by hours, days or even weeks, or require hardware upgrades that wouldn't be necessary otherwise.
      All of which are niche applications. GCC of course still exists, and you retain the option to do primary development with Clang and then do release-mode optimized builds with GCC. I said that repeatedly.

      If you're working on iTunes or a kernel or one of the few scientific apps that the hundreds of millions of regular computer users consume (lawlz), by all means GCC is your primary choice in compiler. For everyone else... either use both, or if you're lazy and only use one, use the one that saves you time and headaches (Clang).

      Now, more than ever, gcc is poised to be able to produce better diagnostics than it has in the past. With the introduction of C++ into gcc's source code, internal APIs are being rewritten in an object-oriented manner, replacing old spaghetti code with a layered architecture that at least belongs in the same discussion as LLVM's architecture, even if LLVM is "even more layered" or "even more well-designed".
      So your reasoning is that it will be easier to rewrite massive chunks of GCC's horribly bad, aging architecture than it is to add some optimization passes to LLVM's clean codebase? Brilliant.

      The point is, while LLVM is trying to catch up to gcc's performance, gcc is trying to catch up to LLVM's usefulness to developers.
      No, it isn't. They're working on diagnostics. They are still outright hostile to ideas like embedding GCC in an IDE, or allowing GCC to output intermediary code, or using GCC inside of Mesa for CL support, allowing the internals to support source-to-source transformations, etc. etc. etc. Literally outright opposed to making it possible, for fear that if GCC did then some evil proprietary company might use GCC's backend for a proprietary frontend or vice versa.

      I disagree, however, that the point of a compiler is to provide good diagnostics. I would rather that the compiler focus on what it does best -- compiling -- and run a different, separate tool that tells me why my code is bad.
      That's fine. That tool must have a C++ parser. That parser is Clang. Tada.
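      A minimal sketch of that workflow (an editor's illustration, not from the thread; the file name is arbitrary and the clang++ invocation is guarded in case it isn't installed): Clang's `-fsyntax-only` flag parses and type-checks a file and emits diagnostics without producing any object code, which is exactly the "separate tool that tells me why my code is bad" use case.

```shell
# Sketch: run Clang as a standalone diagnostics-only checker.
# -fsyntax-only parses and type-checks but generates no object files.
cat > check_me.cpp <<'EOF'
#include <vector>
int main() {
    std::vector<int> v;
    v.push_back(1);
    return v.size() > 0 ? 0 : 1;
}
EOF
if command -v clang++ >/dev/null 2>&1; then
    clang++ -fsyntax-only -Wall -Wextra check_me.cpp
fi
```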

      If Clang DOES fully catch up with gcc on performance of compiled code, that's great -- but I think it unlikely because clang is much more valuable if efforts are concentrated on its diagnostics,
      You're under the impression that the people working on the frontend are the same people working on the backend, and that they're somehow splitting their efforts.

      This is not the case. They are different areas for people to be interested in. There are people whose sole interest is in test case writing, in writing standard libraries, in writing tools, or in writing the support code infrastructure that the frontend and backend both use.

      Ripping out the backend is not going to magically make all the people who are interested or skilled in low-level optimization passes suddenly become interested in writing frontend code, nor is the presence of the backend causing a bunch of folks with an interest in the frontend to grudgingly spend time writing backend codegen.

      This is an honest question, because I don't know the answer for certain; but what in the world does Clang have to do with the Linux graphics stack?
      Clang is being used for the OpenCL compilers in Mesa.

      Insulting everyone who uses gcc is just as stupid as insulting everyone who uses LLVM.
      I did no such thing. I insulted people who irrationally deride Clang for no reason. The set of people who use GCC and the set of people who hate Clang are not one and the same.

      Maybe you have it over there and can find it for me, elanthis?
      It turns out that making predictions based on technology trends and following what projects are doing is actually pretty easy. You obviously disagree, but then you're also obviously not following the projects anywhere besides the little articles on Phoronix (and even then not that closely, apparently, if you missed Clang's presence in Mesa).

      I bet you also are stumped how stock brokers manage to make money rather than having an even 50/50 split in returns vs losses, huh?

      In any event, _tons_ of companies are investing heavily in Clang. Only a handful are investing in GCC. If nothing else, the manpower is going to slowly shift from GCC's aging developers to Clang's younger energized developers.

      The future is not too hard to figure out if you pay attention instead of being defensive and emotionally attached to a freaking piece of software. There are surprisingly few surprises for those who stay informed and on top of the state of things. There is not going to be some point in time when the world suddenly goes upside down and one software package is better than the other without months and years of visible developments leading up to that moment.

      Not that I should be surprised; emotional attachment to software just because "it's what Linux already uses" seems to be the rule rather than the exception on Phoronix. Maybe it has something to do with nerds not being sports fans? If you're not rooting for a sports team, I guess the human psychology needs to pick _something_ to irrationally favor and cheer on because that's what we do, so nerds do so with software. Patriotism, team spirit, school spirit, bloods or crips, bud light or coors, software fanboyism... yay humanity.
      Last edited by elanthis; 08-18-2012, 01:59 PM.



      • #18
        elanthis, what is your opinion about compiling the Linux kernel with clang/llvm? Is it worth the effort to make a clean clang/llvm-compiled Linux distro?



        • #19
          Sigh.

          Originally posted by elanthis:
          1) A year ago Clang/LLVM was nowhere close to competing with GCC. Now it not only is caught up on some benchmarks, but actually ahead on others.
          Assuming you are speaking of the 7-zip compression test, it's totally pointless, as it doesn't set any optimization flags at all, meaning GCC defaults to -O0 -- no optimization, which is meant for debugging. I have posted tests (with a script) here before which test both 7-zip's built-in synthetic benchmark and a real-life workload using -O1 to -O3, and Clang/LLVM hasn't gotten closer to GCC at all. Again, this Phoronix test shows nothing but Michael's total cluelessness when it comes to compiler testing, almost as bad as testing encoders with assembly optimizations enabled when comparing compiler efficiency. I mean, wtf?

          Originally posted by elanthis:
          And yet some doofuses want to make claims like "it will never catch up." Currently behind, yes, but it's improving at a much faster rate than GCC does.
          Based upon what statistics?

          Originally posted by elanthis:
          2) GCC's internals suck. Even many of the people who work on GCC agree that it sucks.
          Show me the posts where the GCC developers say the 'internals suck'. That they can be better or that there is something else with better internals does not equal 'suck' unless you are a troll with an agenda, like you.

          Originally posted by elanthis:
          Companies like Facebook and Google are moving to Clang because GCC costs them far more money to develop with, even if they do perhaps use GCC for release mode compiles.
          Enough with the bullshit. I have no idea of, or interest in, what Facebook does, but Google is not 'moving' to Clang; they are using and developing both projects. They have full-time employed developers working only on GCC, like Diego Novillo; they added and are maintaining Go support in GCC (Ian Lance Taylor); they have other people working on getting AddressSanitizer into GCC trunk (90% done); and there are other GCC-based projects from Google.

          Originally posted by elanthis:
          Clang has tool support. GCC does not, and probably never will, _by design_.
          Clang only has tool support on OSX, last I checked, and it's proprietary and ONLY able to run on OSX. And afaik the GCC plugin architecture exposes the internal GCC API you need to integrate your <insert tool here> with GCC. An example would be the GCC Python Plugin.

          Originally posted by elanthis:
          Nobody really gives a crap if LLVM compiles code that runs faster.
          Like f*ck they don't

          Originally posted by elanthis:
          but only by small margins that really just don't matter.
          To you perhaps (although likely you are just saying it doesn't matter because you'd say anything to defend your notion that LLVM is a gift from god), but for tons of others a difference of 10-20% performance is NOT negligible and DOES matter, because we are doing more taxing things with our computers than running an IDE or notepad.

          Originally posted by elanthis:
          Clang offers significantly better error diagnostics. Yes, this matters.
          Yes, this is where Clang really shines from a user standpoint, and it is the main reason I have Clang/LLVM in my arsenal together with GCC. It's probably best in class in this respect (I have very little experience with commercial compilers outside of ICC), and deservedly so.

          GCC devs are targeting this area for improvement while fully acknowledging Clang/LLVM:
          http://gcc.gnu.org/wiki/cauldron2012...near.29_future

          Originally posted by elanthis:
          and a very significant portion of the professionals who actually use compilers in advanced scenarios (rather than cluelessly debating them on forums and occasionally compiling something) are migrating from GCC to Clang.
          Please stop trying so hard to paint this picture of developers abandoning GCC with your anecdotal 'evidence', which consists of your claims about what is happening 'on forums'. The reality is that, despite your claims to the contrary, GCC is going as strong as ever, which is good for everyone except crazy fanboys/zelots like you and others in this thread, as choice is a GOOD THING and competition is a GOOD THING.

          Originally posted by elanthis:
          It will be some time before it supplants GCC on Linux, if it ever does, but frankly who cares? "Is default compiler on Linux distros" is about as important as "is default desktop background."
          Or just about as important as the 'default compiler on OSX'. The problem with Clang/LLVM in this respect is that it currently doesn't even compile the Linux kernel, and it's unlikely it will, as that demands a concentrated effort (just as getting Clang/LLVM to compile FreeBSD required changes in both FreeBSD and Clang/LLVM), and there seems to be no real effort being made. So no, I don't think it will ever supplant GCC on Linux, but, as you said, that has no large impact, as it can be very useful outside of the actual kernel.



          • #20
            Sigh. This isn't worth arguing ad nauseam with people who have zero investment in any of it. I give up. GCC was the bestest compiler ever 3 days before the first line of code was written for it, and it will be the bestest compiler ever until the very end of the human race. RMS is a God among men. Linux as it exists right now today will never ever change because it is the most perfectest OS ever, and change is evil and all change is a plan by Microsoft to subvert and destroy Freedom. God bless the GPL.



            • #21
              Originally posted by elanthis:
              Sigh. This isn't worth arguing ad nauseam with people who have zero investment in any of it. I give up. GCC was the bestest compiler ever 3 days before the first line of code was written for it, and it will be the bestest compiler ever until the very end of the human race. RMS is a God among men. Linux as it exists right now today will never ever change because it is the most perfectest OS ever, and change is evil and all change is a plan by Microsoft to subvert and destroy Freedom. God bless the GPL.
              Yes, go for the strawman; it's not as if I haven't constantly been advocating the great thing of having BOTH compilers while acknowledging the strengths and weaknesses of BOTH, while you have done nothing but trash-talk GCC, and yet hilariously you've complained about other totally biased people in this thread.

              Zero investment? At work we pretty much exclusively use GCC and Clang/LLVM, so I use them EVERY DAY except when I'm on vacation (which I'm just back from, sure could have done with a few more weeks boss!).

              Where does 'Linux never changing' come from? I'm sure it's the fastest-developing kernel out there, bar none. If you are alluding to it changing from the GPL (as you brought it up in conjunction), that's probably as likely as Microsoft GPL-licensing Windows.

              Bottom line: GCC and Clang/LLVM are both great compiler toolchains with different strengths and weaknesses. Both are also developed at a very high pace, something which is likely fueled by the competition they offer each other. This is awesome for all of us using these compiler toolchains, no matter whether we use both or just one of them.

              There are zelots on each side of the fence (you are one of them) who refuse to acknowledge that both offer great value and instead try to discredit the one they don't like as a toy/bloated/slow crap/etc. In my opinion, the reasons for these attacks generally have nothing to do with the compilers themselves; rather, some zelots see this as GPL vs. BSD and can't look past that.



              • #22
                Zealot is actually spelled with an "a", by the way.



                • #23
                  Compiler benchmarks are meaningless unless you use proper CFLAGS. As far as I can tell, proper CFLAGS were not set.

                  Whenever one of these is published, "helpful" individuals think the benchmarks explain everything they need to know. They then start contacting people who develop distributions, and we have to take time we would have spent programming to explain that the benchmark numbers are meaningless. Publishing meaningless benchmarks is a great way to undermine open source development. Each publication wastes developer time.
                  Last edited by ryao; 08-27-2012, 02:34 AM.

