GCC's Conversion To Git: "Within The Realm Of The Practically Achievable"


  • #31
    Originally posted by ESR in 2018
    Nobody knows how to build a machine that can crank a single process enough faster than 1.3GHz. And the problem doesn't parallelize.
    How exactly does having a 16-core (or 32-core) TR help with that?



    • #32
      Originally posted by L_A_G View Post
      Because they want to move the whole version history over to Git too. Most projects are much more pragmatic and chose some fairly recent stable release as the starting point and then re-apply patches and split off branches from there. The GCC project want to bring in the whole SVN version history, branches and all, starting from the early 2000s which means that we're talking about over 15 years of version history for a very large and active project.
      Yep, and one of the problems is that they've abused SVN significantly, with multiple improper branches that copy the entire repository (according to SVN meta-data), more than one complete deletion and then recreation of a branch (AT&T or IBM at fault here I believe) and a number of other terrible accidents or practices over the years. This means that the standard git-svn bridge that works in 80% of cases just falls over itself when you try to use it on the GCC repository. Either you end up with disjointed histories from the complete copies of the repo, broken branch histories, or a number of other problems that they want to fix during the conversion.
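      For context on why the stock bridge chokes on that pattern: a branch created by copying the entire repository root (rather than just trunk) shows up in an svnadmin dump stream as a directory node whose Node-copyfrom-path is empty. A minimal sketch of scanning a dump for that, with synthetic sample data; a real dump would need a proper parser that also skips property and file content blocks:

```python
# Sketch: flag branch creations in an SVN dump stream that copied the
# whole repository root -- the pattern that trips up git-svn.
# Paths and revision numbers below are made up for illustration.

def find_root_copies(dump_text):
    """Return (node_path, copyfrom_rev) pairs for directory nodes that
    were created by copying from the repository root."""
    suspects = []
    node_path = copyfrom_path = copyfrom_rev = None
    for line in dump_text.splitlines():
        if line.startswith("Node-path: "):
            node_path = line[len("Node-path: "):]
            copyfrom_path = copyfrom_rev = None
        elif line.startswith("Node-copyfrom-rev: "):
            copyfrom_rev = int(line[len("Node-copyfrom-rev: "):])
        elif line.startswith("Node-copyfrom-path:"):
            copyfrom_path = line[len("Node-copyfrom-path:"):].strip()
        elif line == "" and node_path is not None:
            # Blank line ends the node headers; an empty copyfrom path
            # means the copy source was the repository root.
            if copyfrom_rev is not None and copyfrom_path == "":
                suspects.append((node_path, copyfrom_rev))
            node_path = None
    return suspects

sample = (
    "Node-path: branches/full-copy\n"
    "Node-kind: dir\n"
    "Node-action: add\n"
    "Node-copyfrom-rev: 104500\n"
    "Node-copyfrom-path: \n"
    "\n"
    "Node-path: branches/normal\n"
    "Node-action: add\n"
    "Node-copyfrom-rev: 104000\n"
    "Node-copyfrom-path: trunk\n"
    "\n"
)
# flags branches/full-copy but not branches/normal
print(find_root_copies(sample))
```

      A conversion tool then has to decide what such a copy "meant" (usually a branch of trunk plus accidental baggage), which is exactly the kind of case-by-case surgery a generic bridge can't do.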



      • #33
        Originally posted by simcop2387 View Post

        Yep, and one of the problems is that they've abused SVN significantly, with multiple improper branches that copy the entire repository (according to SVN meta-data), more than one complete deletion and then recreation of a branch (AT&T or IBM at fault here I believe) and a number of other terrible accidents or practices over the years. This means that the standard git-svn bridge that works in 80% of cases just falls over itself when you try to use it on the GCC repository. Either you end up with disjointed histories from the complete copies of the repo, broken branch histories, or a number of other problems that they want to fix during the conversion.
        Still, you'd think it'd be a more architecturally reasonable approach to produce a repaired Subversion repository and then go from there, rather than trying to do it in one big jump.



        • #34
          Sometimes I wonder if Linux survives because of Stallman, Torvalds and Raymond or in spite of Stallman, Torvalds and Raymond.



          • #35
            Originally posted by Venemo View Post
            Why is this reposurgeon tool needed at all? There have been other projects that migrated to git just fine, without the need for special stuff.
            This is all at ESR's insistence. Then he decided that it wasn't his code that was poor, but the Python language he was writing it in. He made a post showing very little knowledge of Python's internals and the workings of some core modules, declared he was rewriting everything in Go, and predicted that Python would be dead in two years. Meanwhile, IEEE has since declared it the most popular language for two years running. :-)

            Now his Go version doesn't seem to be doing the job even with massive hardware. Others tried to tell him there was something wrong with his approach, since they've done this with codebases that were at least as large and it didn't take anywhere near as long, but he continued to insist he was right and it was all Python's fault. It's madness.



            • #36
              Originally posted by perpetually high View Post
              The solution wasn't just to throw more hardware at it, but also required porting the tool from Python to Go. Now that he has both, he can resume the conversion. So I guess my question is, what exactly did I miss?
              You missed the part where others explained to him in his original post that they've migrated codebases just as large with Python or even JavaScript without hitting his problems, and tried to explain that there was something wrong with how he was doing it, but he stubbornly insisted he was right. I even argued with him a bit myself, but it was useless. Remember, this is the man who insists there is no inequality in the tech field, and that if fewer black people are employed than expected it must be because that race has an inferior IQ (yes, he really said this). Now imagine trying to explain to a man like that that the approach his program is using is wrong.



              • #37
                Originally posted by alcalde View Post
                Sometimes I wonder if Linux survives because of Stallman, Torvalds and Raymond or in spite of Stallman, Torvalds and Raymond.
                Because of Torvalds's sanity, against Stallman's GPLv3 efforts, and independent of any of Raymond's developments.



                • #38
                  Originally posted by simcop2387 View Post

                  Yep, and one of the problems is that they've abused SVN significantly, with multiple improper branches that copy the entire repository (according to SVN meta-data), more than one complete deletion and then recreation of a branch (AT&T or IBM at fault here I believe) and a number of other terrible accidents or practices over the years. This means that the standard git-svn bridge that works in 80% of cases just falls over itself when you try to use it on the GCC repository. Either you end up with disjointed histories from the complete copies of the repo, broken branch histories, or a number of other problems that they want to fix during the conversion.
                  If there were only a handful of those f-ups, you could fix them manually. No need for tools that require 384 GB of RAM.



                  • #39
                    Do the people who maintain GCC not know how to write code? Also, there are existing utilities out there that already do this. We ran such a utility at our company on a repository with a few years of history and over a million lines of code; it ran on a 4790K with 8 GB of RAM. I realize that GCC is much older, but this should be a run-once-and-done process.



                    • #40
                      There's some lack of humanity in this thread, as if ESR hadn't gone through personal troubles of his own recently, and not only that. No individual should be blamed if the project hasn't been taken over by others in the meantime; that only means the whole transition to Git isn't really considered vital by the community to begin with. This sounds more like ESR's personal long-term project that will be released when it's ready, no more, no less.

                      But instead you proceeded to bash him, basically implying he's just a lazy old fart with nothing better to do than complain about not having a supercomputer at home. Really, people should be allowed to work at their own pace. This is probably all he can do right now, and it's all he could do for you over the last few years. You can't just pull the trigger assuming he's being dishonest in one way or another. Most of what's been said in this thread is so cynical and shameful it makes me wonder how bad your day was and what's really causing your frustration.

