The GCC Git Conversion Heats Up With Hopes Of Converting Over The Holidays

  • The GCC Git Conversion Heats Up With Hopes Of Converting Over The Holidays

    Phoronix: The GCC Git Conversion Heats Up With Hopes Of Converting Over The Holidays

    Decided back at the GNU Tools Cauldron was a timeline aiming to convert from Subversion, their default revision control system, to Git over the New Year's holiday. For that to happen, they wanted to decide by the middle of December which conversion system to use for bringing all their SVN commits over to Git. As such, things are now heating up ahead of that decision...


  • #2
    He should have waited until Black Friday 2019. RAM prices are now insanely cheap and the new Threadripper generation is available. But you'll only get 1024 GB of RAM, which is still much less than the largest servers with terabytes of RAM.



    • #3
      Man, this is so funny. Is he afraid that Maxim will prove there was no need to ask people for a new computer?



      • #4
        Anyone else wonder why this has taken so long? Let's say the current tools aren't any good.

        SVN stores data in either Berkeley DB format or as flat files (FSFS).

        Berkeley DB has an SQL interface, so querying it for a sequential order isn't too hard and you'd be able to pull out each "commit" fairly quickly and in sequence. The Berkeley DB backend is a pain, though: reading up on the specs, you'd need to replicate the original hosting environment reasonably closely.

        The FSFS format is a lot of small files, but each of those files has a commit record and a delta for each changed file. This is quicker and nicer because you can read the files one by one, and the only RAM requirement would be streaming the binary for new objects. The difficulty here is making sure you get the sequence correct; I'd be tempted to build a small database of file location/time/etc. so I could quickly move through the files (see the sketch at the end of this post). Otherwise you would potentially have to iterate over every file each time to find the next one.

        SVN is primarily trunk-based development but does support branches, so you'd probably want to create a test setup to make sure you recognise when a branch exists.

        Parsing multi-TiB datasets like this is standard data engineering, and you never throw one giant machine at it. Better to have a dozen small machines than one EPYC.
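
        A minimal sketch of that indexing idea, assuming a stock FSFS on-disk layout where revision files sit under db/revs/<shard>/<rev> (the exact layout depends on the repository format version, and packed shards are skipped here); the SQLite index and paths are purely illustrative, not anything the actual GCC conversion uses:

        Code:
        import os
        import sqlite3

        def index_fsfs_revisions(repo_path, index_path):
            """Walk an FSFS repository's db/revs tree and record each revision
            file's number, path, size and mtime in a small SQLite index, so a
            converter can stream revisions in order without rescanning the disk."""
            db = sqlite3.connect(index_path)
            db.execute("CREATE TABLE IF NOT EXISTS revs "
                       "(rev INTEGER PRIMARY KEY, path TEXT, size INTEGER, mtime REAL)")
            revs_dir = os.path.join(repo_path, "db", "revs")
            for entry in os.listdir(revs_dir):
                entry_path = os.path.join(revs_dir, entry)
                if os.path.isdir(entry_path):      # sharded layout: db/revs/<shard>/<rev>
                    files = [(n, os.path.join(entry_path, n)) for n in os.listdir(entry_path)]
                else:                              # old unsharded layout: db/revs/<rev>
                    files = [(entry, entry_path)]
                for name, path in files:
                    if not name.isdigit():         # file name is the revision number
                        continue
                    st = os.stat(path)
                    db.execute("INSERT OR REPLACE INTO revs VALUES (?, ?, ?, ?)",
                               (int(name), path, st.st_size, st.st_mtime))
            db.commit()
            return db

        # Stream revisions in commit order without holding them all in memory:
        # db = index_fsfs_revisions("/srv/svn/gcc", "revs.sqlite")   # hypothetical paths
        # for rev, path, size, mtime in db.execute("SELECT * FROM revs ORDER BY rev"):
        #     ...   # parse the revision record and deltas here

        Branch detection would then come from parsing each revision's changed-path records and flagging copies into paths like branches/<name>.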



        • #5
          Oh boy, this ongoing crusade sure is getting hilarious. Dude wasted 2 friggin' years and a couple thousand dollars on this pet project and still hasn't succeeded. Then he came up with the Go port, which he claimed would yield some 40x speedup but which ended up being more like 3-4x. But still not fast enough, lol. As if we didn't have time. I repeat, it's been 2 YEARS. I mean, I don't even understand the whole argument about speed: is there a time limit? Is there a detonator in his computer that explodes if the conversion isn't done within 12 hours or something? I really don't get it.

          And when someone finally stepped in to put an end to his misery, the best excuse he could come up with is that he doesn't "trust" git-svn? You know, the tool with 13 years of development history and more than 130 contributors? Yet he somehow trusts his own cr@p that's been consistently failing on him?

          I mean, how much more arrogant and cocky could you get? What a clown. I totally get it: someone else doing in a few months the thing you failed to achieve in years sure makes you look like an idiot. But coming up with excuses like this only makes it worse. You're incompetent in any case. Just let it go.



          • #6
            anarki2, come on... give the man some space. There's a new Threadripper he wants to have. How could he ask for new hardware if he didn't have an excuse?!



            • #7
              While this might be sad for anyone who gave money to him, it's just getting more and more hilarious for me.



              • #8
                I don't know why some of you are so salty or angry about this.

                I just see a guy who wants to carefully and cleanly convert the data and has been working on doing this in the spare time he has available.

                Any cowboy with some basic coding knowledge could do a fast conversion using existing tools or even some homebrew software. However, the result would almost certainly be an imperfect conversion that would leave a trail of confusion and problems for years to come. But that cowboy would have "done his job" and received praise for doing it, so he probably wouldn't care. I've met that kind of coder; they are numerous. There are far fewer coders who are methodical, far-thinking and considerate.

                If I were a GCC dev, I'd prefer that this guy take another 2 years doing this task properly instead of it being done quickly with potentially messy fallout.



                • #9
                  Originally posted by cybertraveler:
                  I don't know why some of you are so salty or angry about this.

                  I just see a guy who wants to carefully and cleanly convert the data and has been working on doing this in the spare time he has available.

                  Any cowboy with some basic coding knowledge could do a fast conversion using existing tools or even some homebrew software. However, the result would almost certainly be an imperfect conversion that would leave a trail of confusion and problems for years to come. But that cowboy would have "done his job" and received praise for doing it, so he probably wouldn't care. I've met that kind of coder; they are numerous. There are far fewer coders who are methodical, far-thinking and considerate.

                  If I were a GCC dev, I'd prefer that this guy take another 2 years doing this task properly instead of it being done quickly with potentially messy fallout.
                  It's pretty obvious this is really low priority for him. It's not that he's been working on this project for 2 years; it's that he worked on it for a few days, then abandoned it for a year, then worked on it for another couple of days, and so on... The last time this topic came up he basically asked for someone else to take it over and finish it for him. If you look at the repository he linked to with his import script, he hasn't touched it in a month, and someone else has been committing to it for the last couple of weeks.

                  Yet somehow he always pops up whenever someone else mentions doing this without his tool, and insists that would be impossible and that he's the only one who understands all the complexity of the project.

                  At the end of the day, results speak for themselves. It's time for him to put up or shut up.

                  The alternative repo is now out there. Can he show anything in particular that doesn't work in it? Supposedly he's able to create his own repo. Surely there's a relatively easy way to compare the two git repos after the different imports are done. Presumably there will be differences, and the diff between them should be pretty instructive about which tool is doing the better job - or whether it matters either way.
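
                  One rough way to do that comparison, assuming both conversions are available as local clones and use matching branch names (an assumption on my part), is to compare the tree hashes at each branch tip; identical trees mean identical content even if the commit metadata differs:

                  Code:
                  import subprocess

                  def branch_tree(repo, branch):
                      """Return the git tree hash at the tip of a branch (content only,
                      ignoring commit messages, authors and dates)."""
                      return subprocess.check_output(
                          ["git", "-C", repo, "rev-parse", f"refs/heads/{branch}^{{tree}}"],
                          text=True).strip()

                  def compare_conversions(repo_a, repo_b, branches):
                      for branch in branches:
                          a, b = branch_tree(repo_a, branch), branch_tree(repo_b, branch)
                          status = "MATCH" if a == b else "DIFFER"
                          print(f"{branch}: {status} ({a[:12]} vs {b[:12]})")

                  # Hypothetical local clone names and branch names:
                  # compare_conversions("gcc-conversion-a", "gcc-conversion-b", ["master", "gcc-9-branch"])

                  Where the trees differ, fetching one repo as a remote of the other and running git diff between the matching commits would show exactly which files the two importers disagree on.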
                  Last edited by smitty3268; 08 December 2019, 08:08 AM.



                  • #10
                    Is SVN to Git really that hard? I know GCC is pretty big, but still... If you can export the historic diffs, then you can apply them to git in a for loop (see the sketch at the end of this post). If you can't, well, then that data is already gonski and you might as well just submit what you *do* have accessible to GitHub/GitLab, or whatever.

                    Also, if you have a script that works but you need more RAM, just rent a few cycles on AWS. Heck, you get a year of compute for free...
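
                    For what it's worth, the naive for-loop version looks roughly like the sketch below, with made-up paths and a placeholder author email, and with all the genuinely hard parts (branches, tags, author mapping, mid-history layout changes) deliberately ignored - which is exactly where the dedicated conversion tools spend their effort:

                    Code:
                    import re
                    import subprocess
                    import xml.etree.ElementTree as ET

                    def svn_log_entry(wc, rev):
                        """Fetch author, date and message for one revision via 'svn log --xml'.
                        The ^/ target means 'repository root', so every revision has an entry."""
                        xml_out = subprocess.check_output(
                            ["svn", "log", "--xml", "-r", str(rev), "^/"], cwd=wc, text=True)
                        entry = ET.fromstring(xml_out).find("logentry")
                        return (entry.findtext("author", "unknown"),
                                re.sub(r"\.\d+", "", entry.findtext("date", "")),  # plain ISO 8601 for git
                                entry.findtext("msg", ""))

                    def naive_convert(svn_wc, first_rev, last_rev):
                        """Replay trunk history one revision at a time into a git repo
                        initialised inside the SVN working copy. Assumes .svn/ is listed
                        in .gitignore so the SVN metadata doesn't get committed."""
                        for rev in range(first_rev, last_rev + 1):
                            subprocess.check_call(["svn", "update", "-r", str(rev), svn_wc])
                            author, date, msg = svn_log_entry(svn_wc, rev)
                            subprocess.check_call(["git", "-C", svn_wc, "add", "-A"])
                            subprocess.check_call(
                                ["git", "-C", svn_wc, "commit", "--allow-empty",
                                 "-m", msg or f"svn r{rev}",
                                 "--author", f"{author} <{author}@example.invalid>",  # placeholder email
                                 "--date", date])

                    # naive_convert("/tmp/gcc-checkout", 1, 280000)   # hypothetical path and revision range

                    Even this toy version glosses over author maps and the replaying of every branch and tag, which is the part the whole argument is actually about.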
                    Last edited by OneTimeShot; 08 December 2019, 09:35 AM.

