It Looks Like GCC's Long-Awaited Git Conversion Could Happen This Weekend


  • It Looks Like GCC's Long-Awaited Git Conversion Could Happen This Weekend

    Phoronix: It Looks Like GCC's Long-Awaited Git Conversion Could Happen This Weekend

    The long-in-development process of converting GCC's SVN repository to Git, so that this modern distributed revision control system can be used for developing the GNU Compiler Collection in the 2020s, may finally be complete in the days ahead...

    http://www.phoronix.com/scan.php?pag...y-This-Weekend

  • L_A_G
    replied
    Originally posted by smitty3268 View Post
    Nobody cared 2 years ago. About 2 months ago they decided they needed to get it done and a lot of manpower got assigned.
    Well, it's kind of obvious that they didn't want to see it done, so they assigned it to that oaf. I doubt they finally assigned actually competent people to get it done for any reason other than realizing that the optics were so bad it looked like a Dilbert strip with Wally as the main character.



  • ermo
    replied
    Just saw this post by Maxim on the GCC ML:

    I was wrong re. r182541; I didn't notice that it is the first commit on the branch. This renders the analysis in favor of the reposurgeon conversion, not svn-git.

    -- Maxim Kuvyrkov
    https://www.linaro.org
    I have a lot of respect for people who simply own up to any errors/mistakes they've made and change their opinion accordingly.

    Thank you, Maxim, both for the work you've done keeping the reposurgeon effort honest and for the independent verification you've spent so much time doing on your own conversion over the past 6 months.



  • xorbe
    replied
    It's really important to keep all history in a repo like this. Sometimes it's the only clue as to why something is the way it is. I worked on a project like that once ... bug fixes quite often felt like archaeological research.



  • caligula
    replied
    Originally posted by L_A_G View Post
    This has to be fake news! No way ESR could get something that big done this quick! He's not procrastinated nearly enough, so this has got to be an out-of-season April Fools' joke, or ESR being replaced by a shapeshifting alien or a pod person.
    Maybe he got a free new quad-socket 128-core EPYC machine from AMD.



  • smitty3268
    replied
    Originally posted by L_A_G View Post

    If they actually meant to get it done, then why was it delegated to ESR in the first place? Assigning projects to people who are lazy and/or incompetent is a well-known strategy used by managers who want to kill a project but don't want to do it outright.
    Nobody cared 2 years ago. About 2 months ago they decided they needed to get it done and a lot of manpower got assigned.



  • Teggs
    replied
    Originally posted by CommunityMember View Post

    ...real server-class hardware and not in someone's basement.
    To be fair, Michael has a 7742 EPYC server in his basement.



  • pal666
    replied
    Originally posted by Blahblah View Post
    simultaneously pretending lost commit history is just an unimportant detail.
    history isn't lost; you have your read-only svn clone for all eternity



  • ermo
    replied
    Originally posted by CommunityMember View Post

    For some time the code/history conversion was more or less equivalent and complete between the two leading alternatives.

    Attribution and references were the long tail: sometimes the changelogs or the code itself had to be parsed, for things like determining authors rather than just committers, extracting the bug/issue/enhancement references, and matching multiple email addresses to the same person. That was handled with what are now some better heuristics, but more importantly with a lot of domain-specific adjustments (i.e. one-off fixes) to the conversion process for the gcc use case.

    No one is so arrogant (well, there is always one) as to believe there are not still some artifacts if you spend the time to look really closely, but they are of a small enough number that no one is willing to spend another month trying to address them, and then another month trying to address an even smaller number, and then another month.....
    Maxim's conversion was apparently a very useful reference for tweaking the heuristics in reposurgeon: it helped flag some overly conservative lifts in reposurgeon's GCC-specific fixups, thus leading to improved attributions (a rough sketch of what that kind of author-mapping looks like follows below). Whether or not this trip down the long tail of history lane was worth it is not for me to judge.

    As for artifacts, well, the threshold for diminishing returns has clearly been reached. AIUI, Joseph Myers ended up doing 15-20 test conversions for people to comment on and compare against Maxim's conversions (which were also updated several times in response to issues found by the reposurgeon people). Apparently there was a fixed cost of roughly 4-5 hours spent in git alone (IIRC), so each test repo conversion took at least 5 hours on whatever machine it was that Joseph used.

    Here's hoping things go smoothly over the weekend...
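
    For the curious, here's a minimal sketch of the kind of author-mapping the attribution work described above involves: translating bare SVN committer ids into full git author identities via an authors file. The format mirrors the git-svn/reposurgeon authors-file convention, but the entries, the fallback domain, and the helper functions are purely illustrative assumptions, not the actual GCC conversion tooling.

    ```python
    # Hypothetical sketch only: "svnuser = Full Name <email>" lines mirror the
    # git-svn/reposurgeon authors-file convention; the entries and the
    # @gcc.gnu.org fallback are illustrative, not the real conversion data.
    from typing import Dict, Iterable


    def load_author_map(lines: Iterable[str]) -> Dict[str, str]:
        """Parse authors-file lines into {svn_user: "Full Name <email>"}."""
        mapping: Dict[str, str] = {}
        for raw in lines:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            svn_user, _, identity = line.partition("=")
            mapping[svn_user.strip()] = identity.strip()
        return mapping


    def resolve_author(svn_user: str, mapping: Dict[str, str]) -> str:
        """Return the mapped identity, or a placeholder when unmapped."""
        return mapping.get(svn_user, f"{svn_user} <{svn_user}@gcc.gnu.org>")


    if __name__ == "__main__":
        sample = [
            "# example entries only",
            "jsm28 = Joseph Myers <joseph@codesourcery.com>",
        ]
        authors = load_author_map(sample)
        print(resolve_author("jsm28", authors))     # mapped identity
        print(resolve_author("nobody42", authors))  # placeholder fallback
    ```

    The real conversion of course goes well beyond a lookup table; matching multiple addresses to the same person and untangling authors from committers in the changelogs is exactly where the one-off fixes mentioned above come in.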
    Last edited by ermo; 01-09-2020, 03:59 PM.



  • L_A_G
    replied
    Originally posted by CommunityMember View Post
    Experienced developers joined the effort. It should surprise no one that actual talent can git-r-done.
    If they actually meant to get it done, then why was it delegated to ESR in the first place? Assigning projects to people who are lazy and/or incompetent is a well-known strategy used by managers who want to kill a project but don't want to do it outright.

