GCC's Conversion To Git Is Being Held Up By RAM, a.k.a. Crazy DDR4 Prices


  • #41
    Originally posted by carewolf View Post
    If memory performance is so important that the slower memory of cloud servers won't do, swapping is not going to cut it even remotely.
    OK. But why does it matter whether it takes 8 hours or 48 hours to complete? Time = money. If you cannot afford the RAM you'd like, then just live with the run taking a little longer. Why do we care about performance for a *one-time* import event?



    • #42
      Originally posted by theriddick View Post
      I'm guessing this developer is not very rich, or is in fact very poor. Happens.

      If I were earning $162/hr or whatever it was, I would have bought a supercomputer and been done with it all! LMAO
      Nope, not a supercomputer, if you'll allow me to nitpick. A supercomputer is like a cluster, only better integrated, i.e. several computers tied together with expensive, very fast networking and a custom OS (which is why they all run Linux). Here, if you have the money, you can just buy something like a single-socket quad-core or six-core Xeon with 512GB of RAM.
      We have RAM shortages though, and the most ridiculous part is Apple's $2500 netbooks with 16GB, most of which is used by the OS and browser.



      • #43
        "64 GB of RAM ought to be enough for anybody" - Bill Gates



        • #44
          Originally posted by milkylainen View Post
          I don't get how this can be an issue. Like for real?
          I'm betting there would at least be a bucketload of companies in the industry willing to throw a big-ass server in his general direction.
          Further proof that ESR is a fucking tool. Can we ditch this guy already?



          • #45
            I don't get what the fuck is going on here. From the article it seems this is a one-time conversion, but in the mails he talks about regression testing and other work on the source, so maybe this is something that needs to be run more than once?



            • #46
              AMD should throw a Threadripper system his way.



              • #47
                He is using Python, which is known to consume a horrendous amount of memory (people porting to C++/Rust usually get 20x-50x reductions in memory consumption), and he says the job is at the "bleeding edge of what conventional tools ... can handle"...
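
                To put a rough number on that overhead (this is just generic CPython behaviour, not anything measured from his actual conversion tooling): every Python integer is a full heap object of roughly 28 bytes, plus an 8-byte pointer slot in the list that holds it, while a packed array stores 8 bytes per value. A minimal sketch:

                ```python
                import sys
                from array import array

                N = 1_000_000

                ints_as_list = list(range(N))          # one PyObject per value, plus a pointer slot in the list
                ints_as_array = array('q', range(N))   # packed 64-bit machine integers

                list_bytes = sys.getsizeof(ints_as_list) + sum(sys.getsizeof(i) for i in ints_as_list)
                array_bytes = sys.getsizeof(ints_as_array)

                print(f"list of ints : ~{list_bytes / 1e6:.0f} MB")   # roughly 36 MB on CPython 3.x
                print(f"packed array : ~{array_bytes / 1e6:.0f} MB")  # roughly 8 MB
                print(f"overhead     : ~{list_bytes / array_bytes:.1f}x")
                ```

                That only covers flat integers; the 20x-50x figures people quote come from richer structures (dicts, strings, object graphs), so take them as anecdotal rather than guaranteed.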



                • #48
                  I'm astounded at the amassed expertise in this thread. Surely by tomorrow or Friday at the latest I will see this problem solved. Thanks guys!



                  • #49
                    Originally posted by AnonymousCoward View Post
                    I'm astounded at the amassed expertise in this thread. Surely by tomorrow or Friday at the latest I will see this problem solved. Thanks guys!
                    Your irony is overwhelming.
                    If you're not willing to try compressed swap (zram, sketched below), to change your tools, OR to change your hardware, then bitching about RAM prices might seem like a reasonable solution.
                    I'd go with the hardware solution.
                    An EPYC E-ATX board with room for more sticks. Move the existing sticks over, buy extra. Gigabyte has a 16-slot single-socket EPYC motherboard.

                    Probably less costly than spending weeks rewriting all the tools into something that isn't a memory hog to begin with.
                    We ARE being constructive. You just fail to realize it.
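
                    For anyone unfamiliar with the term, "compressed swap" here means something like zram: a compressed, RAM-backed block device used as high-priority swap, so cold pages cost CPU cycles instead of disk seeks. A minimal sketch of setting one up, assuming a Linux box with the zram module available and root access (the 64G size and lz4 compressor are placeholders, not recommendations):

                    ```python
                    import subprocess

                    DISKSIZE = "64G"   # placeholder: uncompressed capacity presented by the zram device
                    ALGO = "lz4"       # placeholder: "lzo" is the conservative fallback if lz4 isn't built in

                    # Must be run as root.
                    subprocess.run(["modprobe", "zram"], check=True)           # creates /dev/zram0
                    with open("/sys/block/zram0/comp_algorithm", "w") as f:    # compressor must be set before disksize
                        f.write(ALGO)
                    with open("/sys/block/zram0/disksize", "w") as f:          # size the device
                        f.write(DISKSIZE)
                    subprocess.run(["mkswap", "/dev/zram0"], check=True)                        # format it as swap
                    subprocess.run(["swapon", "--priority", "100", "/dev/zram0"], check=True)   # prefer it over disk swap
                    ```

                    Whether paging a working set that size is actually tolerable is a separate question, which is exactly the point carewolf made back in #41.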



                    • #50
                      Instead of asking for donations, ESR should drop Python for a proper language and learn how to write programs without ridiculous memory requirements.

