Which applications benefit from 4GB or more?


  • Which applications benefit from 4GB or more?

    Which applications benefit from 4GB or more in RAM? I can imagine databases, servers, and video editing. But what else?

    I cannot remember seeing any benchmarks where going from 2GB to 4GB gives you any benefit (except possibly in Vista). But I could be mistaken. I run a 64-bit system, so addressing 4GB wouldn't be an issue.

    Yes, RAM is "cheap" now, but I would like to have some hard evidence before I shell out $30 for another 2GB for my home machine...

  • #2
    Windows, maybe? That seems to be the consensus among the people I've heard selling (and justifying buying) RAM these days.

    Personally I'm happy with 2GB and don't notice any heavy swap activity.

    If you're not doing any multimedia editing, the only other legitimate use for a large amount of memory that I can think of is virtualisation (running one or more Xen/VBox/KVM etc. guests on your computer).

    Some games may benefit from more memory, but I doubt you'd notice the difference even with most modern games.


    • #3
      Things that benefit from more memory? Heavy multitasking, compiling, and Firefox. That said, I get by with 2GB, though I'll admit to the temptation of having more.


      • #4
        EDA (e.g. Xilinx/Altera tools, silicon compilers/simulators, etc.) and numerical computing (e.g. MATLAB/Octave) come to mind as likely candidates, although non-engineers probably don't care about such things. Ultimately it depends on the size of your design or data sets, but the footprint can grow pretty fast.


        • #5
          For normal Linux desktop use 2GB is really enough. 4GB is a bit overkill in my opinion. Despite x86_64 using more memory by default, I rarely use more than 2GB on a single desktop session.


          • #6
            I've put 6GB in my PC and made 4GB of it a tmpfs partition (like a ramdisk) :P
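
            For anyone wanting to try the same: a tmpfs mount is a one-line fstab entry. This is just an illustrative fragment (the /mnt/ramdisk mount point and 4G size are assumptions matching the post); note that unlike a classic ramdisk, tmpfs pages can be swapped out under memory pressure:

            ```
            # example /etc/fstab entry: a 4GB RAM-backed filesystem at /mnt/ramdisk
            tmpfs  /mnt/ramdisk  tmpfs  size=4G,mode=1777  0  0
            ```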


            • #7
              Extra memory helps if your usage involves a large total size of frequently used files, unless you have a very fast RAID array that makes read latency acceptable anyway.

              So here's my idea of the memory required for low latency:
              surfing/mail: 256M (minimal-memory apps: Xfce, Midori, etc.)
              surfing/mail: 512M (GNOME, KDE, Win XP)
              office work/gaming: 1G
              high-end gaming: 2G
              heavy computer work: 4G+ (graphics, engineering, data mining, software development, etc.)

              Of course you can squeeze the limits down if you know a bit more about computers and where to find lightweight applications to save memory. But managing a low-memory system takes time every day ... time is money and memory is cheaper.
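
              Before spending the money, it's easy to check whether you'd actually benefit. A quick sanity check (not a benchmark) using standard Linux tools:

              ```shell
              # current memory and swap usage in megabytes; if the free/available
              # figure stays large under your normal workload, more RAM won't buy much
              free -m
              # total swap configured, straight from the kernel (in kB)
              grep -i swaptotal /proc/meminfo
              ```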

              PS. I don't even have 1G of memory, but maybe soon. My computer is about to retire; I'm currently running dd_rescue /dev/sda2 - | ssh server.local "gzip > sda2_recover.img.gz"


              • #8
                Originally posted by suokko View Post
                Heavy computer work 4+G (graphics, engineering, data mining, software development, etc)

                Ok, I found the argument for me, here at Phoronix -

                Kernel compilation!


                • #9
                  Compiling does not need much memory; 1GB would be enough for that task. VMs are what consume a lot of memory.


                  • #10
                    Swap usage doesn't really tell you much with Linux. Most programs that can, do, or want to use large amounts of RAM usually manage memory themselves. Swap is really just an air bag for when things get totally out of control.

                    Compiling uses huge amounts of RAM. Unfortunately I'm no Linux guru; I just know enough to know that until you put so little RAM in a Linux computer that it has no choice, it won't use swap, and if you put a ton of RAM in a Linux system you have to tweak some programs to get them to use it.

                    Don't try to run virtual machines without big RAM. THAT I KNOW. Wine will even pig out on RAM pretty well if it can, and in standard Linux fashion sip it sparingly if it can't.

                    I can tell you, though: RAM is cheap. Buy it, because if we lose one more big RAM supplier, that will stop.


                    • #11
                      Originally posted by Hephasteus View Post
                      ...until you put so little ram in a linux computer that it has no choice it won't use swap...
                      As I recall, this isn't quite true: from what I've read and experienced, the kernel will swap out old pages in favor of its caching strategy even when some memory is still free.


                      • #12
                        That seems to be true, and it's the reason you should use a swap partition even with plenty of RAM (though in that case the partition can be very small, as long as it's there). A little bit of swap will be used for optimization purposes even on something like a 500GB-RAM machine.
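
                        For reference, how eagerly the kernel does this is tunable via the vm.swappiness sysctl (a real knob, usually defaulting to 60); a small sketch of inspecting it:

                        ```shell
                        # 0-100 scale: higher values make the kernel more willing to swap
                        # out program pages to preserve its page cache; default is usually 60
                        cat /proc/sys/vm/swappiness
                        # to make it less eager, set e.g. "vm.swappiness = 10" in /etc/sysctl.conf
                        ```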


                        • #13
                          It's hard to figure out. I remember working on systems with SCO Unix years ago that used swap a lot, but those were the 4MB/8MB RAM days. I can't remember if swap got used much with Red Hat 7, which I ran on very old hardware, but since I've been running Fedora from 4 to 10 I've never seen swap used with 1GB, 1.5GB, or 4GB. My old system got flaky with dual channel and I had to run single channel here and there, and I think it swapped a bit when I ran 512MB. Regardless, Linux really busts its butt trying to stay within 512MB from what I've seen. It keeps active stuff between about 360MB and 490MB most of the time on my system. But Wine and VMs throw a lot of that out the door.
                          Even Vista doesn't need much RAM if you run it with 2D graphics. So it seems heavy RAM usage won't come until we see heavy 3D usage.

                          I'd still like to understand all that better. It's so modularized, and there's so much sharing and optimizing of the sharing profiles, that it's very hard to nail down. It all works like a pool, and it seems so good at its job that increasing the pool has little effect.

                          My gut says 512MB is where it's at for pure Linux; 1GB for pure Linux plus Wine; and 512MB plus whatever size you set for each VM for virtualization.

                          I played EverQuest on a PEQ server a while back, and the server for that thing is very modest, with only 2GB of RAM. It never swaps, but my gut tells me it could use tons more RAM. I never could figure out what Perl 5.0 was doing well enough to suggest configuration changes that would alter its pool profile. It would rather monkey around with the sharing pool than use more RAM, and the monkeying around would cause it to ignore the network so long that it dropped dozens of users at a time.


                          • #14
                            Originally posted by Hephasteus View Post
                            Ram is cheap. Buy it cause if we lose one more big ram supplier that will stop.
                            Ok, that was the argument I needed...

                            $35 for 2GB is not bad.

                            Oh, now I notice! There are 4 slots, which handle 2GB each, and I have 2 free! I had forgotten.

                            So my next thread would obviously be: wouldn't 8GB of RAM be worth an additional $70?

                            Did the new 2GB help or make any difference? Well, I do think Google Earth loads faster, even with six, seven, or eight Iceweasel windows open, Amarok playing, an update running in the background, and me smiling at the System Monitor. It might just be a psychological effect, but my smile outweighs that.



                            • #15
                              I recently upgraded from 4GB to 8GB of RAM, simply because we've hit the bottom of DDR2 prices (they can only go up from here and, indeed, they are slowly starting to).

                              Did I notice a difference? Only in three cases: a) exporting a ridiculously complex model from Blender to some other format (the Python exporter allocated something close to 4.5GB of memory!), b) raising GIMP's memory limits so it can work with 16+ megapixel images, and c) running more than 3 VMs at the same time.

                              For everything else, 8GB is overkill.