People keep saying this, and it's still wrong. Efficient programming would be better than just throwing more and more RAM at the problem. Memory still has a price, and sometimes there are hard limits (architecture, embedded systems, older RAM types and so on). Furthermore, things tend to crash, slow down and show other signs of nastiness when they get too bulky: browsers, Java applications, graphics programs, you name it.
I grew up with DOS and Windows 3.11, and I (sadly) know what swapping means.
> also the worst part about java are bad programmers, not java itself.
That is very likely true. But basically everything suffers from bad programming. Some languages may even suffer from a design that was never tested thoroughly.
Still, in terms of raw performance, Java runs in a VM, which adds (at least) one more layer that slows things down. On the other hand, it is comfortable in certain respects, such as portability and relieving the coder of the need to care much about the hardware or the OS.
Stop TCPA, stupid software patents and corrupt politicians!
RAM might be cheap, but cache is not, and cache is also fixed to a given CPU; you can't just "add more cache". Bad memory usage is bad cache usage, too: the less you are able to leverage your cache, the poorer your performance becomes.
as if memory was so expensive these days ...
also the worst part about java are bad programmers, not java itself.
Originally Posted by http://landley.net/notes.html#09-12-2013
My netbook finally needed a reboot today. First time since June. I drove it deep enough into swap that an hour later it still hadn't let me move the mouse pointer. I did this by right click open in background tab on three different links in chrome, and then when it spent five minutes thrashing doing ctrl-alt-F1 to try to get a text console so I could do the "ps ax | grep flash; kill flashpid" dance. Unfortunately, these days that's no longer handled by the kernel but instead handled by X11 going through the whole desktop stack (including the gnome crap that xfce pulls in for no apparent reason), meaning it just added MORE memory pressure, and the poor little netbook with only eight gigabytes of ram went catatonic with swapping.
You'd think the out of memory killer would trigger during this, but no, it hadn't run out of SWAP. The auto-partitioning when I installed this thing gave it 4 gigs of swap, it can churn through that for days before deciding it's out of memory and it has to kill processes. Freezing and being unresponsive for hours is much better than killing _processes_.
I don't think I'm going to miss current Linux userspace when smartphones leave it behind. It used to be happy in 16 megs of ram; that was a huge box. Now it's cramped in 7,714 megs of ram. Progress!
I haven't followed the link, but I really doubt that his problem comes from 8 GB+ of executable code. What's filling the RAM is data (image caches, compilation intermediates, database caches, etc.).
When you are already using 8 GB of RAM, an executable of 5 MB or 50 MB doesn't change much.
What we have here is "bad resource prioritization by the OS", but not "not enough resources".