On my system I created a tar file of PDF files. Some of the PDFs are mostly text, but most of them are mainly images. Since those are already compressed quite a bit internally, there isn't much to gain by compressing them again.
That tarball compresses at about 60-70 MB/s and decompresses at close to 400 MB/s. The size saved is small, however: I am only saving 10-20 MB out of 246 MB. So using compression on something like that is not worth it.
Meanwhile, I used dd to create a 573 MB file full of zeros. That compresses down to just over 2.3 MB and takes about 3/4 of a second.
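If you want to reproduce the zero-file case yourself, here's a hedged sketch at a smaller scale (64 MB instead of 573 MB), using gzip as a stand-in for whatever compressor your filesystem uses, so the exact numbers will differ:

```shell
# Create a 64 MB file of zeros and compress it; highly repetitive data
# compresses down to a tiny fraction of its original size.
dd if=/dev/zero of=zeros.bin bs=1M count=64 2>/dev/null
gzip -c zeros.bin > zeros.bin.gz
ls -l zeros.bin zeros.bin.gz   # the .gz is well under 1% of the original
```

The ratio is extreme because a run of zeros is about the most compressible input a general-purpose compressor can see.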
At the other end of the spectrum, a 573 MB file made from /dev/urandom takes almost 10 seconds to compress and actually ends up slightly larger afterwards.
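The random-data case is just as easy to check. Again a smaller-scale sketch (16 MB) with gzip standing in for the real compressor: random bytes are essentially incompressible, so the output is no smaller than the input, and usually a touch larger because of container overhead.

```shell
# Create 16 MB of random data and compress it; expect no savings at all.
dd if=/dev/urandom of=rand.bin bs=1M count=16 2>/dev/null
gzip -c rand.bin > rand.bin.gz
ls -l rand.bin rand.bin.gz   # the .gz is slightly LARGER than the original
```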
So even on very fast SSDs it MAY be worth it. If you care more about read speeds, it may help; if you care about random access, it may hurt.
It also heavily depends on how smart the system is about using compression.
It's stupid to compress jpg files, gif files, or any sort of common multimedia files. They are already heavily compressed with specialized algorithms, and it's extremely unlikely that lzo is going to improve things any. But text files, documentation, program files, and many other things can benefit from compression. Some databases may benefit too, but you would think that if they did, they would already be using compression internally.
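You can see the "already compressed" effect without any media files at hand. In this sketch a gzip file stands in for a JPEG (both are already-dense byte streams); running a second compression pass over it saves nothing:

```shell
# Make some incompressible data, compress it once, then try compressing
# the compressed output again; the second pass gains nothing.
dd if=/dev/urandom of=raw.bin bs=1M count=4 2>/dev/null
gzip -c raw.bin > once.gz      # first pass: output is already dense
gzip -c once.gz > twice.gz     # second pass: no further savings
ls -l once.gz twice.gz         # twice.gz is no smaller than once.gz
```

A filesystem that is "smart" about compression typically detects this early (e.g. by sampling the first blocks) and skips such files rather than wasting CPU on them.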