Google doesn't need perfect data integrity, or perfect hardware quality for that matter. At the volume of data they process and the amount of hardware they run, they don't care if an individual server is of iffy quality; they just toss it. Those data volumes also suggest that a faster but more error-prone file system might fit them best ROI-wise.
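To make the "toss it" ROI point concrete, here's a toy back-of-envelope in Python. Every number in it (prices, fleet size, failure rates) is made up purely for illustration, not anything from Google:

```python
# Back-of-envelope sketch (all numbers are assumptions, just to illustrate
# the trade-off): pay a hardware premium for higher-integrity servers, or
# buy cheap boxes and eat a higher replacement rate.

CHEAP_SERVER = 2_000        # $ per commodity box (assumed)
PREMIUM_SERVER = 3_500      # $ per "reliable" box with ECC etc. (assumed)
FLEET = 10_000              # servers needed for the workload (assumed)
CHEAP_FAIL_RATE = 0.05      # 5% of cheap boxes tossed per year (assumed)
PREMIUM_FAIL_RATE = 0.01    # 1% for premium boxes (assumed)

def yearly_replacement_cost(unit_price, fail_rate, fleet=FLEET):
    # failed boxes aren't repaired, they're simply replaced
    return fleet * unit_price * fail_rate

cheap = yearly_replacement_cost(CHEAP_SERVER, CHEAP_FAIL_RATE)
premium = yearly_replacement_cost(PREMIUM_SERVER, PREMIUM_FAIL_RATE)
upfront_delta = FLEET * (PREMIUM_SERVER - CHEAP_SERVER)

print(f"cheap fleet replacements/yr:   ${cheap:,.0f}")      # $1,000,000
print(f"premium fleet replacements/yr: ${premium:,.0f}")    # $350,000
print(f"extra upfront for premium:     ${upfront_delta:,.0f}")  # $15,000,000
```

With these made-up numbers the premium fleet saves $650k/year in replacements but costs $15M more upfront, a payback period over two decades. That's why "cheap and toss it" can win at scale, as long as the software layer tolerates the errors.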
Like I asked you: do you use ECC in all your computers, in all situations? Do you slightly overvolt, overclock, or undervolt any part of your computer? Do you back up to tape, or to 10-year archival DVDs? The point is, you personally make trade-offs on data integrity every day.
Gee, I don't get it! How could a person or a corporation possibly survive in a world without 100% data integrity?!