Originally posted by Raka555
There's a reason nobody takes synthetic micro-benchmarks seriously: they don't accurately represent the real world. Let me ask you a few simple questions. Let's assume x32 really were faster in the real world, not just in synthetic land where you can make anything happen: why did both Debian and Gentoo drop the architecture? Why didn't Red Hat take this 40% speed-up and run with it? Hell... why isn't Clear Linux built as x32, when it's designed to squeeze as much performance out of Linux as possible? You really think these companies would leave a 40% improvement on the table when Intel's CPUs are improving at maybe 5% a year? Seriously?
No. Here, let me blow your mind for a second: in the real world, most programs spend most of their time waiting on input (whatever the source), so most real applications only care about the race back to idle. Furthermore, outside of special situations like games, it takes longer to read from the network, from a file, or from whatever else a program is waiting on than to actually process that input. And here is something to send you into total shock: in the real world, the actual data dwarfs the data structures backing it. All the pointer-shrinking in the world won't save you from that 20 MB PNG or HTML file you just mapped into memory.