Thanks for the free lecture, though.
I distinctly remember a couple of CUDA-based x264 encoders being hyped around last spring/summer, with numbers like "40% faster than the fastest quad core" being thrown around. I haven't really followed the news since, but a quick Google search reveals that the actual products don't live up to the hype. No current GPU can come close to a good CPU running x264. The x264 devs have said that some small parts of the encoding process could benefit from the GPU, but not much. Unfortunately, most (all?) of the magazine and e-mag benchmarks have compared against a very bad CPU encoder like Apple's, an ancient build of x264, or very crazy settings.
GPU decoding is what will really be nice -- not encoding, sadly. Unless you're talking about advanced pre-processing before encoding, as the GPU can help a ton with FFT-based noise/grain removal.
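For anyone curious what that FFT filtering actually amounts to: the idea is to transform a frame into the frequency domain and suppress the weak components that are mostly noise. Here's a toy 2D sketch in Python/NumPy (my own illustration, not code from any actual filter -- real tools work on overlapped 3D blocks across frames, and the threshold here is arbitrary):

```python
import numpy as np

def fft_denoise(frame, threshold):
    """Zero out frequency components whose magnitude falls below
    the threshold -- a crude form of spectral noise suppression."""
    spectrum = np.fft.fft2(frame)
    mask = np.abs(spectrum) > threshold
    return np.real(np.fft.ifft2(spectrum * mask))

# Toy "frame": flat gray image plus random grain.
rng = np.random.default_rng(0)
frame = 128.0 + rng.normal(0.0, 2.0, (64, 64))
clean = fft_denoise(frame, threshold=500.0)
```

The per-pixel FFT/IFFT work is embarrassingly parallel, which is exactly why a GPU can chew through it so much faster than a CPU.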
I still see potential in GPU-assisted encoding, however. Low-end machines will probably benefit more than high-end ones here (the latter can already handle x264 encoding in real time).
Other than that, you need to remember that we are just at the beginning of the GPGPU era. This is more or less new ground: the necessary tools are only now becoming available (no, Brook and CUDA aren't nearly good enough for widespread adoption; compute shaders and OpenCL, on the other hand, are).