I don't get it: does this affect every ATI/AMD card, or just cards older than r600?
The only games I have that impose a significant OpenGL load are Scorched3d, Critter (load due to its high framerate), and 0ad (mostly CPU-limited except at the very start). I don't have the games that are usually benchmarked, so a kernel regression affecting those games but not mine could be present and I would not see it.
Xorg is a poor graphics server. How long until Wayland!? Regressions are unnerving.
Here is a tutorial. It's for the kernel, not X, but the commands would be the same.
For my own purposes it is enough to run the X server used by Ubuntu 13.10 with current Mesa from the Oibaf PPA and a current kernel from a PPA. Canonical, however, needs to either get this fixed upstream, patch it themselves, or do as I am doing and reuse the older X server. Otherwise an LTS version of Ubuntu (14.04) will appear where proprietary drivers give no more performance than Mesa did a year ago, and Mesa performance is half what it used to be. Debian will get this too, as will any Steamboxes built on that version. I can't imagine they will be dumb enough to let this happen now that they have been made aware that the regression exists, though possibly they might miss it if it's only reported here on Phoronix.
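For anyone wanting to hold the older X server the same way, an APT pin is one approach. This is only a sketch: the package name and the `1.14.*` version pattern are assumptions (Ubuntu 13.10 shipped xserver 1.14.x; check `apt-cache policy xserver-xorg-core` for the exact version string on your system):

```
# /etc/apt/preferences.d/pin-xserver
# Pin the X server core package to the 13.10-era 1.14 series so an
# upgrade doesn't pull in the regressed version. Version pattern is
# illustrative -- verify it against `apt-cache policy xserver-xorg-core`.
Package: xserver-xorg-core
Pin: version 2:1.14.*
Pin-Priority: 1001
```

A priority above 1000 tells APT to keep (or even downgrade to) the pinned version rather than upgrade past it.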
This is a guess, but since the first X version exhibiting the terrible performance was also the first version to use DRI3, that might be related to the problem. On the other hand, switching to a non-compositing window manager did not help at all; I don't know if that has any bearing on it.
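One cheap way to test that guess, assuming a Mesa build new enough to honor the variable, is to force a single client back to DRI2 and compare framerates. `glxgears` below is just a stand-in for whatever game shows the regression:

```
# Mesa's GLX code honors LIBGL_DRI3_DISABLE in builds with DRI3 support;
# setting it makes this one client fall back to DRI2 without touching
# xorg.conf. Substitute the regressed game for glxgears.
LIBGL_DRI3_DISABLE=1 glxgears
```

If the DRI2 run is markedly faster than an otherwise identical DRI3 run, that would support the DRI3 theory; if they match, the regression is probably elsewhere.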