I've always wondered why ATI/AMD doesn't just hire five additional developers for OSS development. I'd assume it would take six months of training to get them to the point where they could produce something useful, but we'd see real results by the end of a year, and have a performant replacement for Catalyst within two.
What's the deal with that? $750k buys that team for two years. Does the revenue from Linux-related sales not justify the cost? (Admittedly, I have no idea how much of AMD's revenue is generated via Linux-related sales, nor do I understand how your SD org is run.) I do know that disappointed customers are far less likely to make subsequent purchases, so this is probably something that should have been done a couple of years ago, back when Gallium was coming about.
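(For the curious, here's the back-of-envelope behind that $750k figure. The per-developer cost is just my assumption, backed out of the total; it's not a number from AMD, and real fully-loaded costs could land elsewhere.)

    # Rough sketch of the "$750k buys a team for two years" figure.
    devs = 5
    years = 2
    cost_per_dev_per_year = 75_000  # assumed fully-loaded cost in USD; my guess, not AMD's

    total = devs * years * cost_per_dev_per_year
    print(f"${total:,}")  # -> $750,000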
On a slightly related note, I'm a bit disheartened to see everyone working so hard on legacy technology. I really thought we would all have 10-bit/chan monitors by now. I really thought we would have ray tracing by now. I really thought 'everyone' would be able to play back a 1080p Main Profile H.264 file by now. Even if I had one of the dozen 10-bit/chan panels on the market, I doubt I'd be able to drive the thing with X/Mesa (I could be totally wrong). I don't want to diminish the efforts of everyone working on radeon, but when the next CG generation or innovation becomes mainstream, we're going to be back at the starting line again.
What a strange world we live in.