You appear to be confused about the useful time frame of hardware. The HD5000 is for bleeding-edge early adopters right now. The vast majority of consumers are still running HD2000 through HD4000 hardware, if not 9800-class chips. It is one thing to use your hardware; it is another thing to use it within its useful lifetime.
This is no different from the OpenGL 4.1 threads. OpenGL 4.1 is released now so that it can be in real use 2-3 years from now. If they waited 2-3 years before releasing it, it wouldn't see real use until 4-6 years from now.
Perhaps you bought brand new top-of-the-line hardware and are miffed it isn't supported. I was miffed when my home desktop's HD4000 series card wasn't supported, too. But to claim that my HD4770 is "past its useful time frame" is just freaking idiotic.
According to the Steam hardware survey, all of 3% of users are even running HD5000 series hardware right now, and it's 9 months old. The most popular ATI series is still the HD4800. The most popular NVIDIA card is the 8800, which is now five generations old (it was followed by the 9x00 series, the GT100 series, the GT200 series, and now the GT400 series).
The lag in driver support from ATI sucks for early adopters, but it by no means results in only useless hardware being supported.
But hey, why use facts and statistics and reasoning when you can make grandiose claims and make your post seem more dramatic and meaningful?