Currently, bitcoins are mined by specialized hardware (ASICs, custom-made chips) that are orders of magnitude faster than GPUs (GPUs tend to mine in the megahash-per-second range, while ASICs mine in the gigahash-to-terahash range), and the Bitcoin protocol automatically adjusts the difficulty, and thus the rate of production of bitcoins, to the mining power of the whole network.
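That automatic adjustment can be sketched roughly like this (hypothetical helper name; the real protocol works on compact-encoded 256-bit targets, not floats, and retargets every 2016 blocks with the adjustment clamped to a factor of 4 either way):

```python
TARGET_BLOCK_TIME = 10 * 60   # seconds per block the protocol aims for
RETARGET_INTERVAL = 2016      # blocks between difficulty adjustments

def retarget_difficulty(old_difficulty, actual_timespan):
    """Scale difficulty so the next 2016 blocks again take ~2 weeks."""
    expected = TARGET_BLOCK_TIME * RETARGET_INTERVAL
    # The protocol clamps the correction to at most 4x in either direction.
    actual_timespan = max(expected // 4, min(expected * 4, actual_timespan))
    return old_difficulty * expected / actual_timespan
```

So if network hash power doubles and blocks start arriving twice as fast, difficulty roughly doubles at the next retarget, pulling block production back toward one every ten minutes.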
Litecoins can be mined on GPUs, and because their mining process is very memory-dependent, there aren't such dramatic jumps between hardware generations.
Currently, Litecoins are still mostly mined on GPUs, but some server-class CPUs can mine them in the same speed range, and the current experimental FPGAs aren't much faster than GPUs.
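The memory dependence comes from Litecoin's use of scrypt as its proof-of-work hash, which (as far as I know) is run with N=1024, r=1, p=1 over the 80-byte block header, requiring a ~128 KiB scratch table per hash so throughput is bound by memory bandwidth rather than raw ALU power. A minimal sketch with Python's hashlib:

```python
import hashlib

# scrypt needs a table of N * r * 128 bytes per hash attempt
# (128 KiB with Litecoin's N=1024, r=1), which is what blunts
# the ASIC/FPGA advantage relative to SHA-256 mining.
header = b"\x00" * 80  # placeholder 80-byte block header
pow_hash = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)
print(len(pow_hash))  # 32-byte hash, compared against the difficulty target
```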
1) Why would a 4K game need more GPU power than Full HD? Isn't the increased number of pixels compensated for by not needing AA?
2) If this card is to do anything with 4K, it's supposed to have HDMI 2.0. For many people and use cases, 30" monitors are small, and bigger monitors (called "TVs") have only HDMI, no DisplayPort. Maybe someone from AMD can tell us here, or Michael perhaps through his channels, when HDMI 2.0 is supposed to become standard on graphics cards? I thought HDMI 1.4 would be gone by now, especially on a $1500 card (!), so it seems manufacturers have run into some issues with HDMI 2.0, licensing perhaps?
1) No, it is not compensated: the GPU needs to calculate four times the number of pixels, and AA is always cheaper than that, especially post-processing AA.
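The factor of four is just the pixel counts:

```python
# UHD "4K" has exactly four times the pixels of Full HD, so per-frame
# shading work roughly quadruples before any savings from dropping AA.
full_hd = 1920 * 1080    # 2,073,600 pixels
uhd_4k = 3840 * 2160     # 8,294,400 pixels
print(uhd_4k / full_hd)  # 4.0
```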
2) HDMI 2.0 hardware is almost non-existent on the market. I think the Panasonic TX-L65WT600 was the first TV with it, and it has only launched in some regions.