Originally posted by zamadatix
Transistors are not that complicated. More of them means more die area, and more die area means fewer chips per wafer. Halve the transistor count and you get roughly twice the number of chips; quarter it and you get roughly four times as many. Not rocket science. So if AMD decided to make a 6400 and a 6500, which could be ideal 1080p solutions for the mainstream market, they could make them RIGHT NOW, on the same wafers they are using RIGHT NOW, and produce a gazillion more of them. MSRPs would be sensible, and street prices would be sensible too, because supply could actually satisfy demand.
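As a rough sketch of the die-area arithmetic above, here is the standard gross-dies-per-wafer approximation. The 300 mm wafer size and the die areas are illustrative assumptions, not figures from the post; in practice the scaling is slightly better than linear, because smaller dies waste less area at the wafer's edge (and yield better against defects):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Gross dies per wafer, using the common edge-loss approximation:
    (wafer area / die area) minus a correction for partial dies at the edge."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(gross - edge_loss)

# Illustrative comparison: a large ~520 mm^2 die vs. hypothetical
# half-size and quarter-size dies on the same 300 mm wafer.
big = dies_per_wafer(520)
half = dies_per_wafer(260)
quarter = dies_per_wafer(130)
print(big, half, quarter)
```

Running this, the quarter-size die yields somewhat more than four times as many chips as the large one, which is the point the post is making.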
But AMD (and Nvidia, of course) knows that this is a new generation of games, that PC games are unoptimized anyway, and that people need to buy a GPU. They know they can still sell overpriced GPUs to impatient gamers, and they know they can also sell to miners. So why not produce larger dies? It makes them more profit by extracting more money from the same die area, and keeping demand unsatisfied ensures the customers will keep coming. After all, someone whose GPU has died, or has simply become obsolete, will have to buy at some point. Why satisfy him with a 6400/6500 card when you can sell the rich kid a 6900, and then have him frustrated and forced to pay through the nose in the future anyway?
The cartel knows that if they supply mainstream cards, which they could easily do by simply cutting down the designs, since most people game at 1080p anyway, the market will be saturated and satisfied, prices will come down, and people will pay them again 3-4 years later when they upgrade. By refusing to serve the mainstream market, they keep getting the same revenue per transistor while also keeping demand unsatisfied for future sales. Capiche?