Originally posted by crazycheese:
Don't spread bullshit.
Power here in New York is much more expensive than it is in the rest of the USA.
The CPU uses, what, 48 W more power?
Using the PC 8 hours a day...
48 W * 8 h = 384 Wh more energy per day
30 days a month...
384 Wh * 30 days = 11.52 kWh per month
Cost per kWh here in New York...
(https://www.nationalgridus.com/niaga...our_charge.asp)
$56.63 for 1,000 kWh (1 MWh)... and that's assuming I use it during the most expensive time of day.
Extra electrical cost to use the AMD CPU rather than the Intel CPU per month...
11.52 kWh / 1000 kWh = x / $56.63
x ≈ $0.65
Whoopee, we saved 65 cents a month.
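For anyone who wants to re-run the numbers, here's the monthly-cost arithmetic as a small Python snippet (the 48 W delta, 8 h/day, and the $56.63 per 1,000 kWh rate are the figures from above, not anything I've independently verified):

```python
# Extra energy and cost per month for the higher-power CPU.
extra_watts = 48              # CPU power difference (W)
hours_per_day = 8
days_per_month = 30
rate_per_kwh = 56.63 / 1000   # $56.63 per 1,000 kWh -> $/kWh

extra_kwh_month = extra_watts * hours_per_day * days_per_month / 1000
extra_cost_month = extra_kwh_month * rate_per_kwh
print(f"{extra_kwh_month:.2f} kWh/month -> ${extra_cost_month:.2f}/month")
# prints: 11.52 kWh/month -> $0.65/month
```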
Price difference between CPUs...
AMD FX-8350 = $220 (NewEgg.com)
Intel 3770K = $320 (NewEgg.com)
Difference = $100
How many months is it going to take to pay for the CPU price difference?
$100 / $0.65 ≈ 154 months
How many years is that?
About 12.8 years.
But wait, there's more... Intel motherboards are generally more expensive than AMD boards as well.
Maybe an extra $50, which adds roughly another 6.5 years on top of that (see the sketch below).
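Folding in the hardware price gaps gives the full break-even time (the $100 CPU gap is from the Newegg prices above; the $50 board gap is my rough estimate, not a quoted price):

```python
# Months until the electricity savings cover the Intel price premium.
cpu_price_gap = 320 - 220     # i7-3770K vs FX-8350, Newegg prices above ($)
board_price_gap = 50          # rough motherboard estimate from this post ($)
extra_cost_month = 0.65       # $/month, from the snippet above

months = (cpu_price_gap + board_price_gap) / extra_cost_month
print(f"{months:.0f} months = {months / 12:.1f} years")
# prints: 231 months = 19.2 years
```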
Of course, I love chips on advanced fab processes, but when companies like Intel intentionally overprice their chips and motherboards by this much, any power savings goes flying out the window instantly.