Show me your numbers... Right now....
Originally Posted by crazycheese
Don't spread bullshit.
Power here in New York is much more expensive than it is in the rest of the USA.
The CPU uses what? 48W more power?
Using the PC 8 hours a day...
48 W * 8 h = 384 Wh more per day
30 days a month...
384 Wh * 30 days = 11.52 kWh per month
Cost per kWh here in New York...
57 dollars for 1000 kWh (1 MWh), and that's assuming I use it during the most expensive time of day.
Extra electrical cost to use the AMD CPU rather than the Intel CPU per month...
11.52 kWh / 1000 kWh * $56.63 ≈ $0.65
Whoopie, we saved 65 cents a month.
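For anyone who wants to plug in their own rate, here's a minimal Python sketch of that monthly estimate (the 48 W delta, 8 hours/day, and $56.63 per 1000 kWh are the assumptions from this post, not universal figures):

power_delta_w = 48            # extra CPU power under load, watts (assumption above)
hours_per_day = 8             # assumed daily usage
days_per_month = 30
rate_per_kwh = 56.63 / 1000   # $56.63 per 1000 kWh, in dollars per kWh

extra_kwh = power_delta_w * hours_per_day * days_per_month / 1000   # ~11.52 kWh/month
extra_cost = extra_kwh * rate_per_kwh                               # ~$0.65/month
print(f"{extra_kwh:.2f} kWh extra per month, ${extra_cost:.2f} extra per month")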
Price difference between CPUs...
AMD FX-8350 = $220 (NewEgg.com)
Intel 3770K = $320 (NewEgg.com)
Difference = $100
How many months is it going to take to pay for the CPU price difference?
$100 / $0.65 per month ≈ 153 months
How many years is that? Almost 13.
But wait, there's more... Intel motherboards are generally more expensive than AMD boards as well.
Maybe an extra 50 dollars. So we need to add another six years or so onto that...
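Putting the break-even math in one place (a rough sketch; the $100 CPU gap and the ~$50 motherboard gap are the figures above, and the $0.65/month saving comes from the earlier calculation):

cpu_price_gap = 320 - 220     # 3770K vs FX-8350, dollars (Newegg prices above)
board_price_gap = 50          # rough extra cost of an Intel board (estimate from above)
monthly_savings = 0.65        # dollars saved on electricity per month, from above

months_to_break_even = (cpu_price_gap + board_price_gap) / monthly_savings
print(f"~{months_to_break_even:.0f} months, or about {months_to_break_even / 12:.1f} years")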
Of course, I love chips on advanced fabrication processes, but when companies like Intel intentionally overprice their chips and motherboards by this much, any power savings go flying out the window instantly.
Last edited by Sidicas; 10-23-2012 at 09:05 AM.
I don't think you're understanding what you're reading.
Originally Posted by necro-lover
Peak power is worthless for anything other than determining what power supply to buy...
You cannot use peak power to make comments about how much power the PC uses over *ANY* period of time.
TDP, by definition, is the sustained power the CPU is expected to dissipate under load over a period of time.
Legit Reviews says that the average load power of a PC with the FX-8350 is 219 watts, compared to 163 watts with the Intel chip, while running CPU benchmarks.
That is roughly what the TDP difference between the chips predicts. Running games, the difference is even smaller.
Source link here...
You know it takes a *LOT* of power to turn over a car engine, but that doesn't mean you drive your car with the starter running 24/7. Peak power is an almost worthless measurement, especially for a CPU that can ramp up and down within time frames as small as a millionth of a second.
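As a quick sanity check on that claim (the 125 W and 77 W TDPs are the commonly quoted figures for the FX-8350 and i7-3770K; the load numbers are from the Legit Reviews test cited above):

fx8350_tdp_w, i7_3770k_tdp_w = 125, 77   # commonly quoted TDPs, watts
amd_load_w, intel_load_w = 219, 163      # measured average system load, watts

print("TDP difference:     ", fx8350_tdp_w - i7_3770k_tdp_w, "W")   # 48 W
print("Measured difference:", amd_load_w - intel_load_w, "W")       # 56 W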
Last edited by Sidicas; 10-23-2012 at 09:03 AM.
Pay 45% more for a 3770k CPU that gives nearly the same performance (and is slower in some cases) just to save a few dozen watts? No thanks Intel.
I pay $200 for 1000 kWh (without 20% tax).
Originally Posted by Sidicas
A $50 motherboard supports the Intel 3770K.
Let me see if I understand correctly.
i7 3770 is essentially Intel's flagship processor, and the FX-8350 beats it in some benchmarks? Even if it does lose overall (which it does) and gets trounced in others (which it also does), this is excellent news for AMD. Coupled with a much better GPU, it's a viable option again.
BTW, props for a good article. I'm often critical of poor reporting on phoronix, but articles like this offer real value!
Indeed, this does put AMD in a better position than before. But, and this is a big but, AMD's wins are based on a higher clock speed, a higher TDP, and a die which is twice the size of Ivy Bridge's. The latter means that while AMD may be pricing these aggressively, their profits are still hurting.
Originally Posted by pingufunkybeat
This is true, but it also shows that the new architecture is not the failure it originally seemed to be, and that once the growing pains are taken care of, they might have a competitive platform again.
Originally Posted by bug77
I'm guessing that moving to a smaller process is really what they need now. How realistic that is, I don't know. Didn't GlobalFoundries launch a 28nm fab this year?
110 vs. 223 watts of system consumption, without a graphics card.
Originally Posted by Sidicas
That is a 113 watt difference, or 0.113 kWh extra for every hour at full load.
Cheapest electricity in Europe is about €0.25 per kWh, which is roughly $0.33 at current exchange rates.
Using the PC 8 hours a day (assuming 2 hours at full load and the remaining 6 hours idle), that's 0.226 kWh extra per day.
2 hours load, every day, for six months: 0.113 x 2 x 30 x 6 x 0.25 ≈ €10 extra (roughly $13) in six months
8 hours load, every day, for six months (your model): 0.113 x 8 x 30 x 6 x 0.25 ≈ €41 extra (roughly $53) in six months
24/7 full load for six months: 0.113 x 24 x 30 x 6 x 0.25 ≈ €122 extra (roughly $159) in six months
In that last case the Intel config's electricity bill over six months is roughly €120 (about $156) and the AMD config's roughly €240 (about $312), since the AMD system draws about twice the power.
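For anyone who wants to rerun this with their own hours or rate, here's a minimal sketch of the scenarios above (113 W load delta, €0.25/kWh, 30-day months; PSU losses and the GPU are ignored, as noted below):

delta_kw = 0.113            # extra full-load draw, kW (223 W - 110 W)
rate_eur_per_kwh = 0.25     # assumed "cheapest in Europe" rate
months = 6

for hours_per_day in (2, 8, 24):
    extra_eur = delta_kw * hours_per_day * 30 * months * rate_eur_per_kwh
    print(f"{hours_per_day:>2} h/day at full load: ~{extra_eur:.0f} EUR extra over {months} months")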
This assumes idle consumption is the same, and here we actually see very good progress, kudos to AMD!
Because the AMD machine draws only about 10% more than the Intel one at idle (which is fine), we discard the idle running costs and calculate only the extra cost under load.
That matters, because the previous generation (I have one, an Athlon II X4) had a huge idle consumption difference: 120 watts compared to 60.
Lowering idle power to roughly the competition's level is the more important thing in the desktop market: a system that constantly draws 60 watts more whenever it is on hurts far more there than one that draws 60 or 100 watts more only under full load.
In the high-performance market it's the opposite; there, full-load power plays the vital role, and there AMD is currently not shining...
This also assumes a PSU with a power factor of 1 and 100% efficiency; with a real-world 80 Plus Bronze or Silver unit, expect the cost figures to come out roughly 20% higher.
And it does not include the GPU/APU, since we assume those to be the same, of course.
Also, a small unrelated note: higher clock speeds and higher power consumption mean more wear. AMD needs a higher clock and higher consumption for the same performance, which means the AMD system is more likely to wear out and fail first, assuming the build quality and the technology level with respect to wear-out are the same.
$0.057 per kWh is about €0.044 per kWh.
Originally Posted by Sidicas
That's criminally cheap, or wrong. Or both.
Last edited by crazycheese; 10-23-2012 at 11:01 AM.