sonnet: "I guess you might not be too smart, if you do not understand that the power measured refers to the total system and not just the CPU itself..."
FAIL... hey, maybe I'm not as smart as you expect, but in fact you are stupid! (That is a statement of fact, not an insult.)
In fact the Intel CPU is the "better" CPU. It also clocks at only 3.5 GHz vs. 4–4.6 GHz on the AMD side.
Imagine what would happen if Intel clocked its CPU at 4 GHz with a ~125 W TDP...
AMD's FX-8350 only wins some benchmarks because this CPU burns through energy like a barbecue broiler.
I don't really want to get into the turd fight here, but FTA, they claim the power was measured between the CPU and the 12 V line coming from the PSU.
Assuming the PSU has only a single rail (which is quite common again, as the dual-rail thing was put to rest not too long ago), a motherboard typically has two power connectors:
The ATX connector, carrying various voltages for overall motherboard usage.
The ATX12V (I believe it is called) connector, carrying nothing but GND and 12 V.
PCI-e can ... well, let me just quote Wikipedia: "PCI Express Graphics 1.0 (PEG) cards may increase power (from slot) to 75 W after configuration (3.3 V/3 A + 12 V/5.5 A). PCI Express 2.1 increased the power output from an x16 slot to 150 W."
I have serious doubts that all of that power is routed through the default ATX connector. I believe it is quite safe to say that the ATX connector supplies the 10 W and 25 W allowed for non-graphics cards, e.g. sound, network etc., that tap into the PCI-e power.
Now let's assume there is only one graphics card installed in the system, and that it is a plain PCI-e card drawing 75 W from the slot. Combine that with the CPU's 125 W TDP and you are already at 200 W if you max everything out. 200 W at 12 V is almost 17 amps. That's a lot of juice going through a regular ATX connector.
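A quick back-of-the-envelope check of those numbers (a sketch only; which connector actually carries which load is an assumption in this thread, not a measured fact):

```python
# Sanity check of the slot budget and the 12 V current estimate.
# P = V * I, so I = P / V.

def amps(watts, volts):
    """Current drawn at a given voltage."""
    return watts / volts

# PCI-e 1.0 (PEG) slot budget per the quoted spec: 3.3 V/3 A + 12 V/5.5 A
slot_power = 3.3 * 3 + 12 * 5.5   # nominal "75 W" slot limit

cpu_tdp = 125                     # FX-8350 TDP in watts
total = 75 + cpu_tdp              # GPU slot draw + CPU, maxed out

print(f"slot budget: {slot_power:.1f} W")
print(f"combined draw: {total} W -> {amps(total, 12):.1f} A on the 12 V rail")
```

Running this gives roughly 75.9 W for the slot budget and about 16.7 A for the combined 200 W load at 12 V, which is why pushing it all through the plain ATX connector seems implausible.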
So I think it is quite safe to assume that both graphics and CPU power are routed through that ATX12V connector. The 3.3 V/3 A is then probably handled by the same power circuitry on the motherboard that supplies the CPU with its 3.3 V (I/O still runs at 3.3 V, I believe; I could be wrong). And if it doesn't, those 3 amps can still come from the ATX connector.
Even using an integrated graphics card won't change those results, as it is very likely that the same power rail is being tapped internally.
Based on that assumption, I think it is quite safe to say that the article could be using the wrong testing methodology.
P.S. Please don't compare CPUs on a clock-for-clock basis. That isn't an accurate measurement whatsoever. It never was (though it was somewhat related in the past) and it never will be (not ever).
@oliver But they said up to 168 W, not all the time. So it's unclear whether this was Turbo Core, some strange spike, or even overclocking (Intel-sponsored test).
It would be really great if anyone with the equipment and skills could re-test this.
These people are the most advanced experts in Germany.
Intel lost the European antitrust lawsuit and paid 1.5 billion euros because of a statement by these heise.de/c't experts.
So in fact you are a FOOL if you think you can mess with these kinds of experts.
Well, it's a slight improvement over Bulldozer, and we already knew that wasn't competitive. Even AMD's CEO acknowledges that. So, no surprise there.
"368 Watt under heavy load" — so the FX-8150 uses 172 Watt? I wonder why nobody mentioned that in the past. The 368 W consumption of the FX-8350 test system is lower than the 372 W of the FX-8150 system, even though the new one runs at 4.0 instead of 3.6 GHz.
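The 172 W figure follows from a simple subtraction, sketched below. The key assumption (mine, not from the article) is that everything except the CPU draws the same power in both test systems, and that the FX-8350 really peaks at the claimed 168 W:

```python
# Implied FX-8150 CPU draw, assuming identical rest-of-system power
# in both test setups. All figures are the ones quoted in this thread.

fx8350_system = 368   # W, FX-8350 test system under heavy load
fx8150_system = 372   # W, FX-8150 test system under heavy load
fx8350_cpu    = 168   # W, claimed CPU-only peak for the FX-8350

rest_of_system = fx8350_system - fx8350_cpu     # everything but the CPU
fx8150_cpu = fx8150_system - rest_of_system     # implied FX-8150 CPU draw

print(f"implied FX-8150 CPU draw: {fx8150_cpu} W")  # 172 W
```

If any of those assumptions is off (different GPUs, different PSU efficiency at different loads), the 172 W number falls apart, which is exactly why a re-test would help.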
Seriously: I'm sure there's something wrong at heise. It may simply be that they got a faulty CPU by accident.