Your rumor monger is clearly on crack.
It is simply *IMPOSSIBLE* for it to leak 30 amps, let alone USE that much.
The wall outlet you plug your computer into is on a 15-amp breaker or fuse (most likely; that's the North American standard). That covers the ENTIRE computer: monitor, hard disks, CPU, PRINTER, and whatever other junk you happen to plug into that circuit.
So assuming your POWER SUPPLY didn't catch fire first, the breaker would DEFINITELY trip if it drew 30 amps.
You clearly have no electrical knowledge, and no clue what you are talking about. Better read this first: http://en.wikipedia.org/wiki/Ohm%27s_law , then come back.
I also don't believe that 30 A figure.
If you search the web for the power draw of the 2900 XT, you will find that it draws about 30 W more at idle than the 1950 XTX.
Assuming a core voltage of about 1.2 V, that is roughly 25 A MORE than the 1950 XTX on the core rail, so its total leakage is probably more like 40 A to 60 A.
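The arithmetic behind that estimate, as a quick sketch (the 1.2 V core voltage is an assumption in this thread, not a published spec):

```python
# Convert the reported extra idle power into extra core-rail current.
# Figures are the ones quoted above; the core voltage is assumed.
extra_idle_watts = 30.0   # extra idle draw of the 2900 XT vs. the 1950 XTX
core_voltage = 1.2        # assumed vGPU core voltage

extra_amps = extra_idle_watts / core_voltage  # I = P / V
print(f"extra core-rail current: {extra_amps:.0f} A")  # prints 25 A
```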
A 15-amp breaker is on a 220 V or 110 V circuit, so it can carry about 220 × 15 = 3300 W, or 110 × 15 = 1650 W.
The R600's 30 amps, on the other hand, flow on the low-voltage core rail: 30 A @ 1.15 V = 34.5 W. As you can see, 34.5 W << 3300 W, so your assumption doesn't hold.
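To make the volts-times-amps point concrete, here is the same comparison worked out numerically (all numbers taken from the post above):

```python
# Power = voltage * current. The same current figure means wildly
# different wattage depending on the voltage it flows at.
breaker_amps = 15.0
gpu_rail_amps = 30.0
gpu_rail_volts = 1.15

wall_limit_220v = 220.0 * breaker_amps  # 3300 W circuit capacity
wall_limit_110v = 110.0 * breaker_amps  # 1650 W circuit capacity
gpu_rail_watts = round(gpu_rail_volts * gpu_rail_amps, 2)  # 34.5 W on the core rail

print(wall_limit_220v, wall_limit_110v, gpu_rail_watts)
```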
===================
If you have done any research into the 2900 XT's power-supply circuitry, you will find that it has a six-phase VRM whose power ICs can deliver 40 amps per phase just for vGPU. That's a bloody 240 amps maximum in total! If the GPU weren't such a power hog, why would they waste expensive circuitry on it?
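As a sanity check on those VRM numbers (phase count and per-phase rating as stated above):

```python
# Maximum vGPU current = number of VRM phases * rating per phase.
phases = 6
amps_per_phase = 40.0  # per-phase rating quoted above

max_vgpu_amps = phases * amps_per_phase
print(max_vgpu_amps)  # prints 240.0
```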
Just wanted to express my gratitude to AMD and the devs working on these drivers for making my experience better. Oh god, does that sound like marketing blah? I hope not.
Power saving is one of the things I marked as important (among many others) in the Phoronix graphics survey. At the moment it is one of my main concerns, since I only use R600-based cards and the power-save/cool-down support isn't really complete yet.
Stop TCPA, stupid software patents and corrupt politicians!