This is a very interesting and useful feature, especially for the high-end big-ass GPUs. With my 2900XT, you don't only feel the speed, you can 'hear' it. If the fan doesn't rev up all the time with this power management code, that's a fantastic thing. I will be more than willing to test it on my 2900XT, since it is the worst 'heater' in the R6/7/8 family so far. However, until the GPU voltage can be changed dynamically, I don't think it will help the R600 cool down much.
In the case of my HD 34x0, engine downclocking already gave quite a nice result. I did not see much difference when downclocking memory. Not sure what voltage changing will bring; maybe not much difference in temperature, but some extra battery life?
I really hope to get engine downclocking committed for 2.6.33-rc1. Hopefully Alex will release the IRQ code before that, so I will be able to enable memory downclocking as well for 33-rc1.
heat ~ clock * voltage^2
So voltage should bring more benefit than clock adjustment.
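As a quick back-of-the-envelope check of that relation (a rough sketch with purely illustrative numbers, not measurements from any card): if dynamic power goes roughly as clock * voltage^2, a 20% voltage drop saves more than a 20% clock drop, and doing both compounds.

#include <stdio.h>

/* Illustration of dynamic power scaling, P ~ f * V^2.
 * The 20% reduction figures are made up purely for comparison. */
int main(void)
{
    double base      = 1.0 * 1.0 * 1.0;   /* nominal clock and voltage */
    double clk_down  = 0.8 * 1.0 * 1.0;   /* clock reduced by 20%      */
    double volt_down = 1.0 * 0.8 * 0.8;   /* voltage reduced by 20%    */
    double both_down = 0.8 * 0.8 * 0.8;   /* both reduced by 20%       */

    printf("-20%% clock          : %.1f%% of baseline power\n", 100.0 * clk_down  / base);
    printf("-20%% voltage        : %.1f%% of baseline power\n", 100.0 * volt_down / base);
    printf("-20%% clock + voltage: %.1f%% of baseline power\n", 100.0 * both_down / base);
    return 0;
}

That prints 80%, 64% and 51.2% of baseline, which is the intuition behind "voltage should bring more benefit than clock adjustment."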
Not sure about the formula (power tends to follow frequency squared as well) but voltage definitely makes a difference. Memory clock usually needs to be dropped before you can reduce the voltage, and the code to drop memory clocks is tricky because it needs to be sync'ed with display refresh to avoid glitching the display.
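To make the ordering concrete, here is a minimal sketch of what vblank-synchronized memory reclocking looks like. Every function name below is a hypothetical stand-in, not the actual radeon driver API; it only illustrates the sequencing described above.

#include <stdio.h>

/* Hypothetical sketch of vblank-synchronized memory reclocking.
 * These helpers are stubs, NOT the real radeon code. */

static void wait_for_vblank(int crtc)   /* stub: pretend to block until vblank */
{
    printf("waiting for vblank on crtc %d\n", crtc);
}

static void program_mclk(unsigned khz)  /* stub: pretend to write the new memory clock */
{
    printf("memory clock set to %u kHz\n", khz);
}

static void set_memory_clock(unsigned khz)
{
    /* The clock change has to land inside the blanking interval, while
     * scanout is not reading from VRAM; changing it mid-frame is what
     * causes the flicker described in this thread.  With several active
     * displays the blanking intervals generally do not line up, which is
     * part of why this is tricky. */
    wait_for_vblank(0);
    program_mclk(khz);
}

int main(void)
{
    set_memory_clock(200000);   /* illustrative low memory clock */
    return 0;
}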
Yep. Some of the review folks were away this week but will be back Monday; hopefully we can get the IRQ IP released before all the US folks take off for Thanksgiving.
Nothing that sounds hard to implement if you just let Alex release the code
I hope you can pull it off; what I've read is that AMD has had some problems with downclocking GDDR5 memory in the past:
Even though ATI has implemented a 2D/3D clock switching model, the power consumption at idle is still quite high. One reason for that is that only the core frequency is reduced, while the memory keeps running at full speed the whole time. During testing I noticed that any memory frequency change makes the screen flicker, which is probably the reason why AMD chose not to allow dynamic memory clock changes. I am surprised that AMD has not fixed this problem in their second GDDR5-capable GPU.
Today AMD released the world's first GPU produced on a 40 nm process. The HD 4770 is aggressively priced at around $100 and offers great performance for your hard-earned dollars. In our testing we found that the card performs almost on par with the HD 4850. With the amazing 30%+ memory overclock we got on our sample, the HD 4850 will easily be surpassed.