I'd say this is a good start, but I think that the real benefit will come with lowered voltages.
On a related note, is there any way at all to lower the GPU voltage through software in Ubuntu? I have to use fglrx because I need ATI PowerPlay to keep my laptop from getting too hot under normal use. This forces me to stay on the old Ubuntu LTS, since my Mobility X700 is no longer supported. I would love to upgrade to Ubuntu 10.04.
A couple of quick notes regarding the current PM state:
- Voltage drop is already implemented for r1xx-r5xx chips in the new PM code
- Some additional ASIC features (like dynamic sclk and dynamic voltage, where the ASIC scales the clocks/voltage rather than the driver doing it) are also already implemented in the driver for r1xx-r5xx ASICs. Whether or not these get enabled depends on the flags in the power state entry.
- Most r6xx/r7xx desktop cards have power-saving modes
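For anyone who wants to try the new PM code, the radeon KMS driver of that era exposes it through sysfs. A minimal sketch, assuming your Radeon is `card0` and the `power_method`/`power_profile` files are present (they only exist with the open radeon driver, not fglrx):

```shell
# Sysfs PM interface of the open radeon KMS driver (kernel 2.6.35-era).
# Assumes the Radeon is card0; writing these files needs root.
CARD=/sys/class/drm/card0/device

if [ -w "$CARD/power_method" ]; then
    # "profile" picks a fixed power state; "dynpm" reclocks based on load.
    echo profile > "$CARD/power_method"
    # Profiles typically include: default, auto, low, mid, high
    # (which ones do anything depends on the ASIC's power state table).
    echo low > "$CARD/power_profile"
    current=$(cat "$CARD/power_profile")
else
    current="unavailable (radeon KMS PM sysfs files not present)"
fi
echo "power profile: $current"
```

Whether the "low" profile actually drops the voltage (rather than just the clocks) depends on the flags in the card's power state entries, as noted above.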
Hi! I wrote in another thread (http://www.phoronix.com/forums/showt...t=23447&page=6) about my experience with the power-consumption improvements, and only afterwards remembered this article. I know the ThinkPad is frugal with power, but the consumption measured here is a lot less than on my HP NC8430, so I'd like to see where this test computer lands with fglrx on Lenny or Ubuntu Hardy, plus on Windows, just to compare. At the moment I'm really satisfied with the power management's temperature control, but not so satisfied with the power consumption, which is practically double that of Windows.
So +1 for the comparison request with fglrx (and Windows, if possible).
I was wondering, is there any way to volunteer to help move enhancements for 5xxx mobile GPUs forward?
I have an Acer laptop (manu # LX PMB02 171) with an ATI HD5650 GPU, and the battery seems to drain quite fast. I get the feeling this is largely because my GPU is running at full speed the entire time.
The CPU-monitoring apps show the CPU speed dropping back to 1.2 GHz, which is what I'd expect with an i5.
So, given the comment above about the lack of access to a laptop with these cards, is it possible to volunteer to test?
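One way to check whether the GPU really is stuck at full speed: the open radeon driver (not fglrx) reports the current clocks through debugfs. A hedged sketch, assuming debugfs is mounted and the Radeon is DRI device 0:

```shell
# Current GPU clocks via the open radeon driver's debugfs file.
# Reading it usually requires root; fglrx does not provide this file.
PM_INFO=/sys/kernel/debug/dri/0/radeon_pm_info

if [ -r "$PM_INFO" ]; then
    # Shows the current engine (sclk) and memory (mclk) clocks,
    # and on some ASICs the voltage.
    cat "$PM_INFO"
    status="read"
else
    status="not readable (fglrx in use, debugfs not mounted, or no root)"
fi
echo "radeon_pm_info: $status"
```

If the reported sclk never drops from its maximum, the card is indeed running flat out regardless of load.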
If you are very lucky you can disable the ATI card in the BIOS and use the GPU integrated in the i5. Then you can even decode H.264 (up to L5.1) with VA-API when you run this script on a distro with a 2.6.35 kernel and libdrm 2.4.21 (like Ubuntu Maverick; Debian Squeeze could be partially upgraded too).
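A quick way to verify those prerequisites and the resulting VA-API support on your own box (a sketch; `vainfo` ships in the `vainfo`/`libva-utils` package, and the libdrm version is queried via pkg-config where the dev files are installed):

```shell
# Check the kernel and libdrm versions mentioned above, then ask VA-API
# which decode profiles the driver actually exposes.
kernel=$(uname -r)
echo "kernel: $kernel"

if command -v pkg-config >/dev/null 2>&1 && pkg-config --exists libdrm; then
    echo "libdrm: $(pkg-config --modversion libdrm)"
else
    echo "libdrm: version not available via pkg-config"
fi

if command -v vainfo >/dev/null 2>&1; then
    # H264 entries in the profile list confirm hardware H.264 decode.
    vainfo 2>/dev/null | grep -i h264 || echo "no H.264 VA-API profiles reported"
else
    echo "vainfo not installed (package vainfo / libva-utils)"
fi
```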
The 4550 is what I use for testing XvBA; whenever I swap it for a 5670 I only see artifacts. Penumbra is not a really demanding game. I like to compare Unigine Heaven on Windows and Linux, and for that I need an HD 5xxx card. I do not play games on my ATI test box; I have a system with an NV 8800 GTS 512 if needed, though sadly no GTX 460 (or GTS 450).