Power Management: ATI Catalyst vs. Open-Source ATI Driver

  • #21
    Originally posted by Melcar View Post
    Well, radeon on my laptop (200M) reduces battery life nearly by 1/3 (45min to 30min). If I load up DynamicClocks I gain around 10min.
    This is my experience as well from running an X1400 on a Thinkpad and various distros. Clocking down fglrx has always given me much better battery life than using the defaults in radeon, and the laptop is certainly cooler as well.

    What relevant and working radeon options are actually possible to specify in xorg.conf to get better battery life? The docs are not really clear on this.
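
    For the radeon DDX of that vintage, the one power-related option I know of is DynamicClocks. A minimal sketch of a Device section (the Identifier here is made up, and chip support varies; check man radeon for your driver version):

    ```
    Section "Device"
        Identifier "ATI Radeon Mobility"
        Driver     "radeon"
        # Re-clocks the GPU down when idle; support varies by chip generation.
        Option     "DynamicClocks" "on"
    EndSection
    ```

    Whether this actually saves measurable power depends on the chip, which is part of what this thread is arguing about.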



    • #22
      Originally posted by shamooki View Post
      You sound like you just took a statistics class. Through my work I often witness people who are trained in statistics accusing those who acquire data of misusing statistics. While it's good to point out weak points in an analysis, I strongly believe criticism should be constructive.

      To be precise: How would you quantify the uncertainty of the measurements?
      You got me there. I study mathematics, and I have a course in statistics and time series analysis behind me.

      It is not possible to quantify the uncertainty; it has to be measured or known. Yes, you measure the uncertainty with equipment that you know is more precise than the one whose uncertainty you want to determine.

      As for this test with power measurements, I don't know how they were measured or with what equipment, but without that information the test is useless and doesn't tell you anything.

      If this is not possible, as is sometimes the case, you have to do more than 28 measurements for each point. Why 28? Because then you are in the 95% confidence interval.

      Which means that there is a 95% chance that what you have measured are the right values.

      100% is impossible; the formula doesn't allow it.
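
      As a rough sketch of what such an interval looks like in practice (this uses the standard normal approximation rather than any particular formula from this thread, and the readings are invented):

      ```python
      import math
      import random
      import statistics

      def confidence_interval_95(samples):
          """Return (mean, half_width) of an approximate 95% confidence interval.

          Uses the normal approximation, which is only reasonable for roughly
          n >= 30 samples; for fewer samples a t-distribution critical value
          should replace 1.96.
          """
          n = len(samples)
          mean = statistics.fmean(samples)
          sem = statistics.stdev(samples) / math.sqrt(n)  # standard error of the mean
          return mean, 1.96 * sem

      # Simulated power readings in watts, with made-up measurement noise.
      random.seed(42)
      readings = [19.0 + random.gauss(0, 0.5) for _ in range(30)]
      mean, half = confidence_interval_95(readings)
      print(f"{mean:.2f} W +/- {half:.2f} W")
      ```

      The half-width shrinks with the square root of the sample count, which is why more measurements per point tighten the interval.
      
      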


      The Phoronix Test Suite is worth nothing (sorry to say so) to anyone with an engineering background. The results don't tell a damn thing.



      • #23
        Originally posted by Louise View Post
        As for this test with power measurements, I don't know how they were measured or with what equipment, but without that information the test is useless and doesn't tell you anything.
        The results were directly read by the Phoronix Test Suite off of the exposed ACPI sensors on the laptop.
        Michael Larabel
        https://www.michaellarabel.com/
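
        For reference, on reasonably modern kernels a battery's instantaneous power draw is exposed through sysfs. A small sketch of reading it (the exact path varies per machine, and older kernels exposed /proc/acpi/battery instead):

        ```python
        from pathlib import Path

        def read_power_watts(sysfs_file="/sys/class/power_supply/BAT0/power_now"):
            """Return instantaneous battery discharge power in watts.

            The sysfs node reports microwatts; the path and its availability
            vary by machine and kernel version.
            """
            return int(Path(sysfs_file).read_text()) / 1_000_000
        ```

        Polling a node like this during a benchmark run is the kind of ACPI-based measurement being discussed; its accuracy is whatever the laptop's embedded controller provides.
        
        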



        • #24
          Originally posted by Michael View Post
          The results were directly read by the Phoronix Test Suite off of the exposed ACPI sensors on the laptop.
          Okay, this means that if you do the measurement again on another laptop you will of course get different results.

          As you know, components in laptops are the cheapest components they can get away with in order to lower cost.

          The best you can do is to open the laptop and find the components that actually do the measurement and look in their data sheets.

          Or try and ask on an electronics newsgroup how accurate they would guess such components in laptops are.

          I wouldn't be surprised if each measurement has a +/- 10% accuracy.



          • #25
            Originally posted by Michael View Post
            The results were directly read by the Phoronix Test Suite off of the exposed ACPI sensors on the laptop.
            Here is just a quick plot I found with Google.

            On each of these plots you can draw any curve you like as long as you stay within the error bars, and any curve you draw will be consistent with the measurements.



            • #26
              I guess there are two kinds of accuracy that need to be considered.

              One is absolute accuracy; if the test says 19 watts then what is the likely range of "true" values ?

              One is the relative accuracy; if you measure *exactly* the same thing 10 times in a row what distribution of results do you get ? Put differently, if you *compare* two sets of measurements with the same equipment how valid is the comparison ?

              For an article like this (where the goal is to establish differential readings for the same system and same tools with different drivers) the second form of accuracy is probably more relevant, i.e. absolute accuracy (calibration etc.) doesn't matter as long as two sets of measurements closely related in time give similar results.

              Louise, does that make sense ? If so, what information would you use to determine the size of the error bars ?
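
              One way to illustrate the point about relative accuracy: a constant calibration offset cancels out when you compare two runs taken with the same sensor. A toy simulation (all numbers invented):

              ```python
              import random
              import statistics

              random.seed(0)
              BIAS = 1.5            # unknown constant calibration offset in watts (hypothetical)
              true_a, true_b = 19.0, 17.0   # "true" power draw under driver A and driver B

              # Both runs go through the same biased sensor with some noise.
              run_a = [true_a + BIAS + random.gauss(0, 0.3) for _ in range(30)]
              run_b = [true_b + BIAS + random.gauss(0, 0.3) for _ in range(30)]

              # The bias cancels in the difference of means, even though every
              # individual reading is off by 1.5 W in absolute terms.
              measured_diff = statistics.fmean(run_a) - statistics.fmean(run_b)
              print(f"measured difference: {measured_diff:.2f} W (true: {true_a - true_b:.2f} W)")
              ```

              So poor absolute calibration still permits a valid driver-vs-driver comparison, provided the bias stays constant between the two runs.
              
              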
              Last edited by bridgman; 06 March 2009, 10:45 PM.



              • #27
                Originally posted by bridgman View Post
                I guess there are two kinds of accuracy that need to be considered.

                One is absolute accuracy; if the test says 19 watts then what is the likely range of "true" values ?

                One is the relative accuracy; if you measure *exactly* the same thing 10 times in a row what distribution of results do you get ? Put differently, if you *compare* two sets of measurements with the same equipment how valid is the comparison ?

                For an article like this (where the goal is to establish differential readings for the same system and same tools with different drivers) the second form of accuracy is probably more relevant, i.e. absolute accuracy (calibration etc.) doesn't matter as long as two sets of measurements closely related in time give similar results.

                Louise, does that make sense ?
                It is only in discrete cases (things you can count) that it is possible to do exact measurements, e.g. how many cars passed a sensor in an hour.

                I don't understand why you would go with relative accuracy (distribution) for this test.

                Distribution is used when you want to predict values you haven't measured.

                Btw, the plots really need horizontal tick marks and units, so the number of measurements can be seen.



                • #28
                  This is something I've been meaning to ask for a while... does DynamicClocks actually do anything on my 9250? For that matter, does even manual clock adjustment have any effect on power draw?

                  Pretty futile to try considering the GPU's power consumption is nothing compared to my P4, but I was curious.



                  • #29
                    I've manually underclocked my R500 and R700, but the effect of clock speed was minimal. Underclocking the card's memory might give you a watt or two, but don't expect miracles. However, underclocking allows you to reduce the GPU voltage, which has a positive impact on power consumption (and noise!).

                    Personally, I've updated the BIOSes of both cards with custom voltage and clock tables. This works rather well, although YMMV.

                    Now, if only R700 wasn't so trigger happy... Scrolling a window or watching a (SD) video shouldn't launch the high-performance profile. Even underclocked, this card has more than enough juice for day to day use, no need to put the memory to 2GHz every time I move my mouse dammit!



                    • #30
                      Originally posted by BlackStar View Post
                      I've manually underclocked my R500 and R700, but the effect of clock speed was minimal. Underclocking the card's memory might give you a watt or two, but don't expect miracles. However, underclocking allows you to reduce the GPU voltage, which has a positive impact on power consumption (and noise!).

                      Why do the apps mess with clock speed to conserve power? You can't do it. The only way to really make a dent in the power usage of a video card or a CPU is to lower the voltage. If you don't lower the voltage when you drop the clock, you don't save any significant power. My CPU uses 32 watts at 1.1 V at its 2.5 GHz stock speed. It uses 22 watts max at 1.0 V at 2.2 GHz with loads of front-side-bus and memory speed; that's at a 275 MHz FSB. Dropping to 2 GHz, a 250 MHz FSB, and 0.95 V saves another 4 watts. It'll run under Windows at 0.925 V but not under Linux.

                      Trimming 0.2 V off a 7600 GS drops power usage from 24 watts to 16 watts, but you have to lose 50 MHz for stability. Not dropping the voltage and losing 50 MHz saves 1 watt.

                      Most GPUs and CPUs with good voltage regulation on the board will drop 0.1 V and save about 15 percent on power without touching clocks. You can't save 15 percent with any of these CPU or GPU clock schemes unless you have a bizarre system with almost no interrupt activity that really sleeps a good amount of the time. CPUs and GPUs are going to have to give up voltage control to the BIOS and the system's ACPI calls.
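
                      The quadratic effect of voltage falls out of the usual dynamic-power approximation P ≈ C·f·V², which ignores static leakage, so real savings will differ. Plugging in the clocks and voltages from the post above (the capacitance term is arbitrary):

                      ```python
                      def dynamic_power(freq_mhz, volts, c=1.0):
                          """Dynamic CMOS power model: P = C * f * V^2 (relative units only)."""
                          return c * freq_mhz * volts ** 2

                      base = dynamic_power(2500, 1.1)          # 2.5 GHz at 1.1 V
                      undervolted = dynamic_power(2500, 1.0)   # drop voltage only
                      underclocked = dynamic_power(2200, 1.1)  # drop clock only
                      both = dynamic_power(2200, 1.0)          # drop both

                      print(f"undervolt saves  {100 * (1 - undervolted / base):.0f}%")
                      print(f"underclock saves {100 * (1 - underclocked / base):.0f}%")
                      print(f"both save        {100 * (1 - both / base):.0f}%")
                      ```

                      With these numbers the voltage drop alone buys about 17% while the clock drop alone buys 12%, which matches the post's point that undervolting is where the real savings are.
                      
                      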

                      PS. I've got a pure-DC desktop system, so if you can't achieve the same results as me, it's because a capacitor in a power supply is never as voltage-stable as 120 lbs of lead-acid batteries.
