
Further Testing Shows More Hope For ATI Gallium3D


  • #41
    Originally posted by H.i.M
    I haven't read about it yet, but I would be so happy to get the power consumption on my notebook reduced. Is there work in progress?
    The status matrix at http://wiki.x.org/wiki/RadeonFeature shows that it is quite complete, though one of the devs told us that PM (power management) is complicated since it reaches into every other function (flickering problems when reclocking memory, and so on). But I guess you should give it a shot.

    I should test it in the coming days. At the moment I still have fglrx on my HD3870, but over the next few days I can compare whole-system power usage with the free driver stack.
    (By the way, Catalyst on Windows XP vs. fglrx on Linux consistently shows about 5 W more whole-system usage on Windows. I wonder where that comes from; it might also be background programs.)


    CPU utilization during the tests?
    Vote +1 from me!



    • #42
      Originally posted by glisse
      LLVM is not adapted to GPUs, and I don't see it being useful. That being said, I am pretty sure shader optimization won't improve performance much in the benchmarks from this article.
      If shaders (particularly fragment shaders) aren't a problem, then how come raising the resolution makes it perform worse (compared to Catalyst)? My only guess is buffer management and tiling support, but could those make such a difference?

      My reasoning behind this is that the CPU should be idling more when you raise the render resolution, so it's probably not the bottleneck, but I don't know much about driver programming and could be wrong.

      Also, Michael, on the last page of the article you say:
      "The open-source Catalyst driver, however, still had an average frame-rate above 100 FPS and this newest was 75% the speed of the proprietary code from early 2009."
      Which is probably wrong, since Catalyst is the proprietary driver, not the open-source one.



      • #43
        Originally posted by Adarion
        The status matrix at http://wiki.x.org/wiki/RadeonFeature shows that it is quite complete, though one of the devs told us that PM (power management) is complicated since it reaches into every other function (flickering problems when reclocking memory, and so on). But I guess you should give it a shot.
        Are all the green-marked features available in Gallium3D? Because the headline says that this matrix is for: "radeon (xf86-video-ati) for 2D; radeon, r200, r300, r600 Mesa and r300, r600 Gallium drivers"

        The matrix makes no differentiation between them.

        It would be nice to see power consumption tests with different chips on Phoronix. Specifically, I am interested in the ThinkPad T4* and T6* series.

        Thanks,
        H.i.M



        • #44
          Originally posted by H.i.M
          Are all the green-marked features available in Gallium3D? Because the headline says that this matrix is for: "radeon (xf86-video-ati) for 2D; radeon, r200, r300, r600 Mesa and r300, r600 Gallium drivers"
          The Gallium3D/classic split is only relevant for the 3D part of the matrix, and r600c and r600g are basically at feature parity, so it all applies to r600g.

          Power saving is in the kernel. It works well, but is not as aggressive as the binary driver.



          • #45
            This is really great! The devs are really great.

            THANKS!

            And thanks to Michael, too, for the work and for keeping us informed.

            THANKS!



            • #46
              Originally posted by pingufunkybeat
              Powersaving is in the kernel. It works well, but is not as aggressive as the binary driver.
              Also note that the power-saving code needs to be manually enabled on most systems, following the instructions in RadeonFeature. The default on most systems is full power.



              • #47
                You're right, I keep forgetting that. As root, run either:

                echo dynpm > /sys/class/drm/card1/device/power_method

                or

                echo low > /sys/class/drm/card1/device/power_profile

                The first one enables dynamic reclocking; the second one will drop everything to the lowest power state.
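
                A quick way to see which method and profile are currently active is to read those same sysfs files back. This is only a sketch assuming the two files named above; the card number (card0, card1, ...) varies per system, so it loops over all cards rather than hard-coding one:

                ```shell
                #!/bin/sh
                # List the current power_method and power_profile for every DRM card,
                # skipping any card whose driver does not expose these files.
                for dev in /sys/class/drm/card*/device; do
                    [ -f "$dev/power_method" ] || continue
                    profile=$(cat "$dev/power_profile" 2>/dev/null)
                    echo "$dev: method=$(cat "$dev/power_method") profile=$profile"
                done
                ```

                The profile setting only matters when power_method is "profile", which is why the script prints both.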



                • #48
                  Originally posted by pingufunkybeat
                  The first one is dynamic
                  The first one just doesn't work.
                  ## VGA ##
                  AMD: X1950XTX, HD3870, HD5870
                  Intel: GMA45, HD3000 (Core i5 2500K)



                  • #49
                    Dynamic power management does NOT work with multiple monitors (the documentation says so).

                    Would be REALLY nice if it did... Does anybody actually use only one monitor?



                    • #50
                      It doesn't work even with one monitor... too often it "forgets" to undervolt the GPU or to raise the frequencies when you need them.

