Linux 3.5 Can Massively Boost AMD Radeon Graphics

  • #31
    Originally posted by Wyatt View Post
    Ah, I see. Thanks for clearing that up, though I really wish the article had addressed this in the first place.

    Thanks; I'm just speculating, though. My personal dream from these changes: a killer APU with OpenCL for my Gentoo box. My Phenom II X6 eats emerge @system on a budget.

    ~Jux

    "CFLAGS=-march=native -Os -pipe -ggdb"

    • #32
      Originally posted by crazycheese View Post
      That's a circular problem: if the driver is bad, no one will buy the GDDR-equipped card. Something has to come first, presumably from the development side.
      Those cards are not affected by these commits. There are certainly more tweaks that can be enabled to improve performance on them, as I mentioned in another reply, but these particular commits will not help them. Also, GDDR cards will always outperform DDR cards, so if you want better performance from a 5450 or 6450 (regardless of driver improvements), get the GDDR version.

      • #33
        Originally posted by evolution View Post
        Furthermore, I'm also expecting the "default" profile to stop always using the maximum GPU frequency, because that shortens the GPU's lifetime. (Nouveau does the opposite, btw...)
        That's a fucking stupid concern: my HD5870 runs at full OpenCL load 24/7. Maybe it will die in 15 years instead of 20, but who cares? People keep saying the same thing about CPUs too; I have an old P4 that has been overclocked for 12 years and that I still use daily.
        ## VGA ##
        AMD: X1950XTX, HD3870, HD5870
        Intel: GMA45, HD3000 (Core i5 2500K)

        • #34
          Originally posted by evolution View Post
          Furthermore, I'm also expecting the "default" profile to stop always using the maximum GPU frequency, because that shortens the GPU's lifetime. (Nouveau does the opposite, btw...)
          The "default" profile uses the settings marked as "default" in the VBIOS. Some cards have high default settings in the VBIOS, others have low default settings.

          • #35
            The default profile uses the clocks set by the VBIOS at boot, i.e., the clocks in effect before the OS loads. As Bridgman noted, on some boards these are higher, on others lower; whether they match the high clocks varies from board to board. Nouveau does the same thing; the only difference is that nvidia tends to set the boot-up clocks lower.
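
            (For reference, a minimal sketch of how these profiles can be inspected and switched through the radeon sysfs interface on kernels of this era; the card0 path assumes a single GPU:)

              # Check which power management method is active ("profile" or "dynpm")
              cat /sys/class/drm/card0/device/power_method

              # Select the profile-based method, then pick a profile:
              # "default" = VBIOS boot clocks, "low"/"mid"/"high" = fixed levels,
              # "auto" = low on battery, high on AC
              echo profile > /sys/class/drm/card0/device/power_method
              echo low > /sys/class/drm/card0/device/power_profile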

            • #36
              By the way, just the usual reminder:
              Michael, please stop benchmarking APUs with low clocks.
              ## VGA ##
              AMD: X1950XTX, HD3870, HD5870
              Intel: GMA45, HD3000 (Core i5 2500K)

              • #37
                Originally posted by evolution View Post
                Well, in contrast to the hype I've seen in the article, I'll take a more "conservative" approach: when 3.5 stable arrives, I'll give the FOSS ATI drivers another try (I went back to Catalyst because of VA-API and proper PM). From what I've seen, we'll have better 3D performance from now on (but does that apply to r600/r700 cards? I didn't see any card of that generation tested in the article...).

                Furthermore, I'm also expecting the "default" profile to stop always using the maximum GPU frequency, because that shortens the GPU's lifetime. (Nouveau does the opposite, btw...)

                Finally, in the medium/long run, it'd be nice to have H.264 VDPAU/VA-API/UVD acceleration. That would be nice for those who still have weak CPUs (e.g. AMD E-350s, low-end Llanos, or Nehalem/Core 2 Duo machines).

                Cheers
                Unlike with nvidia (google "bumpgate"), running at maximum clock all the time has no influence on the lifetime of the GPU.

                It might have an influence on the power circuitry... but you said GPU.

                • #38
                  Originally posted by agd5f View Post
                  Also, GDDR cards will always outperform DDR cards, so if you want better performance from a 5450 or 6450 (regardless of driver improvements), get the GDDR version.
                  The GDDR versions also draw 5-15 W more power, hint hint.

                  • #39
                    Originally posted by energyman View Post
                    Unlike with nvidia (google "bumpgate"), running at maximum clock all the time has no influence on the lifetime of the GPU.

                    It might have an influence on the power circuitry... but you said GPU.
                    At the very least it produces a lot of heat, and plenty of cooler noise on any reasonably powerful GPU, which isn't good either. Why not enable something like dynpm by default? CPUs have scaled their frequencies by default almost everywhere for ages; maybe it's time GPUs did the same, especially the powerful ones.

                    • #40
                      Originally posted by 0xBADCODE View Post
                      At the very least it produces a lot of heat, and plenty of cooler noise on any reasonably powerful GPU, which isn't good either. Why not enable something like dynpm by default? CPUs have scaled their frequencies by default almost everywhere for ages; maybe it's time GPUs did the same, especially the powerful ones.
                      Because it's still buggy, and someone has to fix it.
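
                      (For anyone who wants to experiment despite the bugs, a minimal sketch of opting in to dynamic reclocking at runtime; as above, the card0 path assumes a single GPU, and dynpm of this era is known to misbehave, e.g. flicker, on some setups:)

                        # Switch from fixed profiles to load-based dynamic reclocking
                        echo dynpm > /sys/class/drm/card0/device/power_method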
