Here's Why Radeon Graphics Are Faster On Linux 3.12


  • Originally posted by marek View Post
    It's not a default setting in the kernel. It's a setting forced by a startup script in Ubuntu and you can't turn it off (unless you delete the script manually).
    It isn't? I know the default cpufreq governor is set to ondemand in the openSUSE kernel. But I'm not sure what it is when you do a make defconfig (I don't have the kernel sources on this PC). The fallback in case ondemand isn't built is performance, though.



    • The default governor is performance:
      Code:
      choice
              prompt "Default CPUFreq governor"
              default CPU_FREQ_DEFAULT_GOV_USERSPACE if ARM_SA1100_CPUFREQ || ARM_SA1110_CPUFREQ
              default CPU_FREQ_DEFAULT_GOV_PERFORMANCE
              help
                This option sets which CPUFreq governor shall be loaded at
                startup. If in doubt, select 'performance'.
      Last edited by JS987; 10-16-2013, 01:56 PM.
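
      Whatever the compiled-in default, the governor actually in effect can be checked at runtime through sysfs. A minimal sketch, assuming the standard cpufreq sysfs layout (availability depends on kernel config and hardware, so it falls back gracefully):

      ```shell
      # Read cpu0's current cpufreq governor from sysfs; the path is
      # standard, but cpufreq may be absent (e.g. in a VM or container).
      gov_path=/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
      if [ -r "$gov_path" ]; then
          gov=$(cat "$gov_path")
      else
          gov="unavailable"
      fi
      echo "cpu0 governor: $gov"
      ```

      The sibling file `scaling_available_governors` lists the governors the running kernel can switch to.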



      • What my experiments show

        Originally posted by s_j_newbury View Post
        Luke, while technically a graphically very simple GPU-accelerated game might well be non-CPU-limited at the extreme, I'm not sure what you're trying to measure. It could well be that you're hitting the GPU fill-rate limit, exceeding the internal bandwidth of the card, and at such high frame rates I can't see how it's meaningful.
        My experiments show that the Linux 3.12 kernel only partially reduced the effect of the governor setting on AMD Bulldozer and fixed the problem (at least in this one case) on Phenom II. All of my other games peg the CPU at continuous full speed, taking the governor out of the picture. If anything about the card were the issue, I would expect the CPU governor to be out of the picture as well. Strangely, the smaller HD5570 on the Phenom II at 3.7 GHz gets about 1/7 more fps than the bigger HD6750 on the Bulldozer at 4.4 GHz.
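
        Taking the governor out of the picture can also be done explicitly before a benchmark run. A minimal sketch, assuming the standard cpufreq sysfs paths; since the writes need root, this version only prints the commands it would run (pipe it through `sudo sh` to apply them):

        ```shell
        # Print (rather than execute) the writes that would pin every
        # core's governor to "performance" for a benchmarking session.
        target=performance
        for g in /sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor; do
            [ -e "$g" ] || continue   # glob may not match if cpufreq is absent
            echo "echo $target > $g"
        done
        ```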

        Until rather recently, Critter ran rough on Mesa drivers, dropping frames and visibly stuttering unless the framerate never went under about 300 fps. Frames would be dropped in strings, and a high framerate kept down the time covered by an entire string of missing or duplicate frames. Both the Radeon and Nouveau drivers were affected; I'm pretty sure Intel was too, though it's been a while since I've tried to play Critter on the netbook. Nouveau, unable to reclock the GPU, had trouble getting a GTX450 over about 250 fps at 1080p, at which point the stuttering was very noticeable. A few months ago, another Nouveau test found the roughness in Critter greatly reduced, and the highest framerates in Scorched3D up to around 30 fps from around 18. About all I use Nvidia hardware for these days is to see how Nouveau is doing from time to time.

        I thought, and still think, it is NUTS that a game whose progenitors ran on video arcade machines 30 years ago would have issues running on ANY driver on ANY graphics card produced since whatever version of OpenGL it was written for came out, but it does, or at least did, due to the dropped-frame roughness issue. I would have expected the CPU and GPU to sit at stone-cold idle playing a Galaxian-style game, but due to how it was written this is far from the case. In fact, nothing else I have (game or otherwise) generates more heat in my graphics cards, as Scorched3D and 0ad are both so CPU-limited that the graphics card seems to be of reduced importance. For that reason, I use both Critter and Scorched3D to benchmark graphics drivers in my machines with games I actually play. Critter is more repeatable and works the smaller HD5570 as hot as it does an Nvidia GT520 on its blob; Scorched3D is much more graphically demanding but dislikes the sb backend that seems to help a bit in 0ad. 0ad is so CPU-bound that even 3- or 4-way SLI on Nvidia's blob would probably show little or no benefit, while finding a way to run the AI in one instance of the game connected to the player's game as a multiplayer session could possibly double framerates when a few hundred characters are on the board, and do so even on the ATI HD5570.

        Sorry this is not from a test suite written for games I do not have and cannot easily download due to bandwidth issues; I just try to add whatever information I do get to the total body of information that is out there. I'm using some less common hardware, so I figure the results might help somebody.



        • Originally posted by Luke View Post
          Sorry this is not from a test suite written for games I do not have and cannot easily download due to bandwidth issues; I just try to add whatever information I do get to the total body of information that is out there. I'm using some less common hardware, so I figure the results might help somebody.
          Well, it's good to have it confirmed that there's also a difference on AMD CPUs.

