Here's Why Radeon Graphics Are Faster On Linux 3.12


  • Originally posted by Ericg View Post
    On any OTHER distro (not *Buntu based) Yuuuuuuup. But Canonical hasn't gotten their heads out of their asses and switched yet. Someone said it was because of bugs in the driver, but I'm finding it hard to believe that THEY are hitting bugs that no one else has.
    There was a bug in kernel 3.10 where the intel_pstate driver used the turbo frequency as the base performance frequency, which heated things up more than it should have. That issue was solved in 3.11, but Ubuntu hasn't re-enabled the driver just yet.
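    If you want to check how your own machine is behaving, here's a minimal sketch that dumps the relevant intel_pstate knobs (it assumes the standard /sys/devices/system/cpu/intel_pstate/ interface; the directory is absent when acpi-cpufreq is driving the CPU instead):
    Code:
    import os

    PSTATE_DIR = "/sys/devices/system/cpu/intel_pstate"

    def read_knob(name):
        # Return an intel_pstate sysfs value, or "n/a" if the
        # knob is missing (driver not loaded on this machine).
        try:
            with open(os.path.join(PSTATE_DIR, name)) as f:
                return f.read().strip()
        except OSError:
            return "n/a"

    # min/max_perf_pct bound the performance range as percentages;
    # no_turbo = 1 means turbo frequencies are excluded from it.
    for knob in ("min_perf_pct", "max_perf_pct", "no_turbo"):
        print(knob, "=", read_knob(knob))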

    • Originally posted by Michael View Post
      Then it should be fixed or the default setting changed within the kernel.
      Agreed! Everyone I know uses the default "ondemand" governor. While I'd like to use the 'performance' governor, the power consumption prevents me from doing so. Almost every user is going to be using 'ondemand'. A good user experience demands that things just work with the default settings. If that's not the case, then someone needs to change the defaults!

      • Originally posted by GreatEmerald View Post
        And will make your FPS dip to 30 if 60 can't be sustained, instead of just 59... No, you should have VSync on only for games you know will never dip below 60 (or whatever your refresh rate may be).
        Umm, no. It would probably be 58 fps, not 30. I know because I played NS2, and with vsync on the fps ranged from 45 to 60. Vsync just means that it won't display partially rendered frames: it will finish rendering the current frame, display it, then render the next frame, display it, and so on.
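        Whether you land at 30 or 58 depends mostly on buffering, not vsync itself. Here's a toy model (illustrative numbers only, not tied to any particular driver or game): with strict double buffering the swap blocks until a vblank, so a frame that takes even slightly longer than one refresh interval drops you to 30, while with triple buffering rendering never blocks and you get rates like 58:
        Code:
        import math

        REFRESH_HZ = 60.0
        VBLANK = 1.0 / REFRESH_HZ  # seconds per refresh interval

        def fps_double_buffered(render_time):
            # The swap blocks until the next vblank, so each frame
            # occupies a whole number of refresh intervals.
            intervals = math.ceil(render_time / VBLANK)
            return 1.0 / (intervals * VBLANK)

        def fps_triple_buffered(render_time):
            # Rendering never blocks on the swap; fps tracks
            # render time, capped at the refresh rate.
            return min(1.0 / render_time, REFRESH_HZ)

        for ms in (16.0, 17.2, 22.0):  # hypothetical frame times
            t = ms / 1000.0
            print("%.1f ms/frame: double=%.1f fps, triple=%.1f fps"
                  % (ms, fps_double_buffered(t), fps_triple_buffered(t)))
        So both observations can be right: a setup that only double-buffers pins you to integer divisors of the refresh rate, while one that triple-buffers does not.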

        • Originally posted by GreatEmerald View Post
          There was a bug in kernel 3.10 where the intel_pstate driver used the turbo frequency as the base performance frequency, which heated things up more than it should have. That issue was solved in 3.11, but Ubuntu hasn't re-enabled the driver just yet.
          Interesting that they hit such a bug; I didn't hit it on Arch. I wonder if it got pulled in when they were making their 'changes' to the kernel.
          All opinions are my own, not those of my employer, if you know who they are.

          • Originally posted by gururise View Post
            Agreed! Everyone I know uses the default "ondemand" governor. While I'd like to use the 'performance' governor, the power consumption prevents me from doing so. Almost every user is going to be using 'ondemand'. A good user experience demands that things just work with the default settings. If that's not the case, then someone needs to change the defaults!
            It's not a default setting in the kernel. It's a setting forced by a startup script in Ubuntu, and you can't turn it off (unless you delete the script manually). So this is obviously Canonical's fault. Too bad we didn't get benchmarks with multiple distributions; we'd at least see how bad Canonical is. Generally, "ondemand" is okay for most people, giving you approximately 75-80% of your CPU's performance. You'll never get 100% unless you use up all CPU cores (very unlikely for most people, even gamers). However, there are applications where ondemand is absolutely unacceptable: gaming, benchmarking, and computationally intensive applications where the running time matters (compiling too).
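            If you want to see (or work around) what that startup script did on a live system, here's a minimal sketch against the standard sysfs cpufreq interface (the per-CPU paths below are the usual layout; writing the governor requires root):
            Code:
            import glob
            import sys

            # Every online CPU exposes its current governor here.
            def governor_paths():
                return sorted(glob.glob(
                    "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor"))

            def show():
                for path in governor_paths():
                    with open(path) as f:
                        print(path, "->", f.read().strip())

            def set_all(governor):
                for path in governor_paths():
                    with open(path, "w") as f:  # needs root
                        f.write(governor)

            if len(sys.argv) > 1:
                set_all(sys.argv[1])  # e.g. "performance" or "ondemand"
            show()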

            • Originally posted by marek View Post
              It's not a default setting in the kernel. It's a setting forced by a startup script in Ubuntu, and you can't turn it off (unless you delete the script manually).
              It isn't? I know acpi_cpufreq is set to ondemand in the openSUSE kernel, but I'm not sure what it is after a make defconfig (I don't have kernel sources on this PC). The fallback, in case ondemand isn't built, is performance.

              • Default governor is performance.
                Code:
                choice
                        prompt "Default CPUFreq governor"
                        default CPU_FREQ_DEFAULT_GOV_USERSPACE if ARM_SA1100_CPUFREQ || ARM_SA1110_CPUFREQ
                        default CPU_FREQ_DEFAULT_GOV_PERFORMANCE
                        help
                          This option sets which CPUFreq governor shall be loaded at
                          startup. If in doubt, select 'performance'.
                Last edited by JS987; 16 October 2013, 01:56 PM.
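                To confirm what a particular distro kernel was actually built with, you can grep its shipped config; a sketch assuming the config is exposed via /proc/config.gz or installed under /boot (not every distro enables both):
                Code:
                import gzip
                import os

                def config_lines():
                    # Prefer the in-kernel copy if CONFIG_IKCONFIG_PROC is set.
                    if os.path.exists("/proc/config.gz"):
                        with gzip.open("/proc/config.gz", "rt") as f:
                            return f.readlines()
                    # Otherwise fall back to the file most distros install.
                    with open("/boot/config-" + os.uname().release) as f:
                        return f.readlines()

                for line in config_lines():
                    if line.startswith("CONFIG_CPU_FREQ_DEFAULT_GOV"):
                        print(line.rstrip())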

                • What my experiments show

                  Originally posted by s_j_newbury View Post
                  Luke, while technically a graphically very simple GPU-accelerated game might well be non-CPU-limited at the extreme, I'm not sure what you're trying to measure. It could well be that you're hitting the GPU fillrate limit, exceeding the internal bandwidth of the card, and at such high frame rates I can't see how it's meaningful.
                  What my experiments show is that the Linux 3.12 kernel only partially reduced the effect of the governor setting on AMD Bulldozer and fixed the problem (at least in this one case) on the Phenom II. All my other games peg the CPU at continuous full speed, which takes the governor out of the picture. If anything about the card were the issue, I would expect the CPU governor to be out of the picture here as well. Strangely, the smaller HD5570 on the Phenom II at 3.7 GHz gets about 1/7 more fps than the bigger HD6750 on Bulldozer at 4.4 GHz.

                  Until rather recently, Critter had issues with running rough on the Mesa drivers, dropping frames and visibly stuttering unless the framerate never went below about 300 fps. This was because frames would be dropped in strings, and a high framerate kept down the time covered by an entire string of missing or duplicated frames. Both the Radeon and Nouveau drivers were affected, and I'm pretty sure Intel was too, though it's been a while since I've tried to play Critter on the netbook. Nouveau, lacking the ability to reclock the GPU, had trouble getting a GTX450 over about 250 fps at 1080p, at which point the stuttering was very noticeable. A few months ago, another Nouveau test found the roughness in Critter greatly reduced, and the highest framerates in Scorched3d up around 30 fps from around 18. About all I use Nvidia hardware for these days is to see how Nouveau is doing from time to time.

                  I thought and still think it is NUTS that a game whose progenitors ran on arcade machines 30 years ago would have issues running on ANY driver with ANY graphics card produced since whatever version of OpenGL it was written for came out, but it does, or at least did, due to the dropped-frame roughness issue. I would have expected the CPU and GPU to sit at stone-cold idle playing a Galaxian-style game, but due to how it was written, this is far from the case.

                  In fact, nothing else I have (game or otherwise) will generate more heat in my graphics cards, as Scorched3d and 0ad are both so CPU-limited that the graphics card seems to matter less. Because of that, I use Critter and Scorched3d to benchmark graphics drivers on my machines with games I actually play. Critter is more repeatable and works the smaller HD5570 as hot as it does an Nvidia GT520 on its blob; Scorched3d is much more graphically demanding but dislikes the sb backend that seems to help a bit in 0ad. 0ad is so CPU-bound that even 3- or 4-way SLI on Nvidia's blob would probably show little or no benefit, while finding a way to run the AI in one instance of the game connected to the player's game as a multiplayer session would possibly double framerates when a few hundred characters are on the board, and do so even on the ATI HD5570.

                  Sorry this doesn't use a test suite, since those are written around games I do not have and cannot easily download due to bandwidth issues, but I just try to add what information I do get to the total body of information that is out there. I'm using some less common hardware, so I figure the results might help somebody.

                  • Originally posted by Luke View Post
                    Sorry this doesn't use a test suite, since those are written around games I do not have and cannot easily download due to bandwidth issues, but I just try to add what information I do get to the total body of information that is out there. I'm using some less common hardware, so I figure the results might help somebody.
                    Well, it's good to have it confirmed that there's also a difference on AMD CPUs.
