Likely Radeon Gallium3D Regression On Linux 3.14 + Mesa 10.2


  • #51
    I have an HD6750 and an HD5570 that have been tested

    Originally posted by asdfblah View Post
    Well, it was this thread: http://www.phoronix.com/forums/showt...untu-14-04-LTS
    I don't get it. Does it affect every ATI/AMD card, or just cards older than r600?
    In my tests there was no regression from the kernel update to 3.14, only the hyper-Z regression on the Mesa 10.1 to Mesa 10.2 transition, and the newest versions of Mesa 10.2 tolerate enabling hyper-Z in ~/.profile on both of the cards I have tested. The X regression is the worst of all, still present as of last week. When I get a version of X on the second machine (with the HD5570) that restores full performance, I will know that the problem has been found and fixed. Until then, X is pinned on my best machine.
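    For reference, what I mean by enabling hyper-Z in ~/.profile is roughly the line below (assuming R600_HYPERZ is still the environment variable the r600g driver checks; a sketch of my setup, not gospel):

        # in ~/.profile: force hyper-Z on for the r600g driver
        # (assumes the driver reads R600_HYPERZ; 1 = on, 0 = off)
        export R600_HYPERZ=1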

    The only games I have that impose a significant OpenGL load are Scorched3d, Critter (load due to high framerate) and 0ad (mostly CPU-limited except at the very start). I don't have the games usually benchmarked, so a kernel regression not affecting these games could be present and I would not see it.



    • #52
      Originally posted by Luke View Post
      The X regression is the worst of all, still present as of last week. When I get a version of X on the second machine (with the HD5570) that restores full performance, I will know that the problem has been found and fixed. Until then, X is pinned on my best machine.
      Can you bisect the xserver and identify what commit caused the regression?



      • #53
        Not likely

        Originally posted by agd5f View Post
        Can you bisect the xserver and identify what commit caused the regression?
        I've never done something like that, and I have only throttled cellular bandwidth, no landline connection.



        • #54
          Xorg is a poor graphics server. How long until Wayland!? Regressions are unnerving.



          • #55
            Originally posted by Luke View Post
            I've never done something like that, and I have only throttled cellular bandwidth, no landline connection.
            Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
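            For the record, the workflow is roughly the sketch below (the freedesktop.org clone URL and the 1.14.0/1.15.0 release tags as good/bad endpoints are my assumptions; adjust them to your case):

                # one-time clone; only this step needs the network
                git clone git://anongit.freedesktop.org/xorg/xserver
                cd xserver
                # everything from here on works offline
                git bisect start
                git bisect bad xorg-server-1.15.0    # first release with the slowdown
                git bisect good xorg-server-1.14.0   # last release known to be fast
                # build and test the commit git checks out, mark it with
                # "git bisect good" or "git bisect bad", and repeat until git
                # prints the first bad commit; "git bisect reset" when done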



            • #56
              I've never built X from source at all

              Originally posted by Ansla View Post
              Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
              I have never built X or Mesa from source; I run them from PPAs. I've never done anything with git other than download small source programs, never done a bisect of anything. If the only way this gets fixed is for me to learn to do a git bisect, dedicate a partition to building X, and install all the build dependencies and all the rest, it won't happen. I wonder if I should simply stop posting my reports of regressions here, as I am now getting repeated requests for things beyond my ability.



              • #57
                I've filed a bug report with Ubuntu/Launchpad against their version of X

                Originally posted by Ansla View Post
                Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
                I filed this bug report:

                With Xserver 1.15, OpenGL performance in all games is reduced by almost half, at least with the Radeon/r600g driver. Not sure which actual package has the issue. As long as hyper-Z is explicitly enabled, Mesa 10.2 (from the Oibaf PPA) does not change this regression. With hyper-Z disabled (the new default), another 10-20% reduction in performance resulted on top of an approximately 40% loss from the new X server. Critter still ran faster than the screen refresh rate, so it was not a matter of...


                For my own purposes it is enough to run the xserver used by Ubuntu 13.10 with current Mesa from the Oibaf PPA and the current kernel from a PPA. Canonical, however, needs to either get this fixed upstream, patch it themselves, or do as I am doing and reuse the older X server. Otherwise an LTS version of Ubuntu (14.04) will appear where proprietary drivers give no more performance than Mesa did a year ago, and Mesa performance is half what it used to be. Debian will get this too, as will any Steamboxes built on that version. I can't imagine they will be dumb enough to let this happen now that they have been made aware that the regression exists, though possibly they might miss it here on Phoronix.
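                If anyone wants to reuse the older X server the same way while taking Mesa and the kernel from PPAs, an apt pin does it; roughly the sketch below (the file name and the 2:1.14.* version glob for Saucy's xserver are my assumptions, so check your real version with apt-cache policy first):

                    # hold the 13.10-era X server while everything else updates
                    sudo tee /etc/apt/preferences.d/pin-xserver <<'EOF'
                    Package: xserver-xorg-core
                    Pin: version 2:1.14.*
                    Pin-Priority: 1001
                    EOF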

                This is a guess, but since the first version of X exhibiting the terrible performance was the first version to use DRI3, that might be related to the problem. On the other hand, going to a non-compositing window manager did not help at all; I don't know if that has any bearing on it.
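                If the DRI3 guess is right, there may be a quick way to test it without touching X at all: newer Mesa builds are supposed to honor a LIBGL_DRI3_DISABLE switch that forces the old DRI2 path (whether the PPA build honors it is an assumption on my part; no effect either way would itself be a data point):

                    # run the game with Mesa's DRI3 path disabled, falling back to DRI2
                    # (assumes this Mesa build honors LIBGL_DRI3_DISABLE)
                    LIBGL_DRI3_DISABLE=1 critter
                    # if the framerate comes back, DRI3 is implicated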
                Last edited by Luke; 17 March 2014, 03:03 PM.



                • #58
                  Updates from 3/17 Xorg benchmarking:

                  On 3-17-2014, I updated X to the latest Trusty packages (xserver-xorg-core_2:1.15.0-1ubuntu7_amd64.deb) and retested, running Mesa 10.2 with hyper-Z enabled. I will test today's Oibaf PPA updates to it as soon as they download (THAT is slow on this connection).

                  The Critter benchmark may not be of any great importance, as it is a 2D game in OpenGL that runs very fast, but the regression is 355 fps max instead of 690 fps max, on the order of a 49% drop in framerate. This is absolutely repeatable (on two different machines) by leaving only one opponent on the screen and intentionally permitting all shields to be destroyed.

                  The Scorched3d benchmark gave inconsistent results. With the previous X server I was getting 50-70 fps; with the new version I sometimes got 25-35 fps, but sometimes got right back to the 50-70 fps range, though the highest speeds did not appear as often as with the older version of X. Scorched3d can be a bit difficult to benchmark, as which map appears cannot be controlled.

                  In February I got nearly unplayable results in Scorched3d (11-25 fps), though some of that was a since-resolved hardware issue and some was the hyper-Z issue with the first versions of Mesa 10.2 installed at that time. There was no change at all in Critter on Radeon; I don't know whether these results will translate into regressions on OpenGL loads I do not have.

                  On my Intel Atom netbook, by comparison, Critter is barely playable due to dropped frames. When the new xserver came out, it was worse; I remember just over 60 fps but with worse framedropping than ever. Last night I got about 110 fps on the netbook with fewer dropped frames. My conclusion is that some progress might be being made somewhere, but I don't know which changes in which package are helping, if any.



                  • #59
                    Further tests with today's Mesa 10.2 (3-17-2014)

                    Originally posted by Luke View Post
                    I will test today's Oibaf PPA updates to it as soon as they download (THAT is slow on this connection).

                    It is getting hard to reliably benchmark Scorched3d due to varying loads, but it seemed to run a little faster than in the earlier test today. Still a bit inconsistent, with some screens showing 25-30 fps, but more of the screens are now at 50-70, and I saw 70 fps a bit more often.

                    Critter was interesting: little change in MAXIMUM framerate, but the minimum framerate is now at 75% of maximum, with few drops below 300 fps from 360, which is 83%. It used to be a maximum of 690 fps with big drops under load to about (sometimes below) 400 fps, all faster than now but much less steady, with the minimum at 55% or less of maximum. Again, this is not much of a benchmark due to the high framerates of a 2D game, but still interesting.

                    It seems to me that driver work in Mesa for DRI3 may be killing this bug one leg at a time. Anyone running more demanding games should see if the Critter slowdown translates into slowdowns on their games, however!



                    • #60
                      Luke: it might get fixed without a bisect, too, but a bisect would raise the chances by a large amount, because the devs would know exactly where the root of the problem is. Also, bisecting is extremely simple; after you've done it you'll laugh about how simple it was. All you need are three commands: the first to tell git that you want to bisect, then two commands to tell git whether the version it gave you is good or bad.
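                      Concretely, the whole loop is just this (the endpoints below are placeholders; substitute whatever versions you know are fast and slow):

                          git bisect start
                          git bisect bad                    # the checkout you have now is slow
                          git bisect good <last-fast-tag>   # placeholder for a known-fast release
                          # git checks out a commit in the middle; build it, test it, then:
                          git bisect good   # if this build is fast
                          git bisect bad    # if this build is slow
                          # repeat until git prints the first bad commit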

