Likely Radeon Gallium3D Regression On Linux 3.14 + Mesa 10.2


  • #41
    Michael and interested readers: it seems that going back to the Arch 3.13 default kernel and forcing HyperZ active gives me back full FPS as before, using Mesa git of course.

    Radeon 7770 2GB
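For anyone wanting to try the same override, the per-process form of the environment variable (the `R600_HYPERZ=1` switch used later in this thread for Scorched3d) keeps the change from leaking into the rest of the session. A minimal sketch of the mechanics, with a placeholder `echo` standing in for the actual game:

```shell
# Set R600_HYPERZ only for the one child process (r600g driver);
# the parent shell and the rest of the desktop keep the driver default.
# Substitute your game binary for the echo below.
R600_HYPERZ=1 sh -c 'echo "R600_HYPERZ=$R600_HYPERZ inside the game process"'
echo "R600_HYPERZ=${R600_HYPERZ:-unset} in the parent shell"
```

The `VAR=value command` form is plain POSIX shell, so it works the same from a terminal, a `.desktop` launcher's Exec line, or a wrapper script.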



    • #42
      Originally posted by smitty3268 View Post
      Hmm, are you sure this is actually the same problem? I mean it could easily be that there is a real regression in the r200 driver, and this test is just showing the hyperz change for r600g/radeonsi, right?

      Well, I hope it is the same, because then we might see it get fixed and have everyone's performance go back up.
      For the majority of tests in this article, yes, manually enabling HyperZ will just bring back the performance of enabled HyperZ, but not in all cases ;D... I assume that will not happen with the triangle test and maybe Prey.



      • #43
        The worst of the regressions was with the new X server; Mesa's was about 20%

        Originally posted by curaga View Post
        Michael, this is not completely from HyperZ and so still worth investigating. We have reports from Luke and dungeon on this forum that confirm there's another regression besides the intended hyperz change.

        Luke specifically tested it; dungeon's case is media apps that do not use the Z buffer.
        In my tests, I found that with Mesa 10.2/Kernel 3.13/Xserver 1.15, framerates were cut in half. Plymouth crashes with the early Linux 3.14 rc1 that I tried, blocking my disk decryption system, so I have yet to test the new kernel. With Mesa 10.2 but reverting to Xserver 1.14, I got back most of the regression but was still down 10-20%. That was the part that apparently turned out to be from hyper-z in my tests.

        I got essentially identical performance using Mesa 10.1 or Mesa 10.2 with hyper-z force-enabled in Critter (only one Z value, it's a 2D game), but in Scorched3d,

        R600_HYPERZ=1 scorched3d

        locked up the whole X server with Hyper-Z and Mesa 10.2, with either version of X. I checked the exact same code with Mesa 10.1; no problems at all in Scorched3d.

        I am guessing the issue with the new X server relates either to the DRI 3 changeover or to the rewrite of the GLX system referred to in the changelogs.



        • #44
          Originally posted by Luke View Post
          In my tests, I found that with Mesa 10.2/Kernel 3.13/Xserver 1.15, framerates were cut in half. Plymouth crashes with the early Linux 3.14 rc1 that I tried, blocking my disk decryption system, so I have yet to test the new kernel. With Mesa 10.2 but reverting to Xserver 1.14, I got back most of the regression but was still down 10-20%. That was the part that apparently turned out to be from hyper-z in my tests.

          I got essentially identical performance using Mesa 10.1 or Mesa 10.2 with hyper-z force-enabled in Critter (only one Z value, it's a 2D game), but in Scorched3d,

          R600_HYPERZ=1 scorched3d

          locked up the whole X server with Hyper-Z and Mesa 10.2, with either version of X. I checked the exact same code with Mesa 10.1; no problems at all in Scorched3d.

          I am guessing the issue with the new X server relates either to the DRI 3 changeover or to the rewrite of the GLX system referred to in the changelogs.
          You can do a git bisect to find out which commit introduced the regression.



          • #45
            A git bisect is beyond my skills

            Originally posted by AnAkIn View Post
            You can do a git bisect to find out which commit introduced the regression.
            Sorry about that, but I would not know the slightest thing about doing that. Surely someone out there does,
            given how many people use Mesa and X.



            • #46
              Originally posted by Luke View Post
              Sorry about that, but I would not know the slightest thing about doing that. Surely someone out there does,
              given how many people use Mesa and X.
              Here's an excellent how-to for using the git bisect commands to find the commit which caused the regression. Fairly simple, really:
              http://git-scm.com/book/en/Git-Tools-Debugging-with-Git



              • #47
                I've never even compiled Mesa from source

                Originally posted by andyprough View Post
                Here's an excellent how-to for using the git bisect commands to find the commit which caused the regression. Fairly simple, really:
                http://git-scm.com/book/en/Git-Tools-Debugging-with-Git
                Not simple from where I sit. I've compiled Audacity and Kdenlive from source, but never Mesa. I don't have the bandwidth at home to be pulling a lot of build dependencies, have to get system
                updates on the road, etc. It's not very likely I will be the one to do this, especially when Mesa 10.1 works just fine for me and taking on a big new project is not something I can do right now.
                Last edited by Luke; 03-03-2014, 10:51 PM.



                • #48
                  Originally posted by Luke View Post
                  Not simple from where I sit. I've compiled Audacity and Kdenlive from source, but never Mesa. I don't have the bandwidth at home to be pulling a lot of build dependencies, have to get system updates on the road, etc.
                  I have to agree that the Mesa git repository is quite bulky; it took me a few hours to successfully git clone it.
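One bandwidth-saving option: a shallow clone (`--depth 1`) fetches only the tip commit, skipping the history that makes up most of the download, and can be deepened later with `git fetch --unshallow`. Note that a bisect needs the full history, so this only helps for building the latest code. A self-contained sketch against a throwaway local repository (the temp directories and commit messages are invented for the demo; for Mesa you would substitute the freedesktop.org clone URL):

```shell
set -e
# Build a small throwaway "upstream" repo to clone from.
src=$(mktemp -d)
git -C "$src" init -q
git -C "$src" config user.email demo@example.com
git -C "$src" config user.name demo
for i in 1 2 3; do
    echo "$i" > "$src/f"
    git -C "$src" add f
    git -C "$src" commit -qm "commit $i"
done
work=$(mktemp -d)
# --depth 1 fetches only the newest commit.
git clone -q --depth 1 "file://$src" "$work/shallow"
git -C "$work/shallow" rev-list --count HEAD   # only the tip commit is present
# Deepen in place later, instead of re-cloning from scratch.
git -C "$work/shallow" fetch -q --unshallow
git -C "$work/shallow" rev-list --count HEAD   # full history now present
```

The `file://` URL is used here so the depth option takes effect even for a local source; a plain local path clone ignores `--depth`.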



                  • #49
                    Was this fixed in 3.14-rc6? I remember there was a bug report, but I can't find it :/
                    Last edited by asdfblah; 03-10-2014, 12:58 PM.



                    • #50
                      Regressions I found were not in the kernel

                      Originally posted by asdfblah View Post
                      was this fixed in 3.14-rc6? I remember there was a bug report, but I can't find it :/
                      I was able to roll back the regressions by rolling back Xorg and Mesa. Xorg was the worst offender; in Mesa, only the hyper-Z issue caused a slowdown, but Scorched3d locks up on the HD 6750 if you use hyper-Z with Mesa 10.2. It works fine with Mesa 10.1, and both work fine with hyper-Z on the Radeon HD 5570.

                      I first installed the 3.14 kernels after this point; at first I could not use them because of a GPU power management firmware bug. Last week a new linux-firmware package fixed that, and I found no performance regressions between the 3.13 and 3.14 kernels on my systems with the use cases I have. All the problems were in Mesa and X.

