
Thread: Likely Radeon Gallium3D Regression On Linux 3.14 + Mesa 10.2

  1. #41
    Join Date
    Jun 2009
    Posts
    1,134

    Default

    Michael and anyone interested: it seems that going back to the Arch 3.13 default kernel and force-enabling HyperZ gives me back full FPS as before, using Mesa git of course.

    Radeon 7770 2GB
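For reference, a minimal sketch of the per-run HyperZ forcing described above, using the r600g driver's R600_HYPERZ environment variable; the echo is a stand-in for a real GL application such as a game or benchmark:

```shell
# HyperZ can be toggled per-process via the r600g driver's R600_HYPERZ
# environment variable; the echo is a stand-in for a real GL application.
env R600_HYPERZ=1 sh -c 'echo "R600_HYPERZ=$R600_HYPERZ"'
# e.g.  R600_HYPERZ=1 scorched3d
```

Setting the variable only for the one command keeps the rest of the desktop on the driver default.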

  2. #42
    Join Date
    Feb 2008
    Posts
    928

    Default

    Quote Originally Posted by smitty3268 View Post
    Hmm, are you sure this is actually the same problem? I mean it could easily be that there is a real regression in the r200 driver, and this test is just showing the hyperz change for r600g/radeonsi, right?

    Well, I hope it is the same, because then we might see it get fixed and have everyone's performance go back up.
    For the majority of tests in this article, yes, manually enabling HyperZ will bring back the performance of enabled HyperZ, but not in all cases ;D... I assume that will not happen with the triangle test and maybe Prey.

  3. #43
    Join Date
    May 2013
    Posts
    533

    Default Worst of all regressions was with the new Xserver, Mesa was about 20%

    Quote Originally Posted by curaga View Post
    Michael, this is not completely from HyperZ and so still worth investigating. We have reports from Luke and dungeon on this forum that confirm there's another regression besides the intended hyperz change.

    Luke specifically tested it; dungeon's case involves media apps that do not use the Z buffer.
    In my tests, I found that with Mesa 10.2/Kernel 3.13/Xserver 1.15, framerates were cut in half. Plymouth crashes with the early Linux 3.14-rc1 that I tried, blocking my disk decryption system, so I have yet to test the new kernel. With Mesa 10.2 but reverting to Xserver 1.14, I got back most of the regression but was still down 10-20%. That was the part that apparently turned out to be from Hyper-Z in my tests.

    I got essentially identical performance using Mesa 10.1 or Mesa 10.2 with Hyper-Z force-enabled in Critter (only one Z value; it's a 2D game), but in Scorched3d,

    R600_HYPERZ=1 scorched3d

    locked up the whole X server with Hyper-Z and Mesa 10.2, with either version of X. I checked the exact same code with Mesa 10.1: no problems at all in Scorched3d.

    I am guessing the issue with the new X server relates either to the DRI 3 changeover or to the rewrite of the GLX system referred to in the changelogs.
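One hedged way to test the DRI 3 theory, assuming a Mesa build with DRI3 support, is to set Mesa's LIBGL_DRI3_DISABLE environment variable so that GLX falls back to DRI2, then re-run the same benchmark and compare:

```shell
# When LIBGL_DRI3_DISABLE is set, Mesa's GLX falls back to DRI2, which
# isolates the DRI3 changeover as a cause; the echo just confirms the setting.
export LIBGL_DRI3_DISABLE=1
echo "LIBGL_DRI3_DISABLE=$LIBGL_DRI3_DISABLE"
# then re-run the benchmark, e.g.  scorched3d
```

If the regression disappears with DRI3 disabled, that narrows the suspect list considerably.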

  4. #44
    Join Date
    Dec 2012
    Posts
    36

    Default

    Quote Originally Posted by Luke View Post
    In my tests, I found that with Mesa 10.2/Kernel 3.13/Xserver 1.15, framerates were cut in half. Plymouth crashes with the early Linux 3.14-rc1 that I tried, blocking my disk decryption system, so I have yet to test the new kernel. With Mesa 10.2 but reverting to Xserver 1.14, I got back most of the regression but was still down 10-20%. That was the part that apparently turned out to be from Hyper-Z in my tests.

    I got essentially identical performance using Mesa 10.1 or Mesa 10.2 with Hyper-Z force-enabled in Critter (only one Z value; it's a 2D game), but in Scorched3d,

    R600_HYPERZ=1 scorched3d

    locked up the whole X server with Hyper-Z and Mesa 10.2, with either version of X. I checked the exact same code with Mesa 10.1: no problems at all in Scorched3d.

    I am guessing the issue with the new X server relates either to the DRI 3 changeover or to the rewrite of the GLX system referred to in the changelogs.
    You can do a git bisect to find out which commit introduced the regression.

  5. #45
    Join Date
    May 2013
    Posts
    533

    Default A git bisect is beyond my skills

    Quote Originally Posted by AnAkIn View Post
    You can do a git bisect to find out which commit introduced the regression.
    Sorry about that, but I would not know the slightest thing about doing that. Surely someone out there does,
    given how many people use Mesa and X.

  6. #46
    Join Date
    Feb 2012
    Location
    Austin, TX, USA
    Posts
    43

    Default

    Quote Originally Posted by Luke View Post
    Sorry about that, but I would not know the slightest thing about doing that. Surely someone out there does,
    given how many people use Mesa and X.
    Here's an excellent how-to for using the git bisect commands to find the commit which caused the regression. Fairly simple, really:
    http://git-scm.com/book/en/Git-Tools-Debugging-with-Git

  7. #47
    Join Date
    May 2013
    Posts
    533

    Default I've never even compiled Mesa from source

    Quote Originally Posted by andyprough View Post
    Here's an excellent how-to for using the git bisect commands to find the commit which caused the regression. Fairly simple, really:
    http://git-scm.com/book/en/Git-Tools-Debugging-with-Git
    Not simple from where I sit. I've compiled Audacity and Kdenlive from source, but never Mesa. I don't have the bandwidth at home to be pulling a lot of build dependencies, and I have to get system
    updates on the road, etc. It's not very likely I will be the one to do this, especially when Mesa 10.1 works just fine for me and taking on a big new project is not something I can do right now.
    Last edited by Luke; 03-03-2014 at 10:51 PM.

  8. #48
    Join Date
    Oct 2013
    Posts
    195

    Default

    Quote Originally Posted by Luke View Post
    Not simple from where I sit. I've compiled Audacity and Kdenlive from source, but never Mesa. I don't have the bandwidth at home to be pulling a lot of build dependencies, and I have to get system updates on the road, etc.
    I have to agree that the Mesa git repository is quite bulky; it took me a few hours to successfully git clone it.
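For what it's worth, a shallow clone fetches only the newest commit, which cuts the download dramatically on limited bandwidth (though a full history is still needed for bisecting). A sketch against a throwaway local repository; the file:// transport is needed because git ignores --depth for plain local paths:

```shell
# Shallow-clone demo: only 1 of the source repo's 3 commits is fetched.
set -e
src=$(mktemp -d); dst=$(mktemp -d)
git -C "$src" init -q
git -C "$src" config user.email you@example.com
git -C "$src" config user.name "you"
for i in 1 2 3; do
  echo "$i" > "$src/f"
  git -C "$src" add f
  git -C "$src" commit -qm "c$i"
done
git clone -q --depth 1 "file://$src" "$dst/shallow"
git -C "$dst/shallow" rev-list --count HEAD   # prints 1: shallow history
```

Against the real Mesa repository the same `--depth 1` flag applies to the upstream URL instead of the file:// path.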

  9. #49
    Join Date
    Oct 2012
    Posts
    205

    Default

    Was this fixed in 3.14-rc6? I remember there was a bug report, but I can't find it :/
    Last edited by asdfblah; 03-10-2014 at 12:58 PM.

  10. #50
    Join Date
    May 2013
    Posts
    533

    Default Regressions I found were not in the kernel

    Quote Originally Posted by asdfblah View Post
    Was this fixed in 3.14-rc6? I remember there was a bug report, but I can't find it :/
    I was able to roll back the regressions by rolling back Xorg and Mesa. Xorg was the worst offender; in Mesa, only the Hyper-Z issue caused a slowdown, but Scorched3d locks up on the HD 6750 if you use Hyper-Z with Mesa 10.2. It works fine with Mesa 10.1, and both versions work fine with Hyper-Z on the Radeon HD 5570.

    I first installed the 3.14 kernels after this point; at first I could not use them because of a GPU power-management firmware bug. Last week a new Linux-firmware package fixed that, and I found no performance regressions between the 3.13 and 3.14 kernels on my systems with the use cases I have. All the problems were in Mesa and X.
