
Thread: Likely Radeon Gallium3D Regression On Linux 3.14 + Mesa 10.2

  1. #51
    Join Date
    Oct 2012
    Posts
    206

    Default

    Quote Originally Posted by Luke View Post
    I was able to roll back the regressions by rolling back Xorg and Mesa. Xorg was the worst offender; in Mesa only the hyper-Z issue caused a slowdown, but Scorched3d locks up on the HD6750 if you use hyper-Z with Mesa 10.2. It works fine with Mesa 10.1, and both versions work fine with hyper-Z on the Radeon HD5570.

    I first installed the 3.14 kernels after this point; at first I could not use them because of a GPU power management firmware bug. Last week a new linux-firmware package fixed that, and I found no performance regressions between the 3.13 and 3.14 kernels on my systems with the use cases I have. All the problems were in Mesa and X.
    Yes? I thought the problem was in the kernel...

  2. #52
    Join Date
    May 2013
    Posts
    533

    Default Not what I have seen

    Quote Originally Posted by asdfblah View Post
    Yes? I thought the problem was in the kernel...
    I've heard reports of a regression there but never seen it. The X regression is so bad I had to pin Xorg on the machine I use for games.
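
    For anyone wondering what pinning Xorg looks like in practice, below is a minimal sketch of an apt preferences entry that holds the xserver packages at an older version. The file path, package glob and version string are only examples; they would have to match whatever build you actually want to keep.

        # /etc/apt/preferences.d/pin-xorg   (example path)
        # Hold every xserver-xorg* package at the Ubuntu 13.10 series (example version pattern)
        Package: xserver-xorg*
        Pin: version 2:1.14.*
        Pin-Priority: 1001

    A Pin-Priority above 1000 tells apt to keep, or even downgrade to, the pinned version even when a newer one is available in the enabled repositories.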

  3. #53
    Join Date
    Oct 2012
    Posts
    206

    Default

    Quote Originally Posted by Luke View Post
    I've heard reports of a regression there but never seen it. The X regression is so bad I had to pin Xorg on the machine I use for games.
    Well, it was this thread: http://www.phoronix.com/forums/showt...untu-14-04-LTS
    I don't get it. Does it affect every ATI/AMD card, or just cards older than r600?

  4. #54
    Join Date
    May 2013
    Posts
    533

    Default I have an HD6750 and an HD5570 that have been tested

    Quote Originally Posted by asdfblah View Post
    Well, it was this thread: http://www.phoronix.com/forums/showt...untu-14-04-LTS
    I don't get it. Does it affect every ATI/AMD card, or just cards older than r600?
    In my tests there was no regression from the kernel update to 3.14, only the hyper-Z regression in the Mesa 10.1 to Mesa 10.2 transition, and the newest builds of Mesa 10.2 will tolerate enabling hyper-Z in ~/.profile on both of the cards I have tested. The X regression is the worst of all, still present as of last week. When I get a version of X on the second machine (with the HD5570) that restores full performance, I will know the regression has been found and fixed. Until then X is pinned on my best machine.

    The only games I have that impose a significant OpenGL load are Scorched3d, Critter (load due to high framerate), and 0ad (mostly CPU-limited except at the very start). I don't have the games that are usually benchmarked, so a kernel regression that does not affect these games could be present and I would not see it.
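
    As a rough illustration of the hyper-Z toggle mentioned above, and assuming the R600_HYPERZ environment variable the r600 Gallium3D driver has used (the exact switch has varied between Mesa releases, so check which one your version honours), enabling it from ~/.profile would look something like this:

        # ~/.profile  --  ask the r600g driver to enable Hyper-Z
        # (assumed variable name; some Mesa releases use R600_DEBUG=hyperz instead)
        export R600_HYPERZ=1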

  5. #55
    Join Date
    Dec 2007
    Posts
    2,371

    Default

    Quote Originally Posted by Luke View Post
    The X regression is the worst of all, still present as of last week. When I get a version of X on the second machine (with the HD5570) that restores full performance, I will know the regression has been found and fixed. Until then X is pinned on my best machine.
    Can you bisect the xserver and identify what commit caused the regression?

  6. #56
    Join Date
    May 2013
    Posts
    533

    Default Not likely

    Quote Originally Posted by agd5f View Post
    Can you bisect the xserver and identify what commit caused the regression?
    I've never done something like that, and I have only throttled cellular bandwidth, no landline connection.

  7. #57
    Join Date
    Jul 2012
    Posts
    152

    Default

    Xorg is a poor graphics server. How long until Wayland?! Regressions are unnerving.

  8. #58
    Join Date
    Oct 2010
    Posts
    311

    Default

    Quote Originally Posted by Luke View Post
    I've never done something like that, and I have only throttled cellular bandwidth, no landline connection.
    Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
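
    To make that concrete, here is a rough sketch of what a bisect of the xserver could look like. The repository URL is the freedesktop.org one that was current at the time, and the good/bad tags are only examples, not the actual known-good and known-bad versions:

        # one-time download (bisect needs full history, so no shallow clone)
        git clone git://anongit.freedesktop.org/xorg/xserver
        cd xserver
        git bisect start
        git bisect bad  xorg-server-1.15.0    # example: first release showing the slowdown
        git bisect good xorg-server-1.14.5    # example: last release known to be fast
        # now build and test the tree git checks out, then report the result:
        git bisect good     # or: git bisect bad
        # repeat build/test/report until git names the first bad commit, then:
        git bisect reset

    Each round roughly halves the range of suspect commits, so even a long history only takes a dozen or so build-and-test cycles.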

  9. #59
    Join Date
    May 2013
    Posts
    533

    Default I've never built X from source at all

    Quote Originally Posted by Ansla View Post
    Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
    I have never built X or Mesa from source; I run them from PPAs. I've never done anything with git other than download small source programs, and I have never bisected anything. If the only way this gets fixed is for me to learn to do a git bisect, dedicate a partition to building X, install all the build dependencies and all the rest, it won't happen. I've never done this, and I wonder if I should simply stop posting my reports of regressions here, as I am now getting repeated requests for things beyond my ability.

  10. #60
    Join Date
    May 2013
    Posts
    533

    Default I've filed a bug report with Ubuntu/Launchpad against their version

    Quote Originally Posted by Ansla View Post
    Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
    I filed this bug report:

    https://bugs.launchpad.net/ubuntu/+s...r/+bug/1293314

    For my own purposes it is enough to run the xserver used by Ubuntu 13.10 with current Mesa from the Oibaf PPA and the current kernel from a PPA. Canonical, however, needs to either get this fixed upstream, patch it themselves, or do as I am doing and reuse the older X server. Otherwise an LTS version of Ubuntu (14.04) will appear where proprietary drivers give no more performance than Mesa did a year ago, and Mesa performance is half what it used to be. Debian will get this too, as will any Steamboxes built on that version. I can't imagine they will be dumb enough to let this happen now that they have been made aware that the regression exists, though they might possibly miss it here on Phoronix.

    This is a guess, but since the first version of X exhibiting the terrible performance was also the first version to use DRI3, that might be related to the problem. On the other hand, switching to a non-compositing window manager did not help at all; I don't know if that has any bearing on it.
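
    If the DRI3 guess is worth testing, one low-effort check (assuming Mesa's LIBGL_DRI3_DISABLE switch, which the Mesa 10.x DRI3 loader is supposed to honour) is to launch one of the affected games with DRI3 disabled on the client side and see whether the frame rate comes back:

        # run a single game without DRI3 (assumed environment variable; verify against your Mesa version)
        LIBGL_DRI3_DISABLE=1 scorched3d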
    Last edited by Luke; 03-17-2014 at 03:03 PM.
