Likely Radeon Gallium3D Regression On Linux 3.14 + Mesa 10.2

  • Luke
    replied
    Downgrades easy when all .deb packages are kept

    Originally posted by Ansla View Post
    I don't understand how you are able to downgrade just the X server without building from sources. The ABI version has changed between xorg-server 1.14 and 1.15 so the same binary xf86-video-ati driver should not load in both versions, though the same version could be built for both ABIs. So are you also upgrading/downgrading the ati X driver in the process?
    I never throw out .deb packages. They are updated on the road on one machine, then kept for installation on all others. After that an update day's directory of debs is simply kept. They use little space in video editing machines with 4TB file systems, I've got them all the way back to Jaunty.

    OK, here's how I find regressions; think of this as a "precompiled binary bisect". If something breaks or works poorly after an update, look at the packages that have been updated since the last known good run. Pull up gnome-search-tool and find the previous versions of those wherever you keep your old packages. Copy them all into a rollback folder; in this case that began with the last previous version of each Xorg-related package in one directory, and all of Mesa in another. By always using the last previous versions you get compatibility. Needless to say, this was ALL of the previous X server!

    Rollback is then simply: cd into each directory, run sudo dpkg -i --force-all *, then open Synaptic to find any broken packages that require other packages to be rolled back. Keep all the rollback packages in one directory and the current versions in another. Now you can switch them back and forth, and when multiple package families like Mesa and X are involved, mix and match them if they are compatible. That's how I found that most of the problem was in February's X updates, but about 20% was in the default disablement of hyper-Z in Mesa. It's also why I could easily update again this week, knowing I could just as quickly roll back if I needed to. Results reported earlier in this thread indicate better results as of now, enough to keep the current X server this time around.

    Needless to say, if you throw away packages you can't do any of this. Usually it's been Kdenlive or Audacity issues I've done this for; dealing with X meant a lot of packages to dig up, but at least I had them on hand and gnome-search-tool could find them.
    Last edited by Luke; 20 March 2014, 01:52 PM.
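The rollback procedure Luke describes can be sketched as a shell session. The directory names and package filenames below are hypothetical stand-ins (simulated with placeholder files so the sketch is self-contained), and the actual dpkg step is shown only in comments because it requires root:

```shell
# "Precompiled binary bisect": keep every updated .deb, then roll back by
# reinstalling the last known-good versions. Paths here are hypothetical.
set -e
rollback_dir=$(mktemp -d)   # stand-in for e.g. a rollback/xorg folder
current_dir=$(mktemp -d)    # stand-in for e.g. a current/xorg folder

# Simulate the kept package archives with placeholder files:
touch "$rollback_dir/xserver-xorg-core_1.14.5-1_amd64.deb"
touch "$current_dir/xserver-xorg-core_1.15.0-1ubuntu7_amd64.deb"

# The actual rollback (requires root, so only shown as a comment):
#   cd "$rollback_dir" && sudo dpkg -i --force-all *.deb
# ...then open Synaptic and fix any packages it reports as broken.

# Switching forward again is the same command run in the other directory:
#   cd "$current_dir" && sudo dpkg -i --force-all *.deb

ls "$rollback_dir"
```

Keeping the two directories side by side is what makes the mix-and-match testing between package families (Mesa vs. X) quick.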


  • Ansla
    replied
    Originally posted by Luke View Post
    Before I would have any confidence trying to do something like this with X, the kernel, or similar large system programs, I would need to start with something simple with a small volume of code, not requiring a road trip for bandwidth, etc. Probably a known application with a known, recent, and severe bug (fixed or otherwise), not a lot to compile, not a lot of build-deps to chase down and remove later, where whether or not a particular version works is instantly obvious. In short, a practice target, not something major that I've never screwed around with building from source.
    I don't understand how you are able to downgrade just the X server without building from sources. The ABI version has changed between xorg-server 1.14 and 1.15 so the same binary xf86-video-ati driver should not load in both versions, though the same version could be built for both ABIs. So are you also upgrading/downgrading the ati X driver in the process?


  • bridgman
    replied
    For what it's worth, I don't think you were actually being pressured. Your initial responses read more like "I don't know how to do it" (along with a couple of concerns like additional network traffic for bisecting) than "I don't want to do it" so a bunch of people piled on and responded to the specific concerns you raised. It wasn't at all obvious to me, for example, that your concern was primarily downloading the git repos in the first place, not additional traffic for bisecting...

    ... and I'm running on a cellular modem at home so if anyone would have picked up on your real concerns at the start it should have been me
    Last edited by bridgman; 19 March 2014, 11:02 PM.


  • Luke
    replied
    If I am going to play with GIT Bisect

    Originally posted by Luke View Post
    Nothing particular about the issue at hand (git bisect) but I've gotten WAY too much pressure over this.
    Before I would have any confidence trying to do something like this with X, the kernel, or similar large system programs, I would need to start with something simple with a small volume of code, not requiring a road trip for bandwidth, etc. Probably a known application with a known, recent, and severe bug (fixed or otherwise), not a lot to compile, not a lot of build-deps to chase down and remove later, where whether or not a particular version works is instantly obvious. In short, a practice target, not something major that I've never screwed around with building from source.


  • Luke
    replied
    I will stop posting benchmarks here if this continues

    Originally posted by TAXI View Post
    Luke: it might get fixed without a bisect, too, but a bisect would raise the chances by a large amount because the devs will know exactly where the root of the problem is. Also, bisecting is extremely simple; after you've done it you'll laugh about how simple it was. All you need are 3 commands: the first to tell git that you want to bisect, then two commands to tell git that the version it gave you is either good or bad.
    Nothing particular about the issue at hand (git bisect) but I've gotten WAY too much pressure over this. I do not react well to that! I don't think I will post any more benchmarks here, this is just too much for me and I don't want to be part of something that pushes people like that. With so many people making the same demand, I don't want to put myself in this position again. After all, I am not some paid developer putting out product to order.

    As for the specifics, I've never compiled X or Mesa from source and do NOT have a landline connection with a lot of bandwidth, etc. I will leave this for others, fixed or not. As of right now it's working well enough for me with the most recent changes, and I don't even know whether the Critter benchmark will still translate into a slowdown in more demanding games I lack the bandwidth to download. Scorched3d is now mostly back up to speed.


  • V10lator
    replied
    Luke: it might get fixed without a bisect, too, but a bisect would raise the chances by a large amount because the devs will know exactly where the root of the problem is. Also, bisecting is extremely simple; after you've done it you'll laugh about how simple it was. All you need are 3 commands: the first to tell git that you want to bisect, then two commands to tell git that the version it gave you is either good or bad.


  • Luke
    replied
    Further tests with today's Mesa 10.2 (3-17-2014)

    Originally posted by Luke View Post
    will test today's Oibaf PPA updates to it as soon as they download (THAT is slow on this connection).

    It's getting hard to reliably benchmark Scorched3d due to varying loads, but it seemed to run a little faster than in the earlier test today. Still a bit inconsistent, with some screens showing 25-30 fps, but more of the screens are now at 50-70, and I saw 70 fps a bit more often.

    Critter was interesting: little change in MAXIMUM framerate, but the minimum framerate is now at 75% of maximum, with few drops below 300 fps from 360, which is 83% of the maximum. It used to be a maximum of 690 fps with big drops under load to about (sometimes below) 400 fps: all faster than now, but much less steady, with the minimum at 55% or less of maximum. Again, this is not much of a benchmark due to the high framerates of a 2D game, but still interesting.

    It seems to me that driver work in Mesa for DRI3 may be killing this bug one leg at a time. Anyone running more demanding games should check whether the Critter slowdown translates into slowdowns in their games, however!


  • Luke
    replied
    Updates from 3/17 Xorg benchmarking:

    On 3-17-2014, I updated X to the latest Trusty packages (xserver-xorg-core=xserver-xorg-core_2%3a1.15.0-1ubuntu7_amd64.deb) and retested. I am running Mesa 10.2 with hyper-Z enabled, and will test today's Oibaf PPA updates to it as soon as they download (THAT is slow on this connection).

    The Critter benchmark may not be of any great importance, as it is a 2D game in OpenGL that runs very fast, but the regression is 355 fps max instead of 690 fps max, on the order of a 49% drop in framerate. This is absolutely repeatable (on two different machines) by leaving only one opponent on the screen and intentionally permitting all shields to be destroyed.

    The Scorched3d benchmark gave inconsistent results. With the previous X server I was getting 50-70 fps; with the new version I sometimes got 25-35 fps, but sometimes got right back to the 50-70 fps range, though the highest speeds did not appear as often as with the older version of X. Scorched3d can be a bit difficult to benchmark, as which map appears cannot be controlled.

    In February I got nearly unplayable results in Scorched3d (11-25 fps), though some of that was a since-resolved hardware issue and some was the hyper-Z issue with the first versions of Mesa 10.2 installed at that time. There was no change at all in Critter on Radeon then; I don't know whether these results will translate into regressions on OpenGL loads I do not have.

    On my Intel Atom netbook, by comparison, Critter is barely playable due to dropped frames. When the new X server came out it was worse: I remember just over 60 fps, but with worse frame dropping than ever. Last night I got about 110 fps on the netbook with fewer dropped frames. My conclusion is that some progress might be being made somewhere, but I don't know which changes in which package are helping, if any.
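Separating the hyper-Z share of the regression from the X-server share, as the tests above do, can also be done per-process rather than by changing the system default. As I recall, the r600g driver around the Mesa 10.x era honored an `R600_HYPERZ` environment variable (1 = on, 0 = off); treat that variable name as an assumption and confirm it against your Mesa release's environment-variable documentation. The game name below is a placeholder:

```shell
# Hedged sketch: toggle HyperZ for a single benchmark run via the
# R600_HYPERZ env var (name from memory -- verify for your Mesa version).
run_with_hyperz() {
    # Force HyperZ on for just this process, leaving the system default alone.
    R600_HYPERZ=1 "$@"
}

run_without_hyperz() {
    R600_HYPERZ=0 "$@"
}

# Usage with a placeholder benchmark binary:
#   run_with_hyperz mygame
#   run_without_hyperz mygame
# Demo that the override reaches the child process:
run_with_hyperz sh -c 'echo "R600_HYPERZ=$R600_HYPERZ"'
```

Running the same benchmark once each way isolates how much of a framerate change comes from HyperZ rather than from the X server or kernel.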


  • Luke
    replied
    I've filed a bug report with Ubuntu/Launchpad against their version

    Originally posted by Ansla View Post
    Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
    I filed this bug report:

    With Xserver 1.15, OpenGL performance in all games is reduced by almost half, at least with the Radeon/r600g driver. Not sure which actual package has the issue. As long as hyper-Z is explicitly enabled, Mesa 10.2 (from the Oibaf PPA) does not change this regression. With hyper-Z disabled (the new default), another 10-20% reduction in performance resulted after an approximately 40% loss from the new X server. Critter still ran faster than the screen refresh rate, so it was not a matter of...


    For my own purposes it is enough to run the X server used by Ubuntu 13.10 with current Mesa from the Oibaf PPA and the current kernel from a PPA. Canonical, however, needs to either get this fixed upstream, patch it themselves, or do as I am doing and reuse the older X server. Otherwise an LTS version of Ubuntu (14.04) will appear where proprietary drivers give no more performance than Mesa did a year ago, and Mesa performance is half what it used to be. Debian will get this too, as will any Steamboxes built on that version. I can't imagine they will be dumb enough to let this happen now that they have been made aware that the regression exists, though it's possible they might miss it here on Phoronix.

    This is a guess, but since the first version of X exhibiting the terrible performance was also the first version to use DRI3, that might be related to the problem. On the other hand, switching to a non-compositing window manager did not help at all; I don't know if that has any bearing on it.
    Last edited by Luke; 17 March 2014, 03:03 PM.


  • Luke
    replied
    I've never built X from source at all

    Originally posted by Ansla View Post
    Once you have a git checkout, doing the bisect doesn't require an internet connection. If you don't use X from git but only releases, then you'd better find a wifi hotspot for the initial checkout. As for how to do the bisect itself, here is a tutorial. It's for the kernel, not X, but the commands would be the same.
    I have never built X or Mesa from source; I run them from PPAs. I've never done anything with git other than download small source programs, and I have never done a bisect of anything. If the only way this gets fixed is for me to learn to do a git bisect, dedicate a partition to building X, install all the build dependencies, and all the rest, it won't happen. I've never done this, and I wonder if I should simply stop posting my reports of regressions here, as I am now getting repeated requests for things beyond my ability.
