AMD fglrx 8.42.3 leaking gobs of memory in OpenGL apps - any known workaround ?
-
Originally posted by TheIcebreaker: X200 chipset (intel), FC4, Xorg 6.8.2
-
Originally posted by Snake: We're on Linux, where a bug is no "embarrassing taboo" but something that happens all the time (me being the exception, of course).
I blame their employer, formerly known as ATI, for not taking this remotely seriously when they were ATI. I blame their employer, now known as AMD, for not taking it seriously enough when they took the company over. I can only hope they realize that what we're being handed would pretty much get them nuked from orbit in the Windows world, and it's about to do exactly that in what might be one of their only future markets.
-
Interesting Observations
A few interesting (to me, at least) commonalities based on my own informal testing and the posts of other members:
The memory leak seems to be directly proportional to frame rate: running glxgears, I observed that increasing the frame rate (by shrinking or hiding the window) proportionally increased memory consumption, while decreasing the frame rate (by enlarging the window or moving it around the screen) decreased the rate of memory loss as well.
The size and complexity of the frame being drawn seems to have no impact on the rate of memory loss, other than slowing it down by reducing the frame rate: fgl_glxgears runs slower and leaks slower than glxgears, and when running Doom 3 (on my slow X1400M, at least) the leak is barely noticeable.
The trend of the posts I have read suggests that older or lower-end cards either do not suffer from the leak at all, or leak so slowly that it goes unnoticed.
This seems to indicate that the leak is tied to code that runs a relatively fixed number of times per frame swap / redraw. It also makes me wonder if the leak is tied to a portion of the driver code used only by cards supporting a more recent / advanced feature. The fact that the biggest leak seems to originate from somewhere in XF86DRIGetDeviceInfo (according to valgrind) might bear this out...
Ironically, it seems that the more advanced / expensive the card, the faster the frame rate and (consequently) the faster the memory leak.
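If the leak really is a fixed cost per frame swap, the per-frame loss can be estimated by dividing the leak rate by the frame rate. A rough sketch, where both input numbers are assumptions for illustration (the ~2827 KB/s rate matches the one-minute ps table posted in this thread; the 5000 fps figure is a guessed unthrottled glxgears frame rate, not something measured here):

```shell
# Estimate bytes leaked per frame, assuming one leaked allocation per redraw.
# Both inputs are illustrative assumptions, not measurements from this post.
leak_kb_per_s=2827   # approximate RSS growth from the ps table (KB/s)
fps=5000             # assumed unthrottled glxgears frame rate
bytes_per_frame=$(( leak_kb_per_s * 1024 / fps ))
echo "${bytes_per_frame} bytes leaked per frame"
```

Under these assumptions that works out to a few hundred bytes per frame, which would be consistent with a small fixed-size structure being allocated on every swap and never freed.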
-
Here's the leak summary of my valgrind run. The "definitely lost" line is much smaller than Snake's.
Guess my X800 Pro / 420 chipset isn't showing much of a hit.
I wonder whether the leak would be more pronounced if I ran x86_64.
Code:
==16992== LEAK SUMMARY:
==16992==    definitely lost: 216 bytes in 63 blocks.
==16992==    indirectly lost: 2,104 bytes in 8 blocks.
==16992==      possibly lost: 1,488 bytes in 32 blocks.
==16992==    still reachable: 18,362,334 bytes in 2,930 blocks.
==16992==         suppressed: 0 bytes in 0 blocks.
==16992== Reachable blocks (those to which a pointer was found) are not shown.
==16992== To see them, rerun with: --leak-check=full --show-reachable=yes
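To compare runs like this across different cards numerically, the "definitely lost" figure can be pulled out of a saved valgrind log with a small helper. A sketch, assuming a log file in the usual ==PID== format (the function name is mine, not a real tool):

```shell
# Print the "definitely lost" byte count from a valgrind leak summary file.
# Hypothetical helper: strips the ==PID== prefix and the thousands commas.
definitely_lost() {
    sed -n 's/^==[0-9]*== *definitely lost: \([0-9,]*\) bytes.*/\1/p' "$1" |
        tr -d ,
}
```

For a log in that format, the summary above should yield 216.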
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.
Ben Franklin, 1755
-
Beat this!
glxgears memory usage as reported by 'ps' on my Thinkpad Z61m with an ATI X1400 Mobility:
Time (s) --- RAM: VSZ / RSS (KB)
=====================================
0 --- 21744 / 9952
5 --- 53368 / 43896
10 --- 67496 / 58884
15 --- 81752 / 73188
20 --- 95744 / 87248
25 --- 109736 / 101232
30 --- 123728 / 115240
35 --- 137720 / 129240
40 --- 151712 / 143256
45 --- 165704 / 157248
50 --- 179696 / 171252
55 --- 193688 / 185212
60 --- 207812 / 199408
Hmm... so after the fast jump in memory usage during the initial 5 seconds, glxgears grabs about 14 megs of memory every 5 seconds. After one minute, glxgears has grabbed 200 megs. That is the single most disastrous memory leak I have ever seen. Nice.
As someone mentioned before, this is presumably a single simple bug being iterated over and over. It shouldn't be too difficult for ATI/AMD to fix this.
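For anyone who wants to reproduce a table like the one above on their own card, the sampling can be scripted with plain ps. A minimal sketch (the function name is mine; it assumes a Linux procps ps and a POSIX shell):

```shell
# sample_mem PID [INTERVAL] [COUNT]: print "t --- VSZ / RSS" lines (KB) for
# a running process, one sample per INTERVAL seconds, COUNT times.
sample_mem() {
    pid=$1; interval=${2:-5}; count=${3:-13}
    t=0; i=0
    while [ "$i" -lt "$count" ]; do
        # "vsz=,rss=" suppresses the header; both values are in KB
        set -- $(ps -o vsz=,rss= -p "$pid")
        [ $# -eq 2 ] || return 1        # process has exited
        echo "$t --- $1 / $2"
        i=$((i + 1))
        if [ "$i" -lt "$count" ]; then
            sleep "$interval"
        fi
        t=$((t + interval))
    done
}

# Example: a one-minute watch of an already-running glxgears:
#   sample_mem "$(pgrep -o glxgears)" 5 13
```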
-
Because they don't valgrind or oprofile things, and the QA people probably didn't test against the cards that seem to have the serious leak issue.
This really should have come out in beta tests, unless we're expected to be beta testers now.
On the other hand, I guess that's what the beta warning on the release-notes page is for.
-
Originally posted by yoshi314: At times like this I usually think to myself, "what the hell is that beta-tester program for?" This really should have come out in beta tests, unless we're expected to be beta testers now.