Basically it is possible that KDE 4 effects (or Compiz) trigger memory leaks in graphics drivers; in that case the memory usage of Xorg will increase dramatically. Mostly older NVIDIA drivers seemed to be affected by that, so running KDE 4 on old machines with little memory was much slower than with KDE 3.5. I tested my own live systems now with a very simple test: show memory usage directly after boot using the integrated infobash tool (inside VirtualBox with 980 MB RAM). The figures are rounded and change a bit with each run, but you can see the tendency.
KDE 3.5:
  32-bit: 129 MB
  64-bit: 212 MB

KDE 4.3 (backport for Lenny):
  32-bit: 179 MB
  64-bit: 276 MB
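Taking the rounded figures above at face value, the relative increase from KDE 3.5 to KDE 4.3 can be computed directly. A minimal sketch (only the numbers from the post, nothing else assumed):

```python
# Rounded post-boot memory figures (MB) reported in the post above.
kde35 = {"32-bit": 129, "64-bit": 212}
kde43 = {"32-bit": 179, "64-bit": 276}

for arch in kde35:
    increase = (kde43[arch] - kde35[arch]) / kde35[arch] * 100
    print(f"{arch}: KDE 4.3 uses {increase:.1f}% more memory than KDE 3.5")
# -> 32-bit: KDE 4.3 uses 38.8% more memory than KDE 3.5
# -> 64-bit: KDE 4.3 uses 30.2% more memory than KDE 3.5
```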
Power & Memory Usage Of GNOME, KDE, LXDE & Xfce
-
Originally posted by V!NCENT: Is your subconscious also hinting to your conscious that we are dealing with an autistic person here? X'D
An inability to grasp the greater goings-on, and instead a focus on minutiae to the detriment of the larger argument.
(apologies for the rambling sentences.)
-
Originally posted by mugginz: Surely it's mostly to inform the readers whether a particular desktop environment has a meaningful impact on the quality of experience for an end user due to excessive memory consumption relative to the other choices available to an individual.
-
Originally posted by << ⚛ >>: No, it is not clear to me. I agree it is maybe clear to yourself, but if you publish it, subjective clarity is not sufficient.
Originally posted by << ⚛ >>: "Used memory" is a very broad term.
Originally posted by << ⚛ >>: But the cache may share some physical memory pages with executables and libraries. Did you know about that?
Given the size of the error introduced by the 'cached' element relative to the other values, I felt it reasonable to go with my current methodology. Remember what is trying to be determined.
Surely it's mostly to inform the readers whether a particular desktop environment has a meaningful impact on the quality of experience for an end user due to excessive memory consumption relative to the other choices available to an individual.
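The "used memory" ambiguity discussed above can be made concrete: the classic free(1)-style arithmetic subtracts buffers and cache from the naive total-minus-free figure, and the two numbers can differ by hundreds of MB. A minimal sketch over a hypothetical /proc/meminfo snapshot (the values are made up for illustration, and this simple subtraction still ignores the shared-pages caveat << ⚛ >> raises):

```python
# Hypothetical /proc/meminfo snapshot (kB); real values vary per boot.
meminfo_sample = """\
MemTotal:        1003520 kB
MemFree:          520192 kB
Buffers:           24576 kB
Cached:           204800 kB
"""

fields = {}
for line in meminfo_sample.splitlines():
    key, value = line.split(":")
    fields[key] = int(value.split()[0])  # strip the "kB" unit

used_naive = fields["MemTotal"] - fields["MemFree"]
used_minus_cache = used_naive - fields["Buffers"] - fields["Cached"]

print(f"naive 'used':     {used_naive // 1024} MB")       # -> 472 MB
print(f"used minus cache: {used_minus_cache // 1024} MB")  # -> 248 MB
```

With these made-up values the two definitions of "used" differ by more than 200 MB, which is exactly why the term needs to be pinned down before comparing desktop environments.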
-
Originally posted by << ⚛ >>: Can you show me an example of a non-deterministic Linux compiler? (Different timestamps, version infos, etc., do not count.)
Oh, and the CPU cache is also deterministic, so while we are at it: do you have a non-deterministic compute unit to test it on? Because by your logic your own RAM tests fail as well.
-
Originally posted by << ⚛ >>: I don't see any monkeys in your reply.
Monkeys are essential in making a solid argument, you know ...
Which leads me to why BlackStar is winning:
We are engaging in a social discussion, and BlackStar is awesome at trolling and you failed to grasp that. A social event (quantum physics -> overruling) can't be concluded because it is not logic, but the result of logic and averages.
I (1) and BlackStar (2) versus you (3).
(2/3) > (1/3) = you lose.
-
Guest replied: Monkey intelligence
Originally posted by V!NCENT: Fact 4: This has nothing to do with IQ
Fact 5: You just want to win
Fact 6: You failed
-
Guest replied: Non-deterministic compilers?
Originally posted by V!NCENT: Then compare them in percentages to each other (because each different compile may differ in the size of the binaries...)
-
Originally posted by << ⚛ >>: Fact 1: BlackStar wrote that adding a constant does not skew results.
Fact 2: Phoronix uses percentages in the article.
Fact 3: Adding a constant to a divisor or a dividend skews results.
Conclusion: BlackStar is wrong.
Fact 5: You just want to win
Fact 6: You failed
-
Guest replied: Monkey intelligence
Originally posted by V!NCENT:
Originally posted by << ⚛ >>:
Originally posted by BlackStar: That said, you seem to miss the fact that syslog and similar processes add a constant amount of overhead that affects all measurements. They don't skew results one way or another when comparing DEs (as was done here).
Fact 2: Phoronix uses percentages in the article.
Fact 3: Adding a constant to a divisor or a dividend skews results.
Conclusion: BlackStar is wrong.
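The arithmetic behind Fact 3 is easy to demonstrate: adding the same constant to both measurements leaves their absolute difference intact but changes their ratio, so percentage comparisons are skewed. A minimal sketch with made-up numbers (the 50 MB "overhead" is purely illustrative):

```python
# Made-up memory figures (MB) for two desktop environments,
# plus a constant background overhead (syslog and friends).
de_a, de_b = 100.0, 150.0
overhead = 50.0

ratio_without = de_b / de_a                         # 1.50 -> "B uses 50% more"
ratio_with = (de_b + overhead) / (de_a + overhead)  # 1.33 -> "B uses 33% more"

print(f"without overhead: {ratio_without:.2f}")
print(f"with overhead:    {ratio_with:.2f}")
# The absolute difference (50 MB) is unchanged, but the percentage is not.
```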
Now you are going to reply that an average monkey can learn in two weeks what I just said and can contemplate it on internet forums. OK, simply go on, make a reply about those monkeys.