Power & Memory Usage Of GNOME, KDE, LXDE & Xfce

  • Kano
    replied
Basically, it is possible that KDE 4 effects (or Compiz) trigger memory leaks in graphics drivers, in which case the memory usage of Xorg increases dramatically. Mostly older NVIDIA drivers seemed to be affected, so running KDE 4 on old machines with low memory was much slower than with KDE 3.5. I tested my own live systems with a very simple check: show memory usage directly after boot using the integrated infobash tool (inside VirtualBox with 980 MB RAM). The figures are rounded and change a bit with each run, but you can see the tendency.

    KDE 3.5:

    32-bit: 129 MB
    64-bit: 212 MB

    KDE 4.3 (backport for Lenny):

    32-bit: 179 MB
    64-bit: 276 MB
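
    For anyone who wants to reproduce that kind of check without infobash, here is a rough equivalent that reads /proc/meminfo directly and subtracts buffers and page cache. It assumes the standard /proc/meminfo field names; infobash itself may compute the figure differently.

        #!/usr/bin/env python3
        # Report used RAM the way a post-boot check typically does:
        # total minus free, minus buffers and page cache.

        def meminfo():
            # Parse /proc/meminfo into a dict of values in kB.
            values = {}
            with open("/proc/meminfo") as f:
                for line in f:
                    key, rest = line.split(":", 1)
                    values[key] = int(rest.split()[0])  # value in kB
            return values

        m = meminfo()
        used_kb = m["MemTotal"] - m["MemFree"] - m["Buffers"] - m["Cached"]
        print(f"used: {used_kb // 1024} MB of {m['MemTotal'] // 1024} MB")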

  • mugginz
    replied
    Originally posted by V!NCENT View Post
    Is your subconscious also hinting to your conscious mind that we are dealing with an autistic person here? X'D
    That's one possibility.

    An inability to grasp the greater goings-on, focusing instead on minutiae to the detriment of the larger argument.

    (apologies for the rambling sentences.)

  • V!NCENT
    replied
    Originally posted by mugginz View Post
    Surely it's mostly to inform the readers whether a particular desktop environment has a meaningful impact on the quality of experience for an end user due to excessive memory consumption relative to the other choices available to an individual.
    Is your subconscious also hinting to your conscious mind that we are dealing with an autistic person here? X'D

  • mugginz
    replied
    Originally posted by << ⚛ >> View Post
    No, it is not clear to me. I agree it may be clear to you, but if you publish it, subjective clarity is not sufficient.
    It would only be unclear to someone who hadn't read my previous posts on the subject, somebody exercising a wanton desire to cloud the current discussion with irrelevance, or someone unable to grasp the entirety of what is being put forward.

    Originally posted by << ⚛ >> View Post
    "Used memory" is a very broad term.
    Not for the purposes of this discussion. What we are trying to determine is quite well understood from the original article. And given how little bearing the results have (mostly none) on the DE choice of the readers here, some inaccuracy can be tolerated. The biggest problem I had with the original figures was that they were quite far from what I had personally experienced, so I suspected some large error. As it turns out, there was one.

    Originally posted by << ⚛ >> View Post
    But the cache may share some physical memory pages with executables and libraries. Did you know about that?
    If absolute accuracy were required, I'd be passing along a dump of /proc/meminfo (no jokes about passing a dump).
    Given the size of the error introduced by the 'cached' element relative to the other values, I felt it reasonable to go with my current methodology. Remember what we are trying to determine.

    Surely it's mostly to inform the readers whether a particular desktop environment has a meaningful impact on the quality of experience for an end user due to excessive memory consumption relative to the other choices available to an individual.
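
    To put a rough number on the shared-pages caveat raised above: the 'Mapped' field in /proc/meminfo counts page-cache pages that are mapped into processes (executables, libraries), i.e. the pages that 'Cached' and running programs share. A minimal sketch of reading it, assuming standard /proc/meminfo fields (exact accounting varies by kernel):

        #!/usr/bin/env python3
        # Compare "used minus cache" against the mapped page cache, which
        # roughly bounds the double-counting being discussed.

        vals = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, rest = line.split(":", 1)
                vals[key] = int(rest.split()[0])  # value in kB

        used = vals["MemTotal"] - vals["MemFree"] - vals["Buffers"] - vals["Cached"]
        print(f"used (cache subtracted): {used // 1024} MB")
        print(f"page cache mapped by processes: {vals['Mapped'] // 1024} MB")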

  • V!NCENT
    replied
    Originally posted by << ⚛ >> View Post
    Can you show me an example of a non-deterministic Linux compiler? (Different timestamps, version info, etc., do not count.)
    Shall we also include spacetime while we are at it? Just use GCC...

    Oh, and the CPU cache is also deterministic, so while we are at it: do you have a non-deterministic compute unit to test it on? Because by your logic your own RAM tests fail too.

  • V!NCENT
    replied
    Originally posted by << ⚛ >> View Post
    I don't see any monkeys in your reply.
    That's correct...

    Monkeys are essential in making a solid argument, you know ...
    No, because they are not essential. They can, however, be used to explain solid arguments.

    Which leads me to why BlackStar is winning:
    We are engaging in a social discussion; BlackStar is awesome at trolling, and you failed to grasp that. A social event (quantum physics -> overruling) can't be concluded, because it is not logic but the result of logic and averages.

    Me (1) and BlackStar (2) versus you (3): since 2/3 > 1/3, you lose.

  • Guest
    Guest replied
    Monkey intelligence

    Originally posted by V!NCENT View Post
    Fact 4: This has nothing to do with IQ
    Fact 5: You just want to win
    Fact 6: You failed
    I don't see any monkeys in your reply. Monkeys are essential in making a solid argument, you know ...

  • Guest
    Guest replied
    Non-deterministic compilers?

    Originally posted by V!NCENT View Post
    Then compare them in percentages to each other (because each different compile may differ in the size of the binaries...)
    Can you show me an example of a non-deterministic Linux compiler? (Different timestamps, version infos, etc., do not count.)
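
    For what it's worth, the claim is easy to test empirically: compile the same file twice and compare checksums. A minimal sketch, assuming gcc is installed (compiling with relative paths so no temp-directory name gets embedded in the object file):

        #!/usr/bin/env python3
        import hashlib, os, subprocess, tempfile

        SRC = "int main(void) { return 0; }\n"

        def build_digest():
            # Compile SRC in a fresh temp dir and hash the object file.
            with tempfile.TemporaryDirectory() as d:
                with open(os.path.join(d, "t.c"), "w") as f:
                    f.write(SRC)
                # Relative paths keep directory names out of the object file.
                subprocess.run(["gcc", "-c", "t.c", "-o", "t.o"],
                               cwd=d, check=True)
                with open(os.path.join(d, "t.o"), "rb") as f:
                    return hashlib.sha256(f.read()).hexdigest()

        print("identical" if build_digest() == build_digest() else "differs")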

  • V!NCENT
    replied
    Originally posted by << ⚛ >> View Post
    Fact 1: BlackStar wrote that adding a constant does not skew results.
    Fact 2: Phoronix uses percentages in the article.
    Fact 3: Adding a constant to a divisor or a dividend skews results.
    Conclusion: BlackStar is wrong.
    Fact 4: This has nothing to do with IQ
    Fact 5: You just want to win
    Fact 6: You failed

  • Guest
    Guest replied
    Monkey intelligence

    Originally posted by V!NCENT View Post

    Originally posted by << ⚛ >> View Post

    Originally posted by BlackStar View Post
    That said, you seem to miss the fact that syslog and similar processes add a constant amount of overhead that affects all measurements. They don't skew results one way or another when comparing DEs (as was done here).
    What is your IQ, my dear?
    I'm sorry, but that rhetorical question, in itself, is extremely stupid and says more about you than about BlackStar...
    Fact 1: BlackStar wrote that adding a constant does not skew results.
    Fact 2: Phoronix uses percentages in the article.
    Fact 3: Adding a constant to a divisor or a dividend skews results.
    Conclusion: BlackStar is wrong.

    Now you are going to reply that an average monkey could learn what I just said in two weeks and contemplate it on internet forums. OK, go on then, make your reply about those monkeys.
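
    To make Fact 3 concrete, here it is worked through with made-up numbers: a constant overhead added to both measurements leaves absolute differences alone but compresses the percentages.

        #!/usr/bin/env python3
        # Illustrative figures only; the 50 MB "overhead" stands in for
        # syslog and similar always-running processes.
        de_a, de_b, overhead = 100, 150, 50  # MB

        raw = de_b / de_a                             # 1.50 -> "B uses 50% more"
        seen = (de_b + overhead) / (de_a + overhead)  # 1.33 -> "B uses 33% more"
        print(f"true ratio: {raw:.2f}x, measured ratio: {seen:.2f}x")
        print(f"absolute difference unchanged: {de_b - de_a} MB either way")

    So the constant overhead never changes which DE uses more memory, only the percentage by which it appears to do so.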
