GNOME's GLib Adds GMemoryMonitor As Another Step In Helping Cope With Linux RAM Pressure


  • #11
    Originally posted by blacknova
    And what is important and not important? On a server it is at least predictable and can be set by adjusting the OOM score; on a desktop or workstation it is much less so.
    Not really. I can agree that nobody does that on desktops right now. But nobody uses IO priority levels on desktops either, for that matter, and if they did you would not need things like BFQ.

    The solution is actually using the goddamn OOM score, either by setting up a "rule of thumb" list or by letting the user choose his important applications, and so on.
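
    A minimal sketch of how such a list could be applied, in C. The kernel's per-process knob at /proc/<pid>/oom_score_adj (range -1000 to 1000) is real; the helper name and the example score are illustrative only.

        /* Hypothetical helper: bias the kernel OOM killer for one process
         * by writing to /proc/<pid>/oom_score_adj (-1000..1000). Positive
         * values make the process a preferred victim; negative values
         * protect it. Lowering another process's score generally needs
         * CAP_SYS_RESOURCE. */
        #include <stdio.h>
        #include <unistd.h>
        #include <sys/types.h>

        static int set_oom_score_adj(pid_t pid, int adj)
        {
            char path[64];
            FILE *f;

            snprintf(path, sizeof path, "/proc/%d/oom_score_adj", (int) pid);
            f = fopen(path, "w");
            if (f == NULL)
                return -1;
            fprintf(f, "%d\n", adj);
            return fclose(f);
        }

        int main(void)
        {
            /* Example: mark the current process as expendable. */
            return set_oom_score_adj(getpid(), 500) != 0;
        }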

    No, I stand by my opinion: the offending application should fail by itself. It should gracefully handle an OOM situation and either shut down or free unimportant memory (caches etc.) and attempt the allocation again.
    Your opinion is wrong: already-running applications can allocate memory too, it's not just newly opened ones.

    Let's make an example: some application is wasting RAM, and some important process (say the GUI) needs to allocate more RAM and finds none free. Clearing caches only helps so much, and it still needs more RAM. What do you do? Shut down the GUI? Implement some OOM logic in the GUI to terminate the less important processes within itself and/or degrade performance to save RAM?

    You are just moving the burden of OOM handling onto each application, while not solving the overall issue.
    Last edited by starshipeleven; 18 December 2019, 05:37 AM.



  • #12
    Your opinion is wrong: already-running applications can allocate memory too, it's not just newly opened ones.
    I never said it is only new applications; an application should account for memory allocation failure at any point in its life cycle.
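
    A minimal sketch of that pattern, assuming a hypothetical drop_internal_caches() hook into the application's own cache layer. (Caveat: with Linux overcommit enabled, malloc() can succeed and the process can still be killed later when the pages are first touched, so this only goes so far.)

        /* Sketch: on allocation failure, shed expendable memory and retry
         * once before giving up, so the caller can abort the operation
         * cleanly instead of crashing mid-way. */
        #include <stdlib.h>

        size_t drop_internal_caches(void); /* hypothetical; returns bytes freed */

        void *alloc_with_fallback(size_t size)
        {
            void *p = malloc(size);

            if (p == NULL && drop_internal_caches() > 0)
                p = malloc(size); /* retry once after freeing caches */
            return p;             /* may still be NULL; caller must handle it */
        }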

    Let's make an example: some application is wasting RAM, and some important process (say the GUI) needs to allocate more RAM and finds none free. Clearing caches only helps so much, and it still needs more RAM. What do you do? Shut down the GUI? Implement some OOM logic in the GUI to terminate the less important processes within itself and/or degrade performance to save RAM?
    And who is to decide whether an application is wasting RAM? What is "wasting RAM" anyway? Either we have a faulty application that just leaks, or we have an application that uses a lot of RAM by design (faulty or not). It is not as if RAM is marked as waste by the application. Now, on GUI applications... a GUI application can by design interact with the user, so nothing stops it from reporting the problem: on finding it doesn't have enough RAM, it can either abort the operation that required the allocation, or let the user free RAM by killing the "wasteful" application and then retry the allocation. And the user might want to stop and wait for the "wasteful" application to finish anyway, as it might be more important than opening a new tab in a browser...

    And yes, in theory it is possible to collect statistics that track normal and abnormal application behavior with respect to RAM, CPU, and IO usage, so that it would be possible to decide whether an application is misbehaving and suggest it for killing intelligently. Not that such a thing is currently available.



  • #13
    My main problem with the OOM killer is that its behavior is non-deterministic in a general environment. That doesn't mean it is not useful in some specific and controlled environment where its behavior can be properly controlled, i.e. on a server.



  • #14
    Originally posted by blacknova
    And who is to decide whether an application is wasting RAM? What is "wasting RAM" anyway?
    Using RAM for processes of lesser importance than keeping the system responsive to input.

    Really, the primary function of any system is to remain responsive to input; anything else is secondary to that.

    If Windows goes OOM it will basically lock up because of excessive paging/swapping; if you have no pagefile, something will crash.

    Now, on GUI applications... a GUI application can by design interact with the user, so nothing stops it from reporting the problem
    If it cannot allocate RAM it will have trouble even reporting the problem. Also, it may very well be too slow to wait for the user to act.

    When Windows shows the "low RAM" popup it is NOT out of memory yet, but it's relatively close to it.

    And the user might want to stop and wait for the "wasteful" application to finish anyway, as it might be more important than opening a new tab in a browser...
    This should be dealt with by a Windows-like "low RAM" popup that informs the user that the system is low on RAM, that he should limit RAM use, and that if the situation persists the system will start killing off processes.

    Really, it is a very simple application to write too, as it's just a non-GUI watchdog that watches RAM levels and triggers a GUI popup and notification when available memory drops below X MB.
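
    A sketch of such a watchdog, polling MemAvailable from /proc/meminfo (a real kernel field); the notify-send popup command and the 256 MB threshold are assumptions about the desktop environment.

        /* Watchdog sketch: poll available memory and fire a desktop
         * notification when it drops below a threshold ("X MB"). */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>

        #define THRESHOLD_KB (256L * 1024) /* assumed threshold: 256 MB */

        static long mem_available_kb(void)
        {
            char line[128];
            long kb = -1;
            FILE *f = fopen("/proc/meminfo", "r");

            if (f == NULL)
                return -1;
            while (fgets(line, sizeof line, f) != NULL) {
                if (sscanf(line, "MemAvailable: %ld kB", &kb) == 1)
                    break;
            }
            fclose(f);
            return kb;
        }

        int main(void)
        {
            for (;;) {
                long kb = mem_available_kb();

                if (kb >= 0 && kb < THRESHOLD_KB)
                    system("notify-send 'Low RAM' 'Close some applications to free memory.'");
                sleep(5); /* crude polling interval */
            }
        }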



  • #15
    Originally posted by blacknova
    My main problem with the OOM killer is that its behavior is non-deterministic in a general environment. That doesn't mean it is not useful in some specific and controlled environment where its behavior can be properly controlled, i.e. on a server.
    The OOM killer triggers only in emergency situations, when the system is at risk of locking up, and at that point killing an application is better than locking up the system and forcing the user to reboot (which is what happens with Windows).



  • #16
    Originally posted by blacknova

    And what is important and not important? On a server it is at least predictable and can be set by adjusting the OOM score; on a desktop or workstation it is much less so. No, I stand by my opinion: the offending application should fail by itself. It should gracefully handle an OOM situation and either shut down or free unimportant memory (caches etc.) and attempt the allocation again.
    I don't understand what you mean. There's no offending application. An application allocates and frees memory all the time, so if a highly efficient app that does not abuse RAM tries to allocate RAM it absolutely needs in order to function, should this application kill itself because another application is greedy? That sounds weird to me and very arbitrary. Suddenly your whole desktop crashes because Firefox is taking lots of RAM? Unreasonable.

    But your question is a good one: what is important and what is not? That depends on the situation. Caching and buffering are obviously very important when you have RAM to spare, but might not be when you don't.



  • #17
    Originally posted by ElectricPrism
    Seems like the aftermath of trying to save pennies on the dollar. I think it's pretty obvious minimizing resource usage was never part of the original scope or goals of GNOME 3.x.
    I suspect the idea is for garbage-collected languages: the signal can be used to trigger a garbage collection pass, reducing memory usage as soon as memory becomes scarce, instead of waiting until the system has already pushed a lot of pages into swap, pages that would have to be paged back in just to check whether they can be freed, causing even more swapping.
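
    For reference, a minimal sketch of consuming the new API (GMemoryMonitor, added to GIO in GLib 2.64): connect to its "low-memory-warning" signal and shed memory when it fires; a garbage-collected runtime would trigger a collection pass at that point.

        /* Build with: gcc sketch.c $(pkg-config --cflags --libs gio-2.0) */
        #include <gio/gio.h>

        static void
        on_low_memory (GMemoryMonitor             *monitor,
                       GMemoryMonitorWarningLevel  level,
                       gpointer                    user_data)
        {
          /* Levels range from LOW (drop easily rebuilt caches) up to
           * CRITICAL (the process may be killed soon). */
          if (level >= G_MEMORY_MONITOR_WARNING_LEVEL_CRITICAL)
            g_message ("critical memory pressure: freeing everything we can");
          else
            g_message ("memory pressure level %d: trimming caches", level);
          /* A GC'd runtime would force a collection here. */
        }

        int
        main (void)
        {
          GMainLoop *loop = g_main_loop_new (NULL, FALSE);
          GMemoryMonitor *monitor = g_memory_monitor_dup_default ();

          g_signal_connect (monitor, "low-memory-warning",
                            G_CALLBACK (on_low_memory), NULL);
          g_main_loop_run (loop);

          g_object_unref (monitor);
          g_main_loop_unref (loop);
          return 0;
        }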



  • #18
    Originally posted by Candy
    GNOME never had a GLib!

    glib, to my knowledge, has always been part of (or paired with) Gtk+, and GNOME depended on Gtk+.

    So why do we rewrite history and present glib and Gtk+ as something GNOME-centric, when history tells us that this has never been the case?

    Gtk+ and glib were invented because The GIMP (in its first couple of releases) depended on Motif. Later on, The GIMP developers wrote Gtk+ as a free toolkit for their imaging program. Later still, de Icaza and co. built tools around the Gtk+ toolkit, forming G.N.O.M.E. (yes, with dots) around it.

    Even if Red Hat employees (and formerly GNOME-centric developers) now work on Gtk 3/4, that still doesn't change the history and facts about Gtk+.
    1) It's no longer called GTK+; it's called GTK.
    2) It's maintained and primarily developed by the GNOME project and its developers.
    3) GIMP doesn't even use the latest version of GTK.
    4) GIMP doesn't set the direction of, or have much to do with, GTK development anymore.
    5) GTK developers have started calling it the "GNOME Toolkit", which makes sense given it's a toolkit developed by the GNOME project.

    If you're going to complain about changing history, perhaps you should read up on it first.

    Based on your logic, "Microsoft's Skype" would be wrong and "Skype Technologies' Skype" would be right.
    Last edited by Britoid; 18 December 2019, 07:32 AM.



  • #19
    Originally posted by jo-erlend

    I don't understand what you mean. There's no offending application. An application allocates and frees memory all the time, so if a highly efficient app that does not abuse RAM tries to allocate RAM it absolutely needs in order to function, should this application kill itself because another application is greedy?
    OK, calling it "offending" was incorrect. It is just an application the system cannot allocate memory for.

    An application allocates and frees memory all the time, so if a highly efficient app that does not abuse RAM tries to allocate RAM it absolutely needs in order to function, should this application kill itself because another application is greedy? That sounds weird to me and very arbitrary. Suddenly your whole desktop crashes because Firefox is taking lots of RAM? Unreasonable.
    I'd agree that this is unreasonable. It is also unreasonable to kill one application in place of another. Yes, there might be a preference order, but in that case shouldn't the OOM killer defer to the desktop? Do any of the current desktops even manage OOM killer hints?

    Anyway, I'd argue that desktop-critical functions should run on a deterministic set of assumptions, including a required minimum of memory reserved for stable operation, so that the DE itself would never hit an out-of-memory situation. That is possible by pre-allocating the necessary minimum and using some sort of internal allocator, as in the sketch below.
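
    A minimal sketch of that reservation idea, with an assumed 8 MB pool: allocate it at startup, touch and mlock() it so the reservation holds despite overcommit, and serve desktop-critical allocations from a trivial bump allocator.

        #include <stdlib.h>
        #include <string.h>
        #include <sys/mman.h>

        #define POOL_SIZE (8 * 1024 * 1024) /* assumed 8 MB reserve */

        static unsigned char *pool;
        static size_t pool_used;

        /* Call once at startup, before memory gets tight. */
        int reserve_pool(void)
        {
            pool = malloc(POOL_SIZE);
            if (pool == NULL)
                return -1;
            memset(pool, 0, POOL_SIZE);    /* touch every page for real backing */
            return mlock(pool, POOL_SIZE); /* may need ulimit -l / CAP_IPC_LOCK */
        }

        /* Bump allocator: no free(), intended for fixed, critical state. */
        void *critical_alloc(size_t size)
        {
            size = (size + 15) & ~(size_t) 15; /* 16-byte alignment */
            if (pool == NULL || pool_used + size > POOL_SIZE)
                return NULL;
            pool_used += size;
            return pool + pool_used - size;
        }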



  • #20
    Back on topic: wouldn't it be much easier to just set up a policy preventing people from installing Chrome?

