
Firefox Enables Generational GC To Compete With Chrome


  • Firefox Enables Generational GC To Compete With Chrome

    Phoronix: Firefox Enables Generational GC To Compete With Chrome

    The latest Mozilla Firefox nightly builds have begun enabling the generational garbage collector to better compete with Google's Chrome on performance grounds...

    http://www.phoronix.com/vr.php?view=MTY0Nzc

  • #2
    I have 10 tabs open in Firefox ... 1 GB. That is obscene; the pages aren't that large, either. NetSurf can load any one of them (except Facebook, since they have SSL issues) and never crosses 30 MB. That's 300 MB if I'm generous for the HTML and image data... is JavaScript eating up 700 MB? The fact is, NetSurf was able to load some of the pages in less than 5 MB from a fresh start; the rest is just cache.



    • #3
      Originally posted by cb88:
      I have 10 tabs open in Firefox ... 1 GB. That is obscene; the pages aren't that large, either. NetSurf can load any one of them (except Facebook, since they have SSL issues) and never crosses 30 MB. That's 300 MB if I'm generous for the HTML and image data... is JavaScript eating up 700 MB? The fact is, NetSurf was able to load some of the pages in less than 5 MB from a fresh start; the rest is just cache.
      Curious, but what addons/extensions are you using? Mozilla has often stated that some of its worst memory consumption/leak offenders were poorly coded add-ons. I'm sure some of the blame can be laid at Mozilla's feet, but it might not be all them.



      • #4
        FYI, the current GC in SpiderMonkey is an incremental mark-and-sweep collector - it's equivalent to having a single generation. I'd expect that they will still use incremental collection within each generation in their new GC algorithm.

        Reference: https://developer.mozilla.org/en-US/...age_collection
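The generational scheme layered on top of that works roughly like this: new objects are allocated in a small nursery that is collected frequently, and survivors are promoted into the tenured heap, which the incremental mark-and-sweep collector handles. A toy sketch in Python (purely illustrative, with made-up names; a real collector traces object graphs rather than being handed a liveness set):

```python
class GenerationalHeap:
    # Toy two-generation heap. Illustrative only: a real collector traces
    # object graphs; here liveness is handed to the minor collection.
    def __init__(self):
        self.nursery = []   # small, collected frequently
        self.tenured = []   # large, mark-and-swept incrementally and rarely

    def alloc(self, obj):
        self.nursery.append(obj)  # all new objects start in the nursery

    def minor_gc(self, live):
        # Survivors are promoted to the tenured heap; everything else dies
        # without the full-heap collector ever traversing it. That is the
        # generational win: most objects die young, and die cheaply.
        self.tenured.extend(o for o in self.nursery if o in live)
        self.nursery.clear()

heap = GenerationalHeap()
for i in range(8):
    heap.alloc(f"tmp{i}")
heap.minor_gc(live={"tmp6"})  # only one temporary is still reachable
print(heap.tenured)           # → ['tmp6']
```

Because most short-lived objects never reach the tenured space, full-heap mark-and-sweep passes have far less to traverse, which is where the speedup comes from.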



        • #5
          For reference, I've got 10 tabs open as well, and I'm sitting under 300MB memory usage currently. That's still more than I like, but it's nowhere near the 1GB you've got.

          Currently enabled extensions:
          Firebug (enabled but not active on any pages)
          Garmin Communicator (to sync my watch's GPS recordings for biking/running)



          • #6
            Originally posted by cb88:
            I have 10 tabs open in Firefox ... 1 GB. That is obscene; the pages aren't that large, either. NetSurf can load any one of them (except Facebook, since they have SSL issues) and never crosses 30 MB. That's 300 MB if I'm generous for the HTML and image data... is JavaScript eating up 700 MB? The fact is, NetSurf was able to load some of the pages in less than 5 MB from a fresh start; the rest is just cache.
            It really depends on the types of webpages. Even simplistic webpages can kill your computer if they're of the infinite-scrolling variety. Google came up with a pretty cool concept: a paginated version of infinite scrolling. It's a shame they don't use it for G+, because that's the perfect demonstration of the drawbacks of infinite-scrolling pages.



            • #7
              Adblock, which NetSurf also has built in via CSS blocking... and DownloadHelper. So, one add-on which "counts".



              • #8
                I have 310 tabs open and 2.6 GiB memory usage; I find that pretty good.



                • #9
                  Disabled DownloadHelper... 260 MB after reloading the session, 600 MB after reloading all the tabs. I expect it would jump back to 1 GB if I left it running for any length of time, though. That said, the other memory figures are still excessive even before reloading and such.



                  • #10
                    Eating lots of memory?
                    1) HTML/JS/CSS were designed with little more than humble web pages in mind.
                    2) JS requires 2-10 times more memory than C/C++, because it's interpreted and because it gives you no way to save memory: there are no static types, and any number is internally of type double (8 bytes) no matter what.
                    3) There's a lot of back-and-forth going on behind the scenes, with lots of images typically being scaled and cached.

                    The extensions/plugins add up too.
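The "every number is a boxed 8-byte double" cost is easy to see in any dynamic language. A quick illustration using CPython (not SpiderMonkey, but the boxing overhead is analogous; exact sizes vary by platform and implementation):

```python
import sys

# A raw IEEE-754 double is 8 bytes. In a dynamic language, a number is a
# heap-allocated object carrying a type tag and bookkeeping on top of the
# 8-byte payload, so the same value costs several times more memory.
RAW_DOUBLE_BYTES = 8
boxed_bytes = sys.getsizeof(1.0)  # size of one CPython float object

print(f"raw double: {RAW_DOUBLE_BYTES} B, boxed float: {boxed_bytes} B")
```

Engines mitigate this with tricks like NaN-boxing and tagged integers, but a plain object-per-value representation is where the 2-10x figure comes from.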



                    • #11
                      I've heard that one of the "secrets" of the Google V8 engine's speed is the typing of JS variables: from double to whatever type the JIT guesses when a variable takes a value. That works because a program normally won't reuse an already-defined variable as a different type. Do you know if Firefox uses a similar approach in its JS compiler?



                      • #12
                        Originally posted by kertoxol:
                        I've heard that one of the "secrets" of the Google V8 engine's speed is the typing of JS variables: from double to whatever type the JIT guesses when a variable takes a value. That works because a program normally won't reuse an already-defined variable as a different type. Do you know if Firefox uses a similar approach in its JS compiler?
                        Yes, they've had that for years: https://wiki.mozilla.org/TypeInference



                        • #13
                          Wow, this is huge. It's the culmination of years of work, and I'm happy to see it finally enabled by default. Hopefully it will stay that way and reach stable in Firefox 31.

                          For anyone curious, this is (was) the second most important bug on the MemShrink bug list: https://blog.mozilla.org/nnethercote...-2nd-birthday/
                          And for anyone wondering why this is important (from https://blog.mozilla.org/nnethercote...nsumption-2/):
                          • Performance improves for three reasons. First, paging is reduced because of the generational behaviour: much of the JS engine activity occurs in the nursery, which is small; in other words, the memory activity is concentrated within a smaller part of the working set. Second, paging is further reduced because of the compaction: this reduces fragmentation within pages in the tenured heap, reducing the total working set size. Third, the tenured heap grows more slowly because of the generational behaviour: many objects are collected earlier (in the nursery) than they would be with a non-generational collector, which means that structure traversals done by the garbage collector (during full-heap collections) and cycle collector are faster.
                          • Virtual memory consumption drops in two ways. First, the compaction minimizes waste due to fragmentation. Second, the heap grows more slowly.
                          • Physical memory consumption drops for the same two reasons.
                          • Private bytes also drops for the same two reasons.
                          If memory serves, the next big thing should now be e10s.



                          • #14
                            Originally posted by kertoxol:
                            I've heard that one of the "secrets" of the Google V8 engine's speed is the typing of JS variables: from double to whatever type the JIT guesses when a variable takes a value. That works because a program normally won't reuse an already-defined variable as a different type. Do you know if Firefox uses a similar approach in its JS compiler?
                            Firefox has had this behaviour for a long time now, in different incarnations: TraceMonkey from Firefox 3, then JaegerMonkey, and more recently IonMonkey.

                            JS variables are not explicitly typed (double or otherwise), so a given variable can hold a string, an integer, a double, etc. When the code is first run, it is interpreted (the JavaScript engine performs each instruction directly). Part of this process in modern engines is recording what the observed types are.

                            Knowing the type information, the JS engine can say, e.g., "this variable is an integer", and convert the code to assembly instructions for maximum performance (with guards in case the variable's type changes). This is the process of JIT-compiling the code. Languages like Java and C# have static types, so they can JIT-compile code more predictably and do so up front.
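That record-the-type-then-guard loop can be sketched in a few lines. A toy model in Python (purely illustrative: the names are made up, and no real engine specializes at this granularity):

```python
def specialize(fn):
    # Toy model of JIT type specialization: record the argument type seen
    # during warm-up, keep taking the "compiled" path while a guard sees
    # the same type, and respecialize (deoptimize) when the type changes.
    # Purely illustrative -- not any real engine's API.
    state = {"type": None, "deopts": 0}

    def wrapper(x):
        if state["type"] is None:
            state["type"] = type(x)            # warm-up: observe the type
        elif type(x) is not state["type"]:
            state["deopts"] += 1               # guard failed: deoptimize
            state["type"] = type(x)            # ...and respecialize
        return fn(x)

    wrapper.state = state
    return wrapper

@specialize
def double_it(x):
    return x + x

double_it(2)
double_it(3)        # same type: the guard holds, fast path
double_it("ab")     # type changed: the guard trips once
print(double_it.state["deopts"])  # → 1
```

The payoff is that type-stable code (the common case) runs on the specialized path, while the guard keeps the semantics correct for the rare type change.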



                            • #15
                              Firefox 28 [Iceweasel on Debian, specifically] consistently hangs and pushes CPU usage over 108% [even with an FX-8350, and the app not spreading work across all available cores] on heavily JavaScripted sites that auto-update a lot.

                              For instance, have Yahoo Sports with March Madness in one tab and two Huffington Post tabs open at the top level of any major subsection [the root level reloads constantly, and Politics, Media, Tech, etc. also reload every 30 seconds or so], and you've got an app that grinds to a completely frozen halt.

                              Fix that and I'll be impressed.

