>On 7/11/18 5:42 AM, David Bruant wrote:
>> I've seen this information of 100 content processes in a couple of
>> places, but I haven't been able to find the rationale for it. How was
>> the 100 number picked?
>
>I believe this is based on telemetry for the number of distinct sites
>involved in browsing sessions.
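To make the "distinct sites" rationale concrete, here's a rough sketch (hypothetical data and helper names, not Firefox telemetry code): under a process-per-site model, the content-process count is roughly the number of distinct sites across all open tabs, including cross-site iframes. Note that real browsers determine a "site" via the Public Suffix List; the two-label approximation below is a simplification.

```python
# Illustrative sketch: estimate content-process count under site isolation.
# "Site" is approximated as the last two labels of the hostname
# (real browsers consult the Public Suffix List instead).
from urllib.parse import urlparse

def site_of(url):
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def estimate_processes(tabs):
    # tabs: list of tabs, each a list of URLs loaded in that tab
    # (top-level frame plus any iframes). One process per distinct site.
    sites = {site_of(u) for frames in tabs for u in frames}
    return len(sites)

tabs = [
    ["https://www.google.com/"],
    ["https://news.example.com/", "https://ads.tracker.net/frame",
     "https://cdn.widgets.io/embed"],
    ["https://en.wikipedia.org/wiki/Main_Page"],
]
print(estimate_processes(tabs))  # 5 distinct sites -> ~5 processes
```

The point is that process count grows with distinct sites visited, not with tab count, which is why telemetry on sites-per-session is the relevant input for picking a number like 100.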
As an example, 10 randomly chosen tabs in Chrome with Site Isolation (a few
months ago) yielded ~80 renderers (content processes). Some sites generate
a lot; that list of 10 included some which likely don't generate more than
1 or 2: google.com, mozilla.org, the Facebook login page, and Wikipedia
(which might spawn a few?).

>> Would 90 prevent a release of Project Fission?
>
>It would make it harder to ship to users, yes... Whether it "prevents"
>would depend on other considerations.

It's a continuum: the more memory we use, the more OOMs we hit, the worse
we'll look (relative to Chrome), and the larger the impact on system
performance. There's likely no hard line, but there may be a defined "we
need to get at least here" line, and for now that's 100, apparently (I
wasn't directly involved in picking it, so I don't know how "hard" it is).
We'll have to do more than just limit process sizes, but limiting process
sizes is basically table stakes, IMO.

-- 
Randell Jesup, Mozilla Corp
remove "news" for personal email

_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform