On 03/24/2015 04:44 PM, Kyle Huey wrote:
> That likely means that we've regressed the amount of memory the "system" requires. Revisiting the thresholds might be appropriate if those regressions are in the base layer (in other words, if Android L uses more memory than the KitKat base), but if those regressions are in Gecko/Gaia we should just fix them.
Indeed. But there's a cost/benefit trade-off to investigating these things, and I think the question I forgot to ask is whether it makes sense for us to incur that overhead for v3.0 *right now*, when v3.0 is such a nebulous thing and Gecko, the JS engine, and all the apps are likely to undergo significant changes before we get anywhere near the branch point.
The big risk, of course, is that we become sloppy, everything bloats, and the eventual v3.0 turns into a nightmare of us jumping up and down on the suitcase until it closes. The counterpoint is that the metrics/automation are improving greatly right now, so we won't be going into v3.0 blind.
Recognizing that trade-off, we could do something like bump the 319M threshold up a small amount, to 339M, to give ourselves headroom for a short time, much like one might temporarily loosen one's belt during the holidays or on an all-you-can-eat cruise.
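To make that concrete, here is a purely hypothetical sketch of what such a bump might look like in a memory-regression check. The harness, constant names, and failure handling are all illustrative assumptions, not the actual code in our automation:

    # Hypothetical threshold check; names and structure are illustrative only.
    OLD_THRESHOLD_MB = 319   # current limit for the low-memory configuration
    TEMP_THRESHOLD_MB = 339  # temporary headroom while v3.0 is in flux

    def check_system_memory(measured_mb):
        # Fail the run if the measured "system" footprint exceeds the
        # temporarily raised threshold.
        if measured_mb > TEMP_THRESHOLD_MB:
            raise AssertionError(
                "system memory %d MB exceeds temporary threshold %d MB"
                % (measured_mb, TEMP_THRESHOLD_MB))

The point being that the bump would be a one-line, clearly labeled change that is easy to revert once things settle down.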
Andrew
