Given the amount of grief (startup/shutdown delays) that deleting
corrupted HTTP caches is causing us, I wonder if we should consider
shrinking the max size from 1 GB to something smaller so that
deletions have a smaller upper time bound.
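
To put a rough number on that upper bound, a throwaway benchmark along
these lines (made-up directory name and tiny placeholder entries,
nothing like our real cache layout) shows that a blocking recursive
delete scales with entry count, so capping the cache size caps the
worst-case delay:

  // Throwaway benchmark: create N tiny placeholder files, then time a
  // blocking recursive delete.  Entry sizes in a real cache vary, but
  // for small files the file count dominates unlink time.
  #include <chrono>
  #include <cstdio>
  #include <filesystem>
  #include <fstream>
  #include <string>

  namespace fs = std::filesystem;

  int main() {
    for (int entries : {1000, 10000, 100000}) {
      fs::path dir = fs::temp_directory_path() / "fake-cache";
      fs::create_directory(dir);
      for (int i = 0; i < entries; ++i)
        std::ofstream(dir / ("entry-" + std::to_string(i))) << 'x';

      auto start = std::chrono::steady_clock::now();
      fs::remove_all(dir);  // the blocking "blow the cache away" step
      auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                    std::chrono::steady_clock::now() - start).count();
      std::printf("%6d entries -> %lld ms to delete\n", entries,
                  static_cast<long long>(ms));
    }
  }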

It's quite possible that the oldest entries in such a large cache are
fairly stale and that the working set (let's say the set of URIs that
a user has loaded from cache in the last few days) is smaller than 1
GB.
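
If someone wants a crude first stab at measuring that externally,
something like this sketch might do. It leans on file atime as a proxy
for last use, which is shaky (noatime/relatime mounts lie, and the
cache's own last-fetch metadata is really the right source), and the
cache path is just a placeholder:

  // Crude working-set estimate (POSIX-only sketch): sum the sizes of
  // entry files whose atime falls within the last N days.  The cache
  // path is an assumption, and atime is an unreliable stand-in for
  // last use; reading last-fetch time from entry metadata would be
  // the right way to do this.
  #include <cstdio>
  #include <ctime>
  #include <filesystem>
  #include <sys/stat.h>

  namespace fs = std::filesystem;

  int main(int argc, char** argv) {
    const char* cacheDir = argc > 1 ? argv[1] : "./Cache";  // assumed path
    const int days = 3;
    std::time_t cutoff = std::time(nullptr) - days * 24 * 3600;
    unsigned long long workingSet = 0, total = 0;
    for (const auto& e : fs::recursive_directory_iterator(cacheDir)) {
      if (!e.is_regular_file()) continue;
      struct stat st;
      if (stat(e.path().c_str(), &st) != 0) continue;
      total += st.st_size;
      if (st.st_atime >= cutoff) workingSet += st.st_size;
    }
    std::printf("working set (last %d days): %llu of %llu bytes\n",
                days, workingSet, total);
  }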

Of course, I have { null } data to back up that assertion :)  and it
would be good to have some.

Bjarne/Michal/Nick: do we have any way to measure how recently items
in the cache have been used, or to list them in LRU order?
about:cache doesn't include that information in its entry listings.
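
Failing a proper tool, an external quick-and-dirty listing like the one
below (same atime caveat and placeholder path as the sketch above)
would at least let us eyeball the LRU tail:

  // Quick-and-dirty LRU listing (POSIX-only sketch): prints cache
  // entry files oldest-first by atime.
  #include <algorithm>
  #include <cstdio>
  #include <ctime>
  #include <filesystem>
  #include <string>
  #include <sys/stat.h>
  #include <utility>
  #include <vector>

  namespace fs = std::filesystem;

  int main(int argc, char** argv) {
    const char* cacheDir = argc > 1 ? argv[1] : "./Cache";  // assumed path
    std::vector<std::pair<std::time_t, std::string>> entries;
    for (const auto& e : fs::recursive_directory_iterator(cacheDir)) {
      if (!e.is_regular_file()) continue;
      struct stat st;
      if (stat(e.path().c_str(), &st) == 0)
        entries.emplace_back(st.st_atime, e.path().string());
    }
    std::sort(entries.begin(), entries.end());  // least recently used first
    for (const auto& [atime, path] : entries) {
      char when[32];
      std::strftime(when, sizeof when, "%Y-%m-%d %H:%M",
                    std::localtime(&atime));
      std::printf("%s  %s\n", when, path.c_str());
    }
  }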

I could swear that, as of about a year ago, Chrome was limiting their
max cache size to well below their advertised limit of 1 GB for
unspecified "performance reasons", but I can't find it now in the
current Chromium tree, nor the reference to it in Bugzilla.
In any case, they need to blow away their cache less often than we do
right now, so this is more pressing for us.