daniel added a comment.
@aude: thanks for the benchmark! I'm surprised that loading the full entities
does not have a bigger impact on memory usage. How many different entities
were hit, and how big are these entities?
TASK DETAIL
https://phabricator.wikimedia.org/T74309
REPLY HANDLER
daniel added a comment.
@aude: "memcached retrieval of entities is faster than sql queries of the term
table" -- I'm not sure this is true, especially for large items. I have sent
mail to Springle and Ori asking for input. Will do some benchmarking.
Of course, pre-fetching/batching plus in-process
aude added a comment.
this is only a one-data-point comparison on my dev wiki and obviously can't
reproduce all production conditions, but:
recent changes with EntityTermLookup (30 days, 60 items) - TermSqlIndex:
65955512 memory
28786 backend response time
recent changes with EntityRetrievingTermLookup
gerritbot added a comment.
Change 176378 abandoned by Daniel Kinzler:
Introducing RecentChangesRowsForDisplay hook.
Reason:
ChangesListInitRows exists and should do
[[https://gerrit.wikimedia.org/r/176378]]
daniel added a comment.
As Katie pointed out, the existing ChangesListInitRows hook should fit the bill
already.
The rest of the investigation should outline which services/interfaces we
need to make the pre-fetching work.
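Purely as an illustration of the batching idea behind that hook — sketched in Python for brevity, although the real handler would be PHP, and `FakeTermLookup`, `prefetch_terms`, etc. are made-up names, not Wikibase's actual API — the handler would collect every entity ID from the rows up front and warm the cache with one batched fetch:

```python
class FakeTermLookup:
    """Stand-in for a term lookup service with a batched prefetch step."""

    def __init__(self):
        self.cache = {}
        self.batch_calls = 0  # counts round trips to the backend

    def prefetch_terms(self, entity_ids, languages):
        # One round trip fetches labels for all entities at once.
        self.batch_calls += 1
        for eid in entity_ids:
            self.cache[eid] = {lang: f"{eid}-label-{lang}" for lang in languages}

    def get_label(self, entity_id, language):
        # Served from the warmed in-process cache; no extra backend hit.
        return self.cache[entity_id][language]


def on_changes_list_init_rows(rows, term_lookup, language):
    """Hook handler: collect entity IDs from all rows, prefetch in one batch."""
    entity_ids = sorted({row["entity_id"] for row in rows})
    term_lookup.prefetch_terms(entity_ids, [language])


rows = [{"entity_id": "Q1"}, {"entity_id": "Q2"}, {"entity_id": "Q1"}]
lookup = FakeTermLookup()
on_changes_list_init_rows(rows, lookup, "en")
```

The point is the single `batch_calls` round trip for the whole changes list, instead of one term query per displayed row.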
daniel added a comment.
Here's a rough outline of the pre-fetching infrastructure for labels and other
Terms:
We need a TermCache service like this:
```
TermCache {
    /**
     * Update terms for the given entity. Any old terms associated with the
     * entity are discarded.
     */
    public
```
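The interface sketch above is cut off after `public`, so purely as a guess at the shape of such a service — sketched in Python for brevity, although the actual implementation would be PHP inside Wikibase — here is a minimal stand-in. All names (`TermCacheSketch`, `update_terms`, `get_terms_of_entities`) are hypothetical, not Wikibase's real API:

```python
class TermCacheSketch:
    """Hypothetical TermCache: stores terms per entity, supports batch reads."""

    def __init__(self):
        self._terms = {}  # entity_id -> list of (term_type, language, text)

    def update_terms(self, entity_id, terms):
        # Replace all terms for the entity; any old terms are discarded,
        # as the docblock in the comment above describes.
        self._terms[entity_id] = list(terms)

    def delete_terms(self, entity_id):
        self._terms.pop(entity_id, None)

    def get_terms_of_entities(self, entity_ids, term_type, language):
        # Batched read: one call resolves terms for many entities at once,
        # which is what a recent-changes prefetch needs.
        return {
            eid: [text for (ttype, lang, text) in self._terms.get(eid, [])
                  if ttype == term_type and lang == language]
            for eid in entity_ids
        }


cache = TermCacheSketch()
cache.update_terms("Q1", [("label", "en", "universe")])
cache.update_terms("Q1", [("label", "en", "Universe")])  # old terms discarded
labels = cache.get_terms_of_entities(["Q1", "Q2"], "label", "en")
```

The batched getter is the important part for this task: one call per changes list, not one query per row.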
daniel added a comment.
Jan and Thiemo brought up an idea for pre-fetching labels accessed via Lua or
{{#property}} with arbitrary access enabled: we pre-fetch based on the usage
tracking info, so that when parsing a new revision of a page, we have fast
access to all labels/terms used by the
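To illustrate that idea — again in Python for brevity, with made-up names and data shapes (`usage_table`, `batch_fetch_labels` are hypothetical, not the real usage-tracking schema): before reparsing a page, look up which entities its previous revision used and fetch all their labels in one batch.

```python
# Hypothetical usage-tracking table: client page title -> entity IDs the
# page used on its last parse.
usage_table = {
    "Berlin": ["Q64", "Q183"],
    "Hamburg": ["Q1055", "Q183"],
}

fetch_log = []  # records one entry per backend round trip


def batch_fetch_labels(entity_ids, language):
    # Stand-in for a single batched backend request.
    fetch_log.append(list(entity_ids))
    return {eid: f"{eid}-label" for eid in entity_ids}


def parse_page(title, language="en"):
    # Prefetch every label the previous revision used, in one batch, so
    # {{#property}} / Lua label accesses during the parse are cache hits.
    used = usage_table.get(title, [])
    return batch_fetch_labels(used, language)


labels = parse_page("Berlin")
```

A newly added label access would still miss on the first parse after the edit; the tracked usages only cover what the previous revision touched.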
aude added a comment.
if we use memcached, then I suggest we cache entities that are in the recent
changes table (maybe the last X days), plus perhaps the most-used entities as
determined by client usage tracking.
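A sketch of that selection policy (Python for brevity; the data shapes and the function name `entities_to_warm` are invented for illustration, and the window/top-N values are arbitrary placeholders):

```python
from datetime import datetime, timedelta


def entities_to_warm(recent_changes, usage_counts, now, days=7, top_n=2):
    # Entities touched within the recent-changes window...
    cutoff = now - timedelta(days=days)
    recent = {eid for (eid, ts) in recent_changes if ts >= cutoff}
    # ...plus the N most-used entities per client usage tracking.
    most_used = {eid for eid, _ in
                 sorted(usage_counts.items(), key=lambda kv: -kv[1])[:top_n]}
    return recent | most_used


now = datetime(2014, 12, 1)
recent_changes = [
    ("Q1", datetime(2014, 11, 29)),  # inside the window
    ("Q2", datetime(2014, 10, 1)),   # too old, not warmed
]
usage_counts = {"Q3": 100, "Q4": 5, "Q5": 50}
warm = entities_to_warm(recent_changes, usage_counts, now)
```

The union of the two sets is what would be pushed into memcached ahead of time; everything else falls back to the normal lookup path.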