https://bugzilla.wikimedia.org/show_bug.cgi?id=13209

--- Comment #24 from Gurch <[email protected]>  2009-01-13 02:00:51 UTC ---
(In reply to comment #22)
> If the diffs *are* in fact cached, then fetching a lot at once is cheap. If
> they're not, I really don't want a single request to be able to multiply an
> expensive operation times 1500...
> 
> My general preference would be to limit to one-at-a-time fetching.
> 
> Perhaps if there's a strong use case for it, multiple fetches could be
> conditionally enabled, or could fetch "only if cached", but I'm not sure how
> valuable this is.

For me, multiple fetches would be useful, but I would only be using small
values of "multiple".

How about not granting users with the 'apihighlimits' right a higher limit for
this particular request, and/or reducing the limit on the number of revisions?
If the revision limit were lowered to, say, 10 (with no higher limit for bots),
that would mean a maximum of 30 diffs generated, which may still be too high
but is certainly lower than 1500 :)

Fetching only if cached wouldn't be much use in my case, as I'd mostly be
looking at suspicious recent changes, whose diffs will inevitably be requested
by someone at some point but may well not have been yet.

It's not vital, just useful to have.
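For illustration, the one-at-a-time approach discussed above could be sketched like this. This is a hypothetical client-side sketch, not anything from the bug itself: it assumes the MediaWiki `action=compare` module (with `fromrev` and `torelative=prev`) as the way to ask for a single diff per request, and the Wikipedia endpoint URL is just an example.

```python
# Sketch: fetch diffs one revision at a time, as suggested in the comment,
# instead of asking the server to generate many diffs in a single request.
# Assumes the MediaWiki `action=compare` API; endpoint is illustrative.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"  # example endpoint

def compare_params(revid):
    """Build the query parameters for one diff (a revision vs. its parent)."""
    return {
        "action": "compare",
        "fromrev": revid,
        "torelative": "prev",  # diff against the previous revision
        "format": "json",
    }

def fetch_diff(revid):
    """Issue one request per revision, keeping per-request server cost bounded."""
    url = API + "?" + urlencode(compare_params(revid))
    with urlopen(url) as resp:
        return json.load(resp)["compare"]["*"]  # HTML diff body
```

A caller wanting "small values of multiple" would simply loop over a handful of revision IDs, making one `fetch_diff` call each, rather than batching them into one expensive query.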


-- 
Configure bugmail: https://bugzilla.wikimedia.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are on the CC list for the bug.

_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l
