Michael Dale wrote:
> *snip*
>> Yes, it's been filed before and WONTFIXed because parsing dozens or 
>> hundreds of pages in one request is kind of scary performance-wise
> 
> but clearly it would be more resource efficient than issuing 30 separate 
> additional requests... 

The API could process up to some amount of work (time, template includes,
preprocessor nodes, ...) and then return a continue parameter.

It shouldn't be so bad if the parser cache is used.
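
For illustration only, here is a minimal Python sketch of how a client could
follow such a work-limited response. The multi-title form of action=parse and
the continuation token it returns are assumptions for the sake of the example;
today's API parses one page per request.

    import requests

    API = "https://en.wikipedia.org/w/api.php"  # any MediaWiki API endpoint

    def parse_pages(titles):
        """Hypothetical: ask for several parses, follow continuation until done."""
        params = {
            "action": "parse",
            "format": "json",
            "titles": "|".join(titles),  # hypothetical multi-title support
        }
        results = {}
        while True:
            data = requests.get(API, params=params).json()
            # Collect whatever the server finished within its work limit.
            for page in data.get("parse", {}).get("pages", []):
                results[page["title"]] = page["text"]
            cont = data.get("continue")
            if not cont:
                break
            params.update(cont)  # query-module-style continuation
        return results

The loop mirrors the existing query-module continuation convention, so clients
that already handle "continue" blocks would need little new code.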
