Lucas_Werkmeister_WMDE added a comment.

  I reset my wiki and repeated the import with the job run rate set to 150, and 
there were still 3003 jobs in the queue after the import was done. Manual page 
views still seem to chip away at the job queue, in pretty large chunks (e.g. 80 
at once), but API requests don’t.
  
  And I think the reason for this – if I’m reading `MediaWiki.php` and 
`api.php` correctly – is that **API requests don’t run jobs.** The only 
`MediaWiki` method which `api.php` calls is `doPostOutputShutdown()`; it does 
//not// call `main()` or `schedulePostSendJobs()`.
  
  My assumption was that, with a sufficiently high job run rate, you could run 
large Wikibase imports, and each edit API request would on average run as many 
jobs as it enqueued, keeping the job queue within reasonable limits. But that’s 
clearly not the case: `$wgJobRunRate` is for regular page views (`index.php`), 
and any wiki which doesn’t have a sufficiently high volume of page views will 
just accumulate more and more jobs. “Write-only” or “API-only” (or “-mainly”) 
wikis must have a dedicated job runner.
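
  In other words, the usual setup for such a wiki would be to stop relying on 
web requests entirely and drain the queue out of band. A minimal sketch 
(install path, cron schedule, and `--maxjobs` value are assumptions, not taken 
from this task):

```
# LocalSettings.php: don’t run any jobs from web requests
$wgJobRunRate = 0;

# crontab entry (path assumed): drain the job queue every minute
* * * * * php /var/www/mediawiki/maintenance/runJobs.php --maxjobs 1000 > /dev/null
```

  `runJobs.php` also accepts `--maxtime` and `--wait`, so a long-running or 
continuous runner is an alternative to a cron loop.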

TASK DETAIL
  https://phabricator.wikimedia.org/T255259
