https://bugzilla.wikimedia.org/show_bug.cgi?id=64056

--- Comment #4 from Tisza Gergő <[email protected]> ---
What is the actual bug here: that parsing a page with hundreds of remote images
is slow, or that this can be used to DoS the site? The first sounds like a
"don't do that then" bug. The parser needs certain image attributes to generate
the HTML code, so MediaWiki has to make an API call to Commons to get that
information, and the calls stack up when there are lots of images. It might be
possible to batch them somehow, or to render the page with placeholders, put
the API calls in a job queue, and reparse the page once they have finished;
both look like complex changes for relatively little benefit.
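For illustration, the batching idea could look roughly like this (a minimal sketch, not MediaWiki code: the helper names are invented, but action=query&prop=imageinfo with pipe-joined titles is the standard way to fetch attributes for many files in one MediaWiki API request, and 50 titles is the usual per-request limit for ordinary clients):

```python
# Sketch: batch imageinfo lookups against the Commons API instead of
# issuing one request per image.
from urllib.parse import urlencode

API = "https://commons.wikimedia.org/w/api.php"

def chunk(titles, size=50):
    """Split the title list into API-sized batches (50 titles per request)."""
    for i in range(0, len(titles), size):
        yield titles[i:i + size]

def batched_requests(titles):
    """Build one query URL per batch of up to 50 file titles."""
    urls = []
    for batch in chunk(titles):
        params = {
            "action": "query",
            "prop": "imageinfo",
            "iiprop": "size|mime|url",   # attributes the parser needs
            "titles": "|".join(batch),   # many titles in a single request
            "format": "json",
        }
        urls.append(API + "?" + urlencode(params))
    return urls

# A page with 120 remote images collapses into 3 API requests instead of 120.
urls = batched_requests(["File:Example_%d.jpg" % i for i in range(120)])
print(len(urls))  # 3
```

This only reduces the number of round trips; a page referencing thousands of files would still need placeholder rendering or a job queue on top of it.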

As for the DoS part, maybe there could be a configurable per-page limit on
remote images that the wiki operator can set?

-- 
You are receiving this mail because:
You are the assignee for the bug.
You are on the CC list for the bug.
_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l