Nemo_bis added a comment.
Uh. https://lists.wikimedia.org/pipermail/wikidata-l/2015-April/005993.html
TASK DETAIL
https://phabricator.wikimedia.org/T43345
REPLY HANDLER ACTIONS
Reply to comment or attach files, or !close, !claim, !unsubscribe or !assign
Nemo_bis added a comment.
There are still Wikipedias with hundreds of thousands of interwikis in wikitext,
according to https://stats.wikimedia.org/EN/TablesDatabaseWikiLinks.htm, so
there is indeed no shortage of work to do.
Betacommand added a comment.
That is actually an argument for the opposite. As more and more links get moved to
Wikidata, finding and resolving the remaining links becomes more and more the
primary focus.
Multichill added a subscriber: Multichill.
Multichill added a comment.
I think we can lower the priority of this bug. Scale isn't really an issue
because most language links have already been moved to Wikidata anyway.
Betacommand added a comment.
That works on the small scale, but it doesn't scale up. One key example of how this
could be used: find all articles whose language links are not in Wikidata. That's
a fairly simple database query, versus 2+ million queries to the API.
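A rough sketch of the database query Betacommand has in mind. The table and column names follow the standard MediaWiki schema (`langlinks` with `ll_from`/`ll_lang`/`ll_title`, joined to `page`); whether a given replica exposes them exactly this way, and the namespace filter, are assumptions for illustration.

```python
# Hedged sketch: list main-namespace pages that still carry langlinks in
# local wikitext, i.e. candidates not yet migrated to Wikidata. Uses the
# standard MediaWiki schema; run against a wiki's database replica.
QUERY = """
SELECT page_title, ll_lang, ll_title
FROM langlinks
JOIN page ON page_id = ll_from
WHERE page_namespace = 0
LIMIT 100;
""".strip()

print(QUERY)
```

One query like this replaces millions of per-page API requests, which is the scaling argument above.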
valhallasw added a subscriber: valhallasw.
valhallasw added a comment.
> But if bots knew that langlinks are already stored at Wikidata, they would
> not have to request the source code of many local pages.
The langlinks that are stored on Wikidata are available with a single request:
A)
https:
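To make valhallasw's point concrete, here is a hedged sketch of such a single request: fetching all sitelinks (the Wikidata-side interlanguage links) for one article via the public `wbgetentities` API. The article title is just an example; the endpoint and parameter names follow the documented Wikidata API.

```python
from urllib.parse import urlencode

# Build the single API request that returns every sitelink stored on
# Wikidata for an article, looked up by its English Wikipedia title.
params = {
    "action": "wbgetentities",
    "sites": "enwiki",          # resolve the item via its enwiki sitelink
    "titles": "Douglas Adams",  # example article title (an assumption)
    "props": "sitelinks",       # only the sitelinks, i.e. the langlinks
    "format": "json",
}
url = "https://www.wikidata.org/w/api.php?" + urlencode(params)
print(url)
```

One such request per article replaces fetching and parsing the wikitext of every language edition's local page.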