Merlijn van Deen <[email protected]> wrote:
>> Personally, I'd rather we wait for the Pywikipedia devs to fix that script,
> This is not going to happen anytime soon. Considering the state of the
> code base (two hundred exceptions for three hundred wikis, long
> functions and no automated testing - and thus practically untestable),
> and the state of the InterLanguage extension ('will be installed
> soon'), no-one is really willing to invest a lot of time in tracking
> memory usage and reducing it.
> The only reasonable action we can take to reduce the memory
> consumption is to let the OS do its job in freeing memory: using one
> process to track pages that have to be corrected (using the database,
> if possible), and one process to do the actual fixing (interwiki.py).
> This should be reasonably easy to implement: use a pywikibot page
> generator to produce the list of pages, a database layer to track
> the interlanguage links, and popen('interwiki.py <page>') whenever
> a fixable situation is found.
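To make the proposed split concrete, here is a minimal sketch of the two-process idea. Note the assumptions: needs_fixing() is a hypothetical stand-in for interwiki.py's real (much more involved) consistency check, and the pywikibot generator and database layer are only indicated in comments.

```python
import subprocess

def needs_fixing(links):
    """Hypothetical stand-in for the real check: flag a page whose
    interlanguage links point at two different titles on the same
    wiki. interwiki.py's actual logic is far more involved."""
    seen = {}
    for lang, title in links:
        if lang in seen and seen[lang] != title:
            return True
        seen[lang] = title
    return False

def fix_page(title):
    """Hand one page to a fresh interwiki.py process; when that
    process exits, the OS reclaims all the memory it used."""
    subprocess.run(["python", "interwiki.py", title], check=False)

# The tracking process would walk a pywikibot page generator,
# record each page's links in the database, and call fix_page()
# only for pages where needs_fixing() returns True.
```

The point of the subprocess call is exactly the memory argument above: no matter how much interwiki.py leaks per page, it all goes away when the child exits.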
We could also move the pressure: Labs' bot running
infrastructure doesn't seem to be /that/ far from opening.
If interwiki bots were running there, it would allow the
foundation to judge whether pushing for the deployment of
InterLanguage is worth it in the end.
Meanwhile I think DaB.'s proposal is very adequate.
Tim
_______________________________________________
Toolserver-l mailing list ([email protected])
https://lists.wikimedia.org/mailman/listinfo/toolserver-l
Posting guidelines for this list:
https://wiki.toolserver.org/view/Mailing_list_etiquette