On Mon, May 25, 2009 at 6:18 AM, Christian Reitwießner
<christ...@reitwiessner.de> wrote:
> K. Peachey wrote:
>> Instead of downloading the Commons dump, have you considered using
>> $wgForeignFileRepos instead of the dumps? This would need a web
>> connection for it to work properly, though. There is an example of how
>> to set it up to use files from Commons (see also the configuration
>> sketch below the quoted message):
>> http://www.mediawiki.org/wiki/Manual:%24wgForeignFileRepos#Using_files_from_Wikimedia_Commons_:_ForeignAPIRepo
>
> Thanks for the hint! Perhaps I should explain a bit more. I have
> modified the DumpHTML extension and want to use it to create a
> compressed Wikipedia dump for offline use. This dump will not contain
> images, but it would be good if it at least contained the links to the
> images so that they can be displayed when there is an internet
> connection. I think using Commons as an external file repository would
> solve all these problems.
>
> So if I use Commons as an external file repository, it will be queried
> for every file in Wikipedia during the dump process, right? Is that
> acceptable for the Wikimedia servers?
>
> Kind regards,
> Christian Reitwießner
>
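For reference, the $wgForeignFileRepos setup suggested above boils down to
an entry in LocalSettings.php along the following lines. This is only a
sketch of the ForeignAPIRepo configuration described on the linked manual
page; key names and sensible cache values may vary between MediaWiki
versions, so check the manual page for your version.

// Sketch of a ForeignAPIRepo entry in LocalSettings.php, following the
// manual page linked above. Cache expiry values are in seconds.
$wgForeignFileRepos[] = array(
    'class'                  => 'ForeignAPIRepo',
    'name'                   => 'shared', // arbitrary local name for the repo
    'apibase'                => 'http://commons.wikimedia.org/w/api.php',
    'fetchDescription'       => true,   // also fetch file description pages
    'descriptionCacheExpiry' => 43200,  // 12 hours
    'apiThumbCacheExpiry'    => 86400,  // 24 hours
);

With this in place, file links are resolved through the Commons API at
parse time rather than from a locally imported image table, which is what
makes the image URLs available without downloading the images themselves.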

Since pages are parsed on import, it would process the images and hit the
Commons repo for each page you're importing. I'm not the one to say
whether that is acceptable, though.
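
To give a feel for the traffic involved: each file reference that cannot
be resolved locally turns into a MediaWiki API lookup against Commons
while the page is parsed. The snippet below is purely illustrative; the
exact properties ForeignAPIRepo requests, and how aggressively it caches,
are assumptions here, not a quote of its internals.

<?php
// Illustrative only: the general shape of the imageinfo lookup a foreign
// API repo makes for a file it cannot find locally. The file name and the
// iiprop list are hypothetical examples, not ForeignAPIRepo's exact query.
$api    = 'http://commons.wikimedia.org/w/api.php';
$params = array(
    'action' => 'query',
    'titles' => 'File:Example.jpg',
    'prop'   => 'imageinfo',
    'iiprop' => 'url|size|timestamp',
    'format' => 'json',
);
echo $api . '?' . http_build_query( $params ) . "\n";

Depending on caching, a full dump run can therefore generate on the order
of one such request per distinct image.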

-Chad

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
