Thank you, Emijrp! What about the dump of Commons images? [for those with 10TB to spare]
SJ

On Sun, Jun 26, 2011 at 8:53 AM, emijrp <[email protected]> wrote:
> Hi all;
>
> Can you imagine a day when Wikipedia is added to this list?[1]
>
> WikiTeam has developed a script[2] to download all the dumps of
> Wikipedia (and its sister projects) from dumps.wikimedia.org. It sorts
> them into folders and checks their md5sums. It only works on Linux (it
> uses wget).
>
> You will need about 100GB to download all the 7z files.
>
> Save our memory.
>
> Regards,
> emijrp
>
> [1] http://en.wikipedia.org/wiki/Destruction_of_libraries
> [2] http://code.google.com/p/wikiteam/source/browse/trunk/wikipediadownloader.py

--
Samuel Klein          identi.ca:sj          w:user:sj          +1 617 529 4266
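[Editor's note: for readers curious about the approach the quoted mail
describes (fetch each dump with wget, sort into per-project folders,
verify md5sums), here is a minimal Python sketch. It is not the actual
wikipediadownloader.py; the function names (md5sum, fetch_dump), the
parameters, and the URL/checksum in the usage comment are illustrative
assumptions.]

    import hashlib
    import os
    import subprocess

    def md5sum(path, chunk_size=1 << 20):
        # Hash the file in chunks so multi-GB dumps fit in memory.
        h = hashlib.md5()
        with open(path, 'rb') as f:
            for chunk in iter(lambda: f.read(chunk_size), b''):
                h.update(chunk)
        return h.hexdigest()

    def fetch_dump(url, project, expected_md5):
        # Download one dump file into a per-project folder and verify it.
        # 'project' and 'expected_md5' are hypothetical parameters, not
        # the real script's interface.
        os.makedirs(project, exist_ok=True)
        # wget -c resumes partial downloads (important for large files);
        # -P saves into the given directory under the server's filename.
        subprocess.check_call(['wget', '-c', '-P', project, url])
        dest = os.path.join(project, url.rsplit('/', 1)[-1])
        if md5sum(dest) != expected_md5:
            raise ValueError('md5 mismatch for %s' % dest)

    # Hypothetical usage (the checksum would come from the project's
    # published md5sums file on dumps.wikimedia.org):
    # fetch_dump('http://dumps.wikimedia.org/enwiki/latest/'
    #            'enwiki-latest-pages-articles.xml.bz2',
    #            'enwiki', '<expected md5>')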
