2009/8/12 Vincent MEURISSE <openmoko-commun...@meurisse.org>:
> On Wednesday 12 August 2009 10:09:51 am Christian Reitwießner wrote:
>> So the bottom line is: I don't know if the size can be reduced
>> significantly. For Evopedia 2.0 I do the dumps myself (mainly because
>> those on static.wikipedia.org are more than a year old), but it takes
>> really long.
> Why not use the dump DB from <http://download.wikimedia.org/backup-index.html>?
> The last dump for the English version is 5 GB, but you will probably need
> to add an index to make the file usable.
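(On the index suggestion above: a minimal sketch, not from any of the tools mentioned in this thread, of what such an index could look like. It maps each article title to the byte offset of its <page> element in the dump, so a single article can be read with one seek instead of scanning the whole 5 GB file. The <page>/<title> tags follow the real MediaWiki dump schema, but the sample data and function name here are made up for illustration.)

```python
import io
import re

def build_title_index(dump_file):
    """Map each article title to the byte offset of its <page> element."""
    index = {}
    offset = 0
    pending_page_offset = None
    for line in dump_file:
        # Remember where the current <page> element starts.
        if b"<page>" in line:
            pending_page_offset = offset + line.index(b"<page>")
        # Associate the next <title> we see with that offset.
        m = re.search(rb"<title>(.*?)</title>", line)
        if m and pending_page_offset is not None:
            index[m.group(1).decode("utf-8")] = pending_page_offset
            pending_page_offset = None
        offset += len(line)
    return index

# Tiny in-memory stand-in for a real dump file.
sample = io.BytesIO(
    b"<mediawiki>\n"
    b"  <page>\n"
    b"    <title>Openmoko</title>\n"
    b"    <text>A family of open smartphones.</text>\n"
    b"  </page>\n"
    b"  <page>\n"
    b"    <title>Wikipedia</title>\n"
    b"    <text>The free encyclopedia.</text>\n"
    b"  </page>\n"
    b"</mediawiki>\n"
)

idx = build_title_index(sample)
print(sorted(idx))  # → ['Openmoko', 'Wikipedia']
```

To read an article you would then seek(idx["Openmoko"]) in the dump and parse forward until the matching </page>. (A real index over a 5 GB file would go to disk, e.g. a sorted file or a dbm/SQLite table, rather than an in-memory dict.)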
There's also this, which takes an XML dump as produced by Wikipedia every two
months or so, processes it in a few hours to produce an index, and will run
anywhere that has PHP, Perl and a couple of other generic utilities:

http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html

I had it on my laptop a while back; it was very usable. No images, though.

_______________________________________________
Openmoko community mailing list
community@lists.openmoko.org
http://lists.openmoko.org/mailman/listinfo/community