On Sun, Sep 1, 2013 at 5:42 PM, Ben Companjen <[email protected]> wrote:
> I created a Python script that reads a dump file and puts the edition
> records in a MySQL database.
>
> It works (when you manually create the tables), but it's very slow:
> 10000 records in about an hour, which means all editions will take
> about 10 days of continuous operation.
>
> Does anybody have a faster way? Is there some script for this in the
> repository?

Are you creating the SQL by hand? You probably want to be emulating the
format used by the MySQL dump utility. That'll make sure it gets loaded
in quickly.

In particular, I suspect it's probably doing live indexing. What you want
to do is load all the data and then index at the end rather than indexing
as you go.

Tom
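For concreteness, here is a rough sketch of the "bulk load now, index later"
approach in Python. It assumes a MyISAM table called "editions" whose columns
match the dump's tab-separated fields, and the MySQLdb driver with
local_infile enabled -- the table name, column layout and credentials are
placeholders, not Open Library's actual schema:

    # Sketch: stage the dump straight into MySQL with LOAD DATA,
    # deferring index maintenance until the load is finished.
    # Assumptions (not from the thread): a MyISAM table named "editions"
    # whose columns match the dump's tab-separated fields, and MySQLdb
    # connected with local_infile enabled.
    import MySQLdb

    def bulk_load(dump_path, db):
        cur = db.cursor()
        # MyISAM only: postpone rebuilding non-unique indexes until ENABLE KEYS.
        cur.execute("ALTER TABLE editions DISABLE KEYS")
        try:
            # LOAD DATA is what mysqlimport uses and is far faster than
            # issuing one INSERT per record from Python.
            cur.execute(
                "LOAD DATA LOCAL INFILE '%s' INTO TABLE editions "
                "FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\n'"
                % dump_path)  # path is trusted here; quote/escape it in real use
        finally:
            cur.execute("ALTER TABLE editions ENABLE KEYS")
        db.commit()

    if __name__ == "__main__":
        db = MySQLdb.connect(host="localhost", user="ol", passwd="secret",
                             db="openlibrary", local_infile=1)
        bulk_load("ol_dump_editions.txt", db)

Note that DISABLE/ENABLE KEYS only defers non-unique index builds and only on
MyISAM; with InnoDB the usual equivalent is to create the secondary indexes
after the load. LOAD DATA LOCAL also needs local_infile enabled on the server.
Either way, bulk loading plus deferred indexing should cut the load time down
substantially compared with one INSERT per record.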
