On 02-Sep-2013, at 11:27 AM, Ben Companjen wrote:

> Well, it is not live indexing - I know that slows things down a lot. :)
> This is the first time I've used the MySQL Python connector; I based
> my script on an example [1].
> I think the bottleneck may be calling commit() after each edition
> record: my hard disk is writing almost continuously. I also use one
> insert statement for each contributor, publisher, identifier, etc.
> I think dynamically building insert statements that cover all of
> these at once may be faster than making so many separate database
> calls.

The best approach would be to create a TSV file with all the data to be
loaded into MySQL and use "LOAD DATA INFILE 'data.txt' INTO TABLE table_name".
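
For example -- a sketch only; the file path, table, and columns are
stand-ins, and for a non-LOCAL LOAD DATA the path must be readable by the
MySQL server itself:

    import csv
    import mysql.connector

    rows = [(1, 'Alice'), (2, 'Bob')]  # stand-in for your parsed data

    # LOAD DATA's defaults expect tab-separated fields, one row per line.
    with open('/tmp/data.txt', 'w') as f:
        writer = csv.writer(f, delimiter='\t', lineterminator='\n')
        writer.writerows(rows)

    conn = mysql.connector.connect(user='ol', database='openlibrary')
    cur = conn.cursor()
    # The path is opened by the mysqld server; use LOAD DATA LOCAL INFILE
    # instead if the file lives on the client machine.
    cur.execute("LOAD DATA INFILE '/tmp/data.txt' INTO TABLE contributor")
    conn.commit()
    cur.close()
    conn.close()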

It might be even faster to split that file into smaller files of 100K lines
each and load them one after another.
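
Something along these lines, assuming the file was first split on the
shell with "split -l 100000 data.txt chunk_" (the paths and table name
are again made up):

    import glob
    import mysql.connector

    conn = mysql.connector.connect(user='ol', database='openlibrary')
    cur = conn.cursor()

    # Load each 100K-line chunk with its own statement and commit.
    for path in sorted(glob.glob('/tmp/chunk_*')):
        cur.execute("LOAD DATA INFILE '%s' INTO TABLE contributor" % path)
        conn.commit()

    cur.close()
    conn.close()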

If you can show me your script, I can suggest improvements.

Anand