Thanks Anand, Bryan.

My guess was right: not committing after every record reduces the load on the
hard disk and speeds things up, even though it is still pretty slow.
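
In case it saves anyone a click, the batching idea boils down to roughly
this (just a sketch, not the actual dump2mysql.py code; the table/column
names, the file name, and the assumption that each dump line is tab-separated
with the JSON record in the last column are mine):

    import MySQLdb  # could equally be mysql.connector, same DB-API calls

    conn = MySQLdb.connect(host="localhost", user="ol", passwd="secret",
                           db="openlibrary", charset="utf8")
    cur = conn.cursor()

    BATCH = 1000  # commit once per 1000 records instead of once per record
    insert = "INSERT INTO editions (ol_key, revision, json) VALUES (%s, %s, %s)"
    rows = []

    for line in open("ol_dump_editions_latest.txt"):
        # assumed dump layout: type, key, revision, last_modified, JSON
        type_, key, revision, modified, record = line.rstrip("\n").split("\t", 4)
        rows.append((key, int(revision), record))
        if len(rows) >= BATCH:
            cur.executemany(insert, rows)
            conn.commit()
            rows = []

    if rows:  # flush whatever is left at the end
        cur.executemany(insert, rows)
        conn.commit()

Committing in batches like this (or turning off autocommit and committing
only at the end) keeps MySQL from flushing to disk on every single insert,
which seems to be where most of the time went.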

Anyway, I put the script in the OL Dump scripts collection:
https://github.com/bencomp/oldumpscripts/blob/master/dump2mysql.py
next to the existing dump2csv.py:
https://github.com/bencomp/oldumpscripts/blob/master/dump2csv.py

Neither script converts everything in every record, but extending them is
mostly copy/paste from what's already there.
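
For example, pulling a few more fields out of each record before inserting
would look something like this (again only a sketch; the field names are the
usual edition JSON keys as I remember them, so check them against a real
record before relying on this):

    import json

    def extract_fields(record_json):
        """Pick a few fields out of an edition's JSON blob."""
        record = json.loads(record_json)
        title = record.get("title")
        publish_date = record.get("publish_date")
        isbns = record.get("isbn_13", []) + record.get("isbn_10", [])
        return title, publish_date, isbns

Add matching columns to the table and to the INSERT statement and the rest
is the same loop as above.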

Ben


On 1 September 2013 23:45, Bryan Fordham <[email protected]> wrote:

> If you want to post your script, perhaps someone can speed it up.
>
> Thanks,
> --B
>
> Sent from my phone. Please excuse any spelling or grammar errors.
> On Sep 1, 2013 5:42 PM, "Ben Companjen" <[email protected]> wrote:
>
>> Hi all,
>>
>> I created a Python script that reads a dump file and puts the edition
>> records in a MySQL database.
>>
>> It works (when you manually create the tables), but it's very slow:
>> 10000 records in about an hour, which means all editions will take
>> about 10 days of continuous operation.
>>
>> Does anybody have a faster way? Is there some script for this in the
>> repository?
>>
>> Regards,
>>
>> Ben
_______________________________________________
Ol-tech mailing list
[email protected]
http://mail.archive.org/cgi-bin/mailman/listinfo/ol-tech
To unsubscribe from this mailing list, send email to 
[email protected]
