Hello,
I have a huge dataset of around 1E7 objects with references and
inheritance. The problem is how to import it into the database in a
reasonable time.

I was using the classical EntityManager.persist(object) with a commit
every 10000 records. OpenJPA is doing great; the SQL queries execute
very fast.
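For reference, the import loop looks roughly like this (the entity
class Item and the persistence-unit name are placeholders, not my real
names):

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

EntityManagerFactory emf = Persistence.createEntityManagerFactory("import-pu");
EntityManager em = emf.createEntityManager();
em.getTransaction().begin();
int i = 0;
for (Item item : items) {
    em.persist(item);
    if (++i % 10000 == 0) {
        em.getTransaction().commit(); // commit every 10000 records
        em.clear();                   // drop managed instances to keep memory flat
        em.getTransaction().begin();
    }
}
em.getTransaction().commit();
em.close();
```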

The problem is the database commit. On Derby it runs at around 2000
obj/second. MySQL with MyISAM is better at 5000 obj/second, but that is
still too slow.

There may be some possible optimizations, but I am not sure how to do them with OpenJPA:

1) Create all indexes AFTER the data import. Currently I am using the
openjpa.jdbc.SynchronizeMappings property to create my schema. Is there
a way to create the schema without indexes?
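For context, this is roughly how the schema is created now in
persistence.xml (the buildSchema value is the common form; my exact
options may differ):

```xml
<property name="openjpa.jdbc.SynchronizeMappings"
          value="buildSchema(ForeignKeys=true)"/>
```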

2) Use CSV files for the import. Can OpenJPA export the data to files
and then let the database do a batch import?
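To illustrate what I mean, the hand-rolled route would look something
like this (the table and column names are made up for the example; the
LOAD DATA statement is MySQL's bulk loader and is only shown as a
string here, not executed):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class CsvExport {

    // Write rows as plain comma-separated lines so a database bulk
    // loader can pick the file up afterwards.
    static void writeCsv(Path file, List<String[]> rows) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (String[] row : rows) {
            sb.append(String.join(",", row)).append('\n');
        }
        Files.write(file, sb.toString().getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        Path file = Paths.get("objects.csv");
        writeCsv(file, List.of(
                new String[]{"1", "first"},
                new String[]{"2", "second"}));

        // MySQL could then bulk-load the file; my_table and the
        // column names are placeholders:
        String load = "LOAD DATA LOCAL INFILE 'objects.csv' INTO TABLE my_table "
                + "FIELDS TERMINATED BY ',' (id, name)";
        System.out.println(load);
    }
}
```

The export side is trivial; the open question is whether OpenJPA can
produce such files itself instead of going through persist().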

3) Use my own CSV files. But I would prefer not to lose database-type
independence, and keeping the dependencies generated by JPA could be a
problem...



Thanks for any advice,
Jan Kotek
