Hi,

Is anyone here using DDL-Utils with significant volumes of data 
(a data.xml of a few MB)?
My data.xml file is 44,686 lines and roughly 4,734 KB of data.

I am exporting data from a PostgreSQL database to Derby database.
When inserting into a database without any foreign keys, primary keys, or 
indexes, it is fast (actually faster with DDL-Utils than with insert scripts 
run through ij): a couple of minutes (about 40 s for DDL-Utils).

But when running with keys and indexes in place, DDL-Utils takes 2 hours!

Running DbUnit takes only 5 to 6 minutes!

So I am back to my earlier idea: it would be nice if the export generated the 
data file in a particular order, so that during import I would not have to use 
the ensureForeignKeyOrder="true" option.
(see 
http://mail-archives.apache.org/mod_mbox/db-ddlutils-user/200701.mbox/[EMAIL 
PROTECTED] )
Ideally, DDL-Utils would work out the table order itself by reading the 
dependencies (foreign keys) at the schema level.
But at the very least, if I could give DDL-Utils the table order myself, that 
would be great... otherwise I will have to stop using DDL-Utils for data, 
which would be a shame!
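For what it's worth, the ordering described above is just a topological sort of the schema's foreign-key graph: every parent (referenced) table must be loaded before the tables that reference it. A minimal sketch using Kahn's algorithm (the table names, the `(child, parent)` pair format, and the `table_load_order` function are mine for illustration, not part of the DDL-Utils API):

```python
from collections import defaultdict, deque

def table_load_order(tables, foreign_keys):
    """Order tables so every referenced (parent) table comes before
    the tables that reference it, using Kahn's algorithm.

    foreign_keys is a list of (child_table, parent_table) pairs,
    one per foreign-key constraint."""
    dependents = defaultdict(list)          # parent -> child tables
    in_degree = {t: 0 for t in tables}      # number of unsatisfied parents
    for child, parent in foreign_keys:
        dependents[parent].append(child)
        in_degree[child] += 1

    # start with tables that reference nothing
    queue = deque(t for t in tables if in_degree[t] == 0)
    order = []
    while queue:
        t = queue.popleft()
        order.append(t)
        for child in dependents[t]:
            in_degree[child] -= 1
            if in_degree[child] == 0:
                queue.append(child)

    if len(order) != len(tables):
        # cyclic foreign keys cannot be ordered this way; they would
        # need deferred constraint checking instead
        raise ValueError("cyclic foreign-key dependencies")
    return order

# hypothetical schema: orders references customers,
# order_lines references orders
tables = ["order_lines", "orders", "customers"]
fks = [("orders", "customers"), ("order_lines", "orders")]
print(table_load_order(tables, fks))
# -> ['customers', 'orders', 'order_lines']
```

Emitting the data.xml table sections in this order would let the import run as plain sequential inserts, without the per-row reordering work that ensureForeignKeyOrder seems to cost.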
Have fun,
[EMAIL PROTECTED]
The Computing Froggy