On Wed, May 21, 2008 at 10:24 AM, Lubomir Sterba <[EMAIL PROTECTED]> wrote:
> Hello,
> as I wrote in a previous mail, we are trying to export a relatively large
> database with DdlUtils and migrate it from Oracle to PostgreSQL. After some
> problems with the Ant tasks and the schema definition (which I think are
> caused by a bug in the Oracle JDBC driver), we used the DdlUtils Java API
> and exported the table data, table by table. After that quick operation we
> tried to import the data into Postgres using the DdlUtils DataReader API.
> That was a big problem: a table with about 1M records had still not finished
> importing after about 8 hours. After profiling the DdlUtils code, where
> about 50% of the CPU time was spent in commons-digester code, this forced
> us to write our own SAX-based import. It is not yet in a state that can be
> published (the datatype conversion is written only for our purposes, etc.),
> but it inserts 1M records in two minutes, as opposed to many hours with the
> digester API used by DataReader (we drop the constraints and indexes and
> rebuild them with DdlUtils after the data load). DdlUtils is superlative for
> manipulating data structures, but for data manipulation I think it is
> somewhat lacking for bigger databases (about 10GB gzipped in DdlUtils .xml).
> Still, there are enough workarounds to use the tool, and we consider it
> very useful. Many thanks to the authors.
> Lubos
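For anyone hitting the same wall, here is a minimal sketch of the kind of
SAX-based batch loader Lubos describes. It assumes DdlUtils data XML of the
form <data><mytable col1="..." col2="..."/>...</data>, i.e. one element per
row named after the table with columns as attributes; the SaxBulkLoader class
and the table/column names are hypothetical, and a real loader would convert
values to the proper JDBC types from the DdlUtils model instead of calling
setString throughout.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import org.xml.sax.Attributes;
    import org.xml.sax.SAXException;
    import org.xml.sax.helpers.DefaultHandler;

    // Streams row elements straight into batched JDBC inserts instead of
    // materializing bean objects through commons-digester.
    public class SaxBulkLoader extends DefaultHandler {
        private static final int BATCH_SIZE = 1000;

        private final Connection conn;
        private final String table;      // row element name, assumed = table name
        private final String[] columns;  // column order for the INSERT
        private PreparedStatement stmt;
        private int pending;

        public SaxBulkLoader(Connection conn, String table, String[] columns)
                throws SQLException {
            this.conn = conn;
            this.table = table;
            this.columns = columns;
            // Build "INSERT INTO table (c1, c2, ...) VALUES (?, ?, ...)"
            StringBuilder sql = new StringBuilder("INSERT INTO ").append(table).append(" (");
            for (int i = 0; i < columns.length; i++) {
                sql.append(i > 0 ? ", " : "").append(columns[i]);
            }
            sql.append(") VALUES (");
            for (int i = 0; i < columns.length; i++) {
                sql.append(i > 0 ? ", ?" : "?");
            }
            sql.append(")");
            conn.setAutoCommit(false);  // single commit at the end of the document
            this.stmt = conn.prepareStatement(sql.toString());
        }

        @Override
        public void startElement(String uri, String localName, String qName,
                                 Attributes attrs) throws SAXException {
            if (!table.equals(qName)) {
                return;  // skip the <data> root and rows belonging to other tables
            }
            try {
                for (int i = 0; i < columns.length; i++) {
                    // A real loader would convert to the column's JDBC type here.
                    stmt.setString(i + 1, attrs.getValue(columns[i]));
                }
                stmt.addBatch();
                if (++pending >= BATCH_SIZE) {
                    stmt.executeBatch();
                    pending = 0;
                }
            } catch (SQLException e) {
                throw new SAXException(e);
            }
        }

        @Override
        public void endDocument() throws SAXException {
            try {
                if (pending > 0) {
                    stmt.executeBatch();  // flush the final partial batch
                }
                conn.commit();
            } catch (SQLException e) {
                throw new SAXException(e);
            }
        }
    }

It would be driven with the standard JAXP parser, e.g.:

    javax.xml.parsers.SAXParserFactory.newInstance().newSAXParser()
        .parse(dataFile, new SaxBulkLoader(conn, "mytable",
                                           new String[] {"col1", "col2"}));

Dropping the constraints and indexes before the load and rebuilding them
afterwards, as Lubos describes, is what lets the batched inserts run fast.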
Feel free to file an issue in DdlUtils' JIRA and attach a patch there; I'd be happy to look at it and incorporate your performance improvements into DdlUtils.
Tom
