I think importing into an empty table with no indexes, using embedded
Derby, is the way to go. 2 million doesn't sound like a lot, but of
course that depends on the size of the records (2 million 2 GB records
would be a lot). Do you have an estimate of the row size? What kind of
CPU and what kind of disk are being used?
If possible I would run a test with a much smaller dataset, say
10,000 records, to get an idea of the speed to expect. That will also
flush out any format errors in the file.
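As a rough illustration, such a test could be run from ij along the
lines of the sketch below. The database path, test table layout, and
sample file name are hypothetical placeholders; only the
SYSCS_IMPORT_TABLE call itself appears in this thread:

  -- minimal sketch: boot an embedded database and time a small import
  connect 'jdbc:derby:testdb;create=true';
  -- hypothetical two-column layout; use the real SSJ_CNT DDL here
  CREATE TABLE SSJ_CNT (ID INT, CNT_VALUE VARCHAR(100));
  -- ssj_cnt_10k.csv = the first 10,000 rows of the full dump
  CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
    (null, 'SSJ_CNT', 'ssj_cnt_10k.csv', null, null, null, 0);

Timing that run gives a rough per-row cost that can be scaled up to
2 million rows.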
See more comments below:
Sedillo, Derek (Mission Systems) wrote:
Suresh,
There are no foreign keys or triggers on the tables. But there are
indexes which might be slowing things down. So I might drop the indexes
and try running import again.
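For the drop-the-indexes approach, a sketch; the index name and column
are hypothetical, since neither is given in the thread:

  -- drop the index so rows are not indexed one at a time during the load
  DROP INDEX SSJ_CNT_IDX;
  CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
    (null, 'SSJ_CNT', 'SSJ_CNT.csv', null, null, null, 0);
  -- rebuild the index in a single pass once the data is in
  CREATE INDEX SSJ_CNT_IDX ON SSJ_CNT (ID);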
Do you think that 2 million records are too many to use with the import
utility?
Thanks,
Derek
-----Original Message-----
From: Suresh Thalamati [mailto:[EMAIL PROTECTED]
Sent: Wednesday, March 14, 2007 4:00 PM
To: Derby Discussion
Subject: Re: Import data from large Oracle table
Sedillo, Derek (Mission Systems) wrote:
Hello,
I have several large Oracle tables (2+ million records) from which I
need to import data into Derby. Here is what I have tried so far:
1. I have dumped the data to comma-separated flat files.
2. Used the import table utility like this:
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
  (null, 'SSJ_CNT', 'SSJ_CNT.csv', null, null, null, 0);
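For reference, the seven arguments to SYSCS_IMPORT_TABLE are schema
name, table name, input file, column delimiter, character delimiter,
codeset, and the replace flag. The call below simply restates the one
above; the annotations are editorial:

  CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
    (null,          -- schema name (null = current schema)
     'SSJ_CNT',     -- table name
     'SSJ_CNT.csv', -- input file
     null,          -- column delimiter (null = comma)
     null,          -- character delimiter (null = double quote)
     null,          -- codeset (null = system default)
     0);            -- 0 = append (INSERT mode); non-zero = REPLACE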
Import will run slowly if the table has triggers or foreign key
references, or if the table already contains some data. These force
the data to be logged, which makes the import run slow.
If you have foreign keys/triggers, you may want to drop them and add
them back after the data is imported. If you do that, import might run
faster.
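A sketch of that approach; the constraint name, column, and referenced
table are hypothetical, since none are given in the thread (and Derek
reports having no foreign keys or triggers on these tables):

  -- drop the foreign key so each row is not checked during the load
  ALTER TABLE SSJ_CNT DROP CONSTRAINT SSJ_CNT_FK;
  CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
    (null, 'SSJ_CNT', 'SSJ_CNT.csv', null, null, null, 0);
  -- re-add the constraint once the data is loaded
  ALTER TABLE SSJ_CNT ADD CONSTRAINT SSJ_CNT_FK
    FOREIGN KEY (PARENT_ID) REFERENCES PARENT_TBL (ID);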
3. After 4 hours of running, it appears to have frozen up.
What does "frozen up" mean? No CPU activity? No I/O? Did you monitor
the messages in derby.log?
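One way to get more visibility is to turn on Derby's statement logging
before starting the import, so derby.log at least records when the
statement began. derby.language.logStatementText is a standard Derby
property; using it this way is offered only as a suggestion:

  -- log each statement (and its completion) to derby.log
  CALL SYSCS_UTIL.SYSCS_SET_DATABASE_PROPERTY
    ('derby.language.logStatementText', 'true');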
There are both a db.lck and a dbex.lck file present, which I will
delete soon.
What JVM are you using? With modern JVMs you should NEVER have to
delete these files by hand, and doing so can defeat Derby's ability to
prevent database corruption.
These files are not related to import, and you should not be deleting
them. They are used to prevent multiple JVM instances from booting
the same database concurrently.
Do you think that 2 million records are just too many for this utility
to handle? Is there a better way to transfer data from another database
besides the IMPORT_TABLE utility?
I have not tried it with 2 million rows, but IMPORT_TABLE is the
fastest way to transfer data from files into a Derby database.
hope that helps
-suresh