Hi, I have an application whose output is about 500,000 (string, integer)
pairs -- the result of some fairly fancy text processing.
I'd like to put this data into a (new) Derby table. Using an individual
INSERT for each row takes over an hour, which seems much too long.
Using the bulk import feature involves writing out to a file and then
importing from that file, which seems rather roundabout.
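For reference, my per-row insert loop looks roughly like the sketch
below. The table name PAIRS and the column names are just placeholders,
not my actual schema, and I've left auto-commit at its default:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.Map;

    public class PairLoader {
        // Insert each (string, integer) pair with its own statement
        // execution. PAIRS / TEXT_KEY / COUNT_VAL are placeholder names.
        public static void load(Map<String, Integer> pairs) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:derby:pairsdb;create=true");
                 PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO PAIRS (TEXT_KEY, COUNT_VAL) VALUES (?, ?)")) {
                // auto-commit is left at the default, so each row is
                // its own transaction -- presumably part of the slowness
                for (Map.Entry<String, Integer> e : pairs.entrySet()) {
                    ps.setString(1, e.getKey());
                    ps.setInt(2, e.getValue());
                    ps.executeUpdate();
                }
            }
        }
    }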

So... What is the recommended way to insert a large number of rows
from an application? Is the answer the same for 10^3 or 10^8 rows? Do
the data types involved (e.g., a large text field with newlines) make
any difference to the answer?

Thanks,

         -s
