Here are some ideas:

- Use a single PreparedStatement. Derby generates a new Java class for each SQL statement, which is costly, so reusing one prepared statement is much faster.
- Use a single transaction: disable auto-commit, and commit just once when you are done inserting.
- Use batch inserts. Not sure whether this makes a difference in Derby.
- Create indices on the table after (not before) inserting the data.
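A minimal JDBC sketch combining the first three points. The table and column names (word_counts, word, n) are hypothetical, as is the 1000-row flush interval; you would tune that for your data:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Map;

public class BulkInsert {
    // Hypothetical schema: CREATE TABLE word_counts (word VARCHAR(255), n INT)
    static final String SQL = "INSERT INTO word_counts (word, n) VALUES (?, ?)";

    static void insertAll(Connection conn, Map<String, Integer> pairs) throws SQLException {
        conn.setAutoCommit(false);                      // one transaction for the whole load
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {  // compiled once, reused
            int count = 0;
            for (Map.Entry<String, Integer> e : pairs.entrySet()) {
                ps.setString(1, e.getKey());
                ps.setInt(2, e.getValue());
                ps.addBatch();                          // queue the row instead of executing it
                if (++count % 1000 == 0) {
                    ps.executeBatch();                  // flush every 1000 rows
                }
            }
            ps.executeBatch();                          // flush the remainder
            conn.commit();                              // commit once at the end
        } catch (SQLException ex) {
            conn.rollback();
            throw ex;
        }
    }
}
```

After the load completes and is committed, you would then issue the CREATE INDEX statements.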
Jim

> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Stavros Macrakis
> Sent: Wednesday, December 12, 2007 11:06 AM
> To: [email protected]
> Subject: Efficient loading of calculated data
>
> Hi, I have an application whose output is about 500,000 pairs (string,
> integer) -- this is the result of some fairly fancy text processing.
> I'd like to put this data into a (new) Derby table. Using individual
> Inserts for each row takes over an hour, which seems much too long.
> Using the bulk import feature involves writing out to a file and then
> importing from the file, which seems rather roundabout.
>
> So... What is the recommended way to insert a large number of rows
> from an application? Is the answer the same for 10^3 or 10^8 rows? Do
> the data types involved (e.g. large text field with newlines) make any
> difference to the answer?
>
> Thanks,
>
>           -s
