I'm using embedded Derby in a Java SE application, but I run Derby as a network server while building the database, and that is where I ran into problems. I have 65000 rows that I want to add to a Derby database (total size before the insert: around 10000 records, 30 MB). Each row consists of some strings and some floats (6 fields in total), and adding one row requires two inserts and one select. During the bulk insert I run Derby as a network server, and the problem is that no matter what I do I keep getting OutOfMemory exceptions (Derby runs out of heap space). Setting the maximum heap of the JVM to 1 GB allowed me to add around 17000 rows, but no more than that.
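To make the workload concrete, the per-row work against the network server looks roughly like the sketch below. The table and column names are simplified placeholders (my real schema is larger), and I assume the tables already exist and derbyclient.jar is on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class NetworkServerLoad {
    public static void main(String[] args) throws Exception {
        // Derby network client URL; host, port and database name are placeholders
        try (Connection conn = DriverManager.getConnection(
                "jdbc:derby://localhost:1527/mydb")) {

            // Simplified stand-ins for my real statements: one select and two inserts per row
            try (PreparedStatement lookup = conn.prepareStatement(
                         "SELECT id FROM categories WHERE name = ?");
                 PreparedStatement insertMeasurement = conn.prepareStatement(
                         "INSERT INTO measurements (category_id, label, f1, f2) VALUES (?, ?, ?, ?)");
                 PreparedStatement insertIndex = conn.prepareStatement(
                         "INSERT INTO measurement_index (label) VALUES (?)")) {

                for (int i = 0; i < 65000; i++) {   // roughly 65000 source rows
                    // select: look up a reference id for this row
                    lookup.setString(1, "someCategory");
                    int categoryId;
                    try (ResultSet rs = lookup.executeQuery()) {
                        rs.next();
                        categoryId = rs.getInt(1);
                    }
                    // first insert
                    insertMeasurement.setInt(1, categoryId);
                    insertMeasurement.setString(2, "row" + i);
                    insertMeasurement.setFloat(3, 1.0f);
                    insertMeasurement.setFloat(4, 2.0f);
                    insertMeasurement.executeUpdate();
                    // second insert
                    insertIndex.setString(1, "row" + i);
                    insertIndex.executeUpdate();
                }
            }
        }
    }
}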
The problem was "solved" by running Derby embedded in the program that inserts the data and committing only every 50th row; with Derby embedded I didn't run out of memory. After adding all 65000 records I can still run Derby as a network server and do queries, but there are still problems when trying to insert data that way. There are no problems when running Derby embedded. Has anybody experienced the same problem/behaviour? I also see a decrease in performance after inserting many rows, but the slow-down is not as bad as described in the thread "exponential increase in insert time". Apart from the problems with bulk inserts, embedded Derby works great in my application.
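The embedded workaround is roughly the following sketch (again with placeholder table and column names, and with the select and second insert left out for brevity). The two differences from the version above are the embedded JDBC URL and committing every 50th row instead of relying on autocommit:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class EmbeddedLoad {
    public static void main(String[] args) throws Exception {
        // Embedded driver URL instead of the network client; database path is a placeholder
        try (Connection conn = DriverManager.getConnection("jdbc:derby:mydb")) {
            conn.setAutoCommit(false);   // commit in batches myself

            try (PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO measurements (label, f1, f2, f3) VALUES (?, ?, ?, ?)")) {
                for (int i = 0; i < 65000; i++) {
                    insert.setString(1, "row" + i);
                    insert.setFloat(2, 1.0f);
                    insert.setFloat(3, 2.0f);
                    insert.setFloat(4, 3.0f);
                    insert.executeUpdate();

                    if (i % 50 == 49) {   // commit every 50th row
                        conn.commit();
                    }
                }
                conn.commit();            // commit whatever is left
            }
        }
    }
}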
Thanks - Ture