The result of distributing the application and the database over two machines is: import of the critical data now works, with no more hanging at 1200 records (I need to start Derby with -Xmx1024m, otherwise an OutOfMemoryError is thrown). My import algorithm has to be reviewed: it allocates memory inside a loop, about 100 MB when looping 1300 times. The original ordering problem does not appear at this point.
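For what it's worth, the usual way to keep per-row allocations from piling up in such a loop is to flush in fixed-size batches and let the accumulated rows go after each flush. A minimal sketch of that pattern (the row type and batch size are made up; in the real import the flush would be a PreparedStatement.executeBatch() plus a commit):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImport {
    static final int BATCH_SIZE = 100;

    // Walks the rows and flushes every BATCH_SIZE entries, returning the
    // number of flushes. In a JDBC import, each flush would be
    // stmt.executeBatch() followed by conn.commit(), so the driver and the
    // batch list never hold more than BATCH_SIZE rows at once.
    static int importRows(List<String[]> rows) {
        List<String[]> batch = new ArrayList<>();
        int flushes = 0;
        for (String[] row : rows) {
            batch.add(row);
            if (batch.size() == BATCH_SIZE) {
                // executeBatch(); commit();
                batch.clear();   // release the accumulated rows
                flushes++;
            }
        }
        if (!batch.isEmpty()) {   // flush the final partial batch
            batch.clear();
            flushes++;
        }
        return flushes;
    }

    public static void main(String[] args) {
        List<String[]> rows = new ArrayList<>();
        for (int i = 0; i < 1300; i++) {
            rows.add(new String[] { "row" + i });
        }
        // 1300 rows in batches of 100 -> 13 flushes
        System.out.println(importRows(rows));
    }
}
```

With that structure the memory footprint stays roughly constant regardless of how many records the spreadsheet has.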
Now the less good news. After work, I turned off the computer where Derby is installed. I know Derby expects a clean shutdown, but I just pressed Ctrl-C in its terminal window and shut down the OS.
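For reference, the network server can be taken down cleanly instead; something like the following (assuming the default host/port and derbynet.jar on the classpath):

```
java org.apache.derby.drda.NetworkServerControl shutdown
```

An embedded database can instead be shut down by connecting with the `shutdown=true` attribute (`jdbc:derby:;shutdown=true`), which signals success by throwing an SQLException.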
This morning I booted the computer and started Derby again, and was surprised by the memory consumption shown in the task list: the Derby process was consuming the complete 1024 MB, with a resident part of 30 MB (the OS is SUSE 9.3). In this situation I tried an import of another spreadsheet with 1220 records. The import worked, but when I displayed the newly imported data in my application, the data were listed slowly - and the original ordering problem appeared. To check whether the composite index on the sorting columns was corrupted, I dropped and recreated the index. Same ordering problem. Perhaps cancelling the Derby server in the evening caused the problem; I will redo the whole thing without stopping the server.
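For completeness, the drop/recreate was the plain DDL variant; the index and column names below are placeholders, not my real schema:

```
DROP INDEX sort_idx;
CREATE INDEX sort_idx ON mytable (sort_col1, sort_col2);
```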