Rajes Akkineni wrote:
Hi,
I have a similar problem: I was unable to insert larger files into the database.
But my situation is a little different. I am not inserting the whole
40 MB file into one blob; I split it into 64 KB chunks and insert them
as multiple rows. I still get an OutOfMemoryError.
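The chunked-insert pattern described above can be sketched in JDBC roughly as follows. This is a minimal sketch, not code from the thread: the table and column names (CHUNKS, FILE_ID, SEQ, DATA) and the periodic-commit interval are assumptions. Streaming each chunk with setBinaryStream, instead of materializing the whole file, helps keep memory use bounded.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkedInsert {
    static final int CHUNK_SIZE = 64 * 1024;

    // Pure helper: split a byte array into CHUNK_SIZE pieces.
    static List<byte[]> split(byte[] data) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < data.length; off += CHUNK_SIZE) {
            chunks.add(Arrays.copyOfRange(data, off,
                    Math.min(off + CHUNK_SIZE, data.length)));
        }
        return chunks;
    }

    // Insert one row per chunk, streaming each chunk so the whole
    // file is never held in memory at once. Table and column names
    // here are illustrative assumptions, not from the thread.
    static void insertChunks(Connection conn, int fileId, InputStream in)
            throws Exception {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO CHUNKS (FILE_ID, SEQ, DATA) VALUES (?, ?, ?)")) {
            byte[] buf = new byte[CHUNK_SIZE];
            int seq = 0;
            int n;
            while ((n = in.read(buf)) > 0) {
                ps.setInt(1, fileId);
                ps.setInt(2, seq++);
                ps.setBinaryStream(3, new ByteArrayInputStream(buf, 0, n), n);
                ps.executeUpdate();
                if (seq % 100 == 0) {
                    conn.commit();  // bound the work held per transaction
                }
            }
        }
        conn.commit();
    }
}
```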
Are you using the embedded driver or the network client driver? I think
the issue discussed in this earlier thread refers to the client driver
(DERBY-326). But if you are seeing the problem with the embedded
driver, it would be great if you could post a reproduction program so
we can take a look.
I have tested inserting 1000 rows of 64 KB blobs on my T40 laptop, and it
works fine both with the default JVM heap size and with the max heap
restricted to 40 MB. Note that by default, for a table containing a blob
column, the page size is 32 KB and the page cache is 1000 pages, so the
page cache alone takes about 32 MB of memory.
Now I seem to have another issue with Derby.
When I insert and select data using prepared statements (they are all
created once and then used from different threads), sometimes I cannot
see the data I have inserted. I checked the insert operation; it is
successful, but when I later query from a different thread, the table
shows no rows.
Are you running with autocommit off? If so, it might be good to check
whether a commit was issued after the insert.
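With autocommit off, rows inserted on one connection are not visible to other transactions until commit() is called, which would explain an insert that "succeeds" but shows no rows from another thread. A minimal sketch of the pattern to check for; the table T, its NAME column, and the helper name are illustrative assumptions, not from the thread:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CommitCheck {
    // Insert a row and commit explicitly. With autocommit off, omitting
    // the commit() leaves the row invisible to other transactions, even
    // though executeUpdate() itself succeeded.
    static void insertAndCommit(Connection conn, String name) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO T (NAME) VALUES (?)")) {
            ps.setString(1, name);
            ps.executeUpdate();
            conn.commit();  // without this, other threads will not see the row
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        }
    }
}
```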
Sunitha.