Re: [Neo4j] OutOfMemory while populating large graph

2010-07-10 Thread Mattias Persson
Great, so maybe neo4j-index should be updated to depend on Lucene 2.9.3. 2010/7/9 Bill Janssen jans...@parc.com Note that a couple of memory issues are fixed in Lucene 2.9.3. Leaking when indexing big docs, and indolent reclamation of space from the FieldCache. Bill Arijit Mukherjee
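Until neo4j-index itself is updated, the transitive Lucene dependency can be overridden from the application's own build. A sketch, assuming a Maven build (the coordinates are Lucene's real ones; where you place the override in your pom is up to your project):

```xml
<!-- Force Lucene 2.9.3 (contains the memory-leak fixes mentioned above)
     ahead of whatever version neo4j-index pulls in transitively. -->
<dependency>
    <groupId>org.apache.lucene</groupId>
    <artifactId>lucene-core</artifactId>
    <version>2.9.3</version>
</dependency>
```

Maven's "nearest wins" dependency mediation means a direct dependency declared this way takes precedence over the transitive one.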

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Mattias Persson
1:35 PM To: (User@lists.neo4j.org) Subject: [Neo4j] OutOfMemory while populating large graph I have seen people discuss committing transactions after some microbatch of a few hundred records, but I thought this was optional. I thought Neo4J would automatically write out to disk as memory

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Marko Rodriguez
Hi, Would it actually be worth something to be able to begin a transaction which auto-commits stuff every X write operations, like a batch inserter mode which can be used in a normal EmbeddedGraphDatabase? Kind of like: graphDb.beginTx( Mode.BATCH_INSERT ) ...so that you can start such
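No `Mode.BATCH_INSERT` exists in the API; it is only a proposal in this thread. A user-side wrapper with roughly the proposed behaviour can be sketched against the 1.x embedded API. The class name, the `write()` call-per-operation convention, and the batch size are all hypothetical:

```java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Transaction;

// Hypothetical wrapper: commits and reopens the underlying transaction
// every `batchSize` write operations, approximating the proposed
// Mode.BATCH_INSERT without any change to the Neo4j API itself.
public class BatchingTx {
    private final GraphDatabaseService graphDb;
    private final int batchSize;
    private Transaction tx;
    private int writes;

    public BatchingTx(GraphDatabaseService graphDb, int batchSize) {
        this.graphDb = graphDb;
        this.batchSize = batchSize;
        this.tx = graphDb.beginTx();
    }

    // True when `writes` operations exactly fill a batch.
    static boolean batchBoundary(int writes, int batchSize) {
        return writes % batchSize == 0;
    }

    // Call once after each write operation (createNode, setProperty, ...).
    public void write() {
        if (batchBoundary(++writes, batchSize)) {
            tx.success();
            tx.finish();            // commit, freeing the tx state held on the heap
            tx = graphDb.beginTx(); // open the next batch
        }
    }

    public void close() {
        tx.success();
        tx.finish();
    }
}
```

The caller would wrap its import loop with `BatchingTx batch = new BatchingTx(graphDb, 10000)` and invoke `batch.write()` after each mutation, then `batch.close()` at the end.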

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Arijit Mukherjee
I've a similar problem. Although I'm not going out of memory yet, I can see the heap constantly growing, and JProfiler says most of it is due to the Lucene indexing. And even if I do the commit after every X transactions, once the population is finished, the final commit is done, and the graph db

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Rick Bullotta
), a time (every 30 seconds), or on a memory usage rule. -Original Message- From: user-boun...@lists.neo4j.org [mailto:user-boun...@lists.neo4j.org] On Behalf Of Mattias Persson Sent: Friday, July 09, 2010 7:30 AM To: Neo4j user discussions Subject: Re: [Neo4j] OutOfMemory while populating large

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Paul A. Jackson
09, 2010 7:30 AM To: Neo4j user discussions Subject: Re: [Neo4j] OutOfMemory while populating large graph 2010/7/9 Marko Rodriguez okramma...@gmail.com Hi, Would it actually be worth something to be able to begin a transaction which auto-commits stuff every X write operations, like a batch

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Bill Janssen
Note that a couple of memory issues are fixed in Lucene 2.9.3. Leaking when indexing big docs, and indolent reclamation of space from the FieldCache. Bill Arijit Mukherjee ariji...@gmail.com wrote: I've a similar problem. Although I'm not going out of memory yet, I can see the heap

[Neo4j] OutOfMemory while populating large graph

2010-07-08 Thread Paul A. Jackson
I have seen people discuss committing transactions after some microbatch of a few hundred records, but I thought this was optional. I thought Neo4J would automatically write out to disk as memory became full. Well, I encountered an OOM and want to make sure that I understand the reason. Was
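The microbatch pattern the thread refers to looks roughly like the following against the 1.x embedded API. This is a sketch, not a definitive answer to the OOM question: the store path, batch size, and node payload are illustrative, and the key point is that uncommitted transaction state lives on the heap until the transaction finishes:

```java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Transaction;
import org.neo4j.kernel.EmbeddedGraphDatabase;

public class MicrobatchImport {
    static final int BATCH_SIZE = 10000; // illustrative microbatch size

    // True when the (zero-based) write index i closes a batch.
    static boolean commitPoint(int i, int batchSize) {
        return (i + 1) % batchSize == 0;
    }

    public static void main(String[] args) {
        GraphDatabaseService graphDb = new EmbeddedGraphDatabase("target/graphdb");
        Transaction tx = graphDb.beginTx();
        try {
            for (int i = 0; i < 1000000; i++) {
                Node node = graphDb.createNode();
                node.setProperty("id", i);
                if (commitPoint(i, BATCH_SIZE)) {
                    tx.success();
                    tx.finish();            // commit, releasing the batch's heap state
                    tx = graphDb.beginTx(); // start the next microbatch
                }
            }
            tx.success();
        } finally {
            tx.finish();
            graphDb.shutdown();
        }
    }
}
```

Without the periodic `finish()`/`beginTx()` pair, the whole million-write transaction is held in memory at once, which is one plausible source of the OOM described here.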

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-08 Thread Rick Bullotta
A. Jackson Sent: Thursday, July 08, 2010 1:35 PM To: (User@lists.neo4j.org) Subject: [Neo4j] OutOfMemory while populating large graph I have seen people discuss committing transactions after some microbatch of a few hundred records, but I thought this was optional. I thought Neo4J would automatically write