Re: [Neo4j] Memory overflow while creating big graph

2011-08-23 Thread Mattias Persson
Could you just quickly look at where most of the time is spent when it's slowing down? Just start VisualVM, attach to the process and monitor CPU. 2011/8/16 Jose Vinicius Pimenta Coletto jvcole...@gmail.com: Hi, I made some changes to use the BatchInserter to generate the initial database. The

[Neo4j] Memory overflow while creating big graph

2011-08-16 Thread Jose Vinicius Pimenta Coletto
Hi, I made some changes to use the BatchInserter to generate the initial database. The strategy is to identify all nodes that must be inserted, and after doing this I create the edges. But I am still having problems: after inserting 9M nodes the run becomes very slow and never reaches the edges
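
For context, the two-phase approach described above looks roughly like this with the 1.x BatchInserter API; the store path, property names and relationship type below are placeholders, not the actual import code from this thread:

    import org.neo4j.graphdb.DynamicRelationshipType;
    import org.neo4j.graphdb.RelationshipType;
    import org.neo4j.helpers.collection.MapUtil;
    import org.neo4j.kernel.impl.batchinsert.BatchInserter;
    import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;

    public class BatchImportSketch {
        public static void main(String[] args) {
            BatchInserter inserter = new BatchInserterImpl("target/batch.db");
            RelationshipType KNOWS = DynamicRelationshipType.withName("KNOWS");
            try {
                // Phase 1: create all nodes, keeping the returned ids.
                long a = inserter.createNode(MapUtil.map("name", "a"));
                long b = inserter.createNode(MapUtil.map("name", "b"));
                // Phase 2: wire up the relationships between the stored ids.
                inserter.createRelationship(a, b, KNOWS, null);
            } finally {
                // shutdown() flushes everything to disk; skipping it leaves the store unusable.
                inserter.shutdown();
            }
        }
    }

The batch inserter bypasses transactions entirely, so heap pressure from uncommitted transaction state disappears; the remaining cost is usually index lookups and flushes.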

Re: [Neo4j] Memory overflow while creating big graph

2011-08-16 Thread Anders Nawroth
Hi Jose! The mailing list removed your attachment, could you just paste the code into the mail instead? /anders 2011-08-16 22:55, Jose Vinicius Pimenta Coletto wrote: Hi, I made some changes to use the BatchInserter to generate the initial database. The strategy is to identify all nodes

[Neo4j] Memory overflow while creating big graph

2011-08-16 Thread Jose Vinicius Pimenta Coletto
Sorry, the source code follows: public class InitialDBCreator { private static final SimpleDateFormat DATE_PARSER = new SimpleDateFormat("dd/MM/yyyy"); private static final SimpleDateFormat DATE_FORMATTER = new SimpleDateFormat("yyyyMMdd"); private static final int GRP_DEST_DOC = 1;

Re: [Neo4j] Memory overflow while creating big graph

2011-08-16 Thread Peter Neubauer
Jose, do you have access to a profiler like VisualVM? It could be that the regexp is not scaling - I have seen this in my SQL importer project. Just a thought, would be great if you can measure where the slowdown occurs. /peter Sent from my phone. On Aug 16, 2011 11:09 PM, Jose Vinicius Pimenta
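
On the regexp point: a common reason parsing regexps stop scaling in import loops is that String.matches() recompiles the pattern on every call. A generic illustration, not taken from the importer in question:

    import java.util.regex.Pattern;

    public class RegexInLoop {
        // Compile once and reuse; String.matches() builds a new Pattern per call,
        // which adds up over millions of input lines.
        private static final Pattern DATE = Pattern.compile("\\d{2}/\\d{2}/\\d{4}");

        static boolean isDate(String s) {
            return DATE.matcher(s).matches();
        }

        public static void main(String[] args) {
            System.out.println(isDate("16/08/2011")); // true
            System.out.println(isDate("not a date")); // false
        }
    }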

[Neo4j] Memory overflow while creating big graph

2011-08-03 Thread Jose Vinicius Pimenta Coletto
Hi, I'm trying to create a graph with 15M nodes and 12M relationships, but after inserting 400K relationships the following exception is thrown: Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded. I'm using -Xmx3g and the following configuration file for the graph:
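
The configuration file itself was cut off by the archive. For orientation, the 1.x memory-mapped store buffers are typically tuned with settings like the ones below, passed either through neo4j.properties or programmatically; the values here are placeholders, not the settings from this thread:

    import java.util.Map;
    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.helpers.collection.MapUtil;
    import org.neo4j.kernel.EmbeddedGraphDatabase;

    public class ConfiguredDb {
        public static void main(String[] args) {
            // Mapped-memory buffers live outside the JVM heap, so they compete
            // with -Xmx for physical RAM; size them accordingly.
            Map<String, String> config = MapUtil.stringMap(
                    "neostore.nodestore.db.mapped_memory", "200M",
                    "neostore.relationshipstore.db.mapped_memory", "1G",
                    "neostore.propertystore.db.mapped_memory", "500M",
                    "neostore.propertystore.db.strings.mapped_memory", "500M");
            GraphDatabaseService db = new EmbeddedGraphDatabase("target/graph.db", config);
            try {
                // ... import work ...
            } finally {
                db.shutdown();
            }
        }
    }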

Re: [Neo4j] Memory overflow while creating big graph

2011-08-03 Thread Niels Hoogeveen
Is it possible for you to use the batch inserter, or does the data you are loading require a lot of lookups? Niels From: jvcole...@gmail.com Date: Wed, 3 Aug 2011 17:57:20 -0300 To: user@lists.neo4j.org Subject: [Neo4j] Memory overflow while creating big graph Hi, I'm trying to create

Re: [Neo4j] Memory overflow while creating big graph

2011-08-03 Thread Michael Hunger
Do you commit your transaction in batches (e.g. every 10k nodes)? How much memory does your JVM get, e.g. via -Xmx2G? Cheers Michael On 03.08.2011 at 22:57, Jose Vinicius Pimenta Coletto wrote: Hi, I'm trying to create a graph with 15M nodes and 12M relationships, but after inserting 400K
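
The batched-commit pattern Michael is asking about looks roughly like this with the 1.x embedded API; the batch size and property are illustrative:

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Node;
    import org.neo4j.graphdb.Transaction;
    import org.neo4j.kernel.EmbeddedGraphDatabase;

    public class BatchedCommits {
        private static final int BATCH_SIZE = 10000;

        public static void main(String[] args) {
            GraphDatabaseService db = new EmbeddedGraphDatabase("target/graph.db");
            Transaction tx = db.beginTx();
            try {
                for (int i = 0; i < 1000000; i++) {
                    Node node = db.createNode();
                    node.setProperty("counter", i);
                    if ((i + 1) % BATCH_SIZE == 0) {
                        // Commit and open a fresh transaction so uncommitted
                        // state does not keep piling up on the heap.
                        tx.success();
                        tx.finish();
                        tx = db.beginTx();
                    }
                }
                tx.success();
            } finally {
                tx.finish();
                db.shutdown();
            }
        }
    }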

[Neo4j] Memory overflow while creating big graph

2011-08-03 Thread Jose Vinicius Pimenta Coletto
Niels, before creating any node or relationship I check whether it already exists in the index; can I use the BatchInserter while doing this? Michael, I'm closing my transactions every 5k inserts, but apparently this is not working. I'm running the JVM with -Xmx3g. -- Thanks, Jose Vinicius
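
The batch inserter does support index lookups through its own index API; exact package names shifted a little between 1.x releases, so treat this as a sketch rather than a drop-in answer:

    import java.util.Map;
    import org.neo4j.graphdb.index.BatchInserterIndex;
    import org.neo4j.graphdb.index.BatchInserterIndexProvider;
    import org.neo4j.helpers.collection.MapUtil;
    import org.neo4j.index.impl.lucene.LuceneBatchInserterIndexProvider;
    import org.neo4j.kernel.impl.batchinsert.BatchInserter;
    import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;

    public class BatchInsertWithLookup {
        public static void main(String[] args) {
            BatchInserter inserter = new BatchInserterImpl("target/batch.db");
            BatchInserterIndexProvider indexProvider =
                    new LuceneBatchInserterIndexProvider(inserter);
            BatchInserterIndex people =
                    indexProvider.nodeIndex("people", MapUtil.stringMap("type", "exact"));

            // getSingle() returns null when the key is not indexed yet.
            // Newly added entries only become visible to get() after flush().
            Long existing = people.get("key", "jose").getSingle();
            if (existing == null) {
                Map<String, Object> props = MapUtil.map("key", "jose");
                long nodeId = inserter.createNode(props);
                people.add(nodeId, props);
                people.flush(); // flushing after every insert is slow; batch the flushes
            }

            indexProvider.shutdown();
            inserter.shutdown();
        }
    }

Frequent flushes are the usual reason a lookup-heavy batch insert slows down, so it pays to flush once per chunk rather than once per node.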

Re: [Neo4j] Memory overflow while creating big graph

2011-08-03 Thread Michael Hunger
Jose, can you provide the full stack trace of the OOM? And perhaps share some of the source code you use, so we can try to reproduce it. How much physical RAM does the machine have? Can you show us the configuration dump from the last startup in graph.db/messages.log? Cheers Michael for the batch