Is it possible for you to use the batch inserter, or does the data you are loading require a lot of lookups?

Niels
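For reference, a rough sketch of what the batch inserter route could look like, written against the 1.4-era org.neo4j.kernel.impl.batchinsert API (untested; the store path, mapped-memory values and relationship type are just placeholders, and the package has moved between releases):

import java.util.Map;

import org.neo4j.graphdb.DynamicRelationshipType;
import org.neo4j.graphdb.RelationshipType;
import org.neo4j.helpers.collection.MapUtil;
import org.neo4j.kernel.impl.batchinsert.BatchInserter;
import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;

public class BulkLoad
{
    private static final RelationshipType RELATED =
            DynamicRelationshipType.withName( "RELATED" );

    public static void main( String[] args )
    {
        // Same kind of mapped-memory settings as in the config file below;
        // the batch inserter writes straight to the store files, no transactions.
        Map<String, String> config = MapUtil.stringMap(
                "neostore.nodestore.db.mapped_memory", "256M",
                "neostore.relationshipstore.db.mapped_memory", "1G",
                "neostore.propertystore.db.mapped_memory", "90M",
                "neostore.propertystore.db.strings.mapped_memory", "768M" );

        BatchInserter inserter = new BatchInserterImpl( "target/graph.db", config );
        try
        {
            // Replace this loop with the real data source; the long ids returned
            // by createNode() are what you pass back in to create relationships.
            long previous = inserter.createNode( MapUtil.map( "id", 0L ) );
            for ( long i = 1; i < 1000; i++ )
            {
                long current = inserter.createNode( MapUtil.map( "id", i ) );
                inserter.createRelationship( previous, current, RELATED, null );
                previous = current;
            }
        }
        finally
        {
            // Nothing is durable until shutdown() flushes the stores.
            inserter.shutdown();
        }
    }
}

The catch is that the batch inserter is non-transactional and not thread-safe, and you only get back internal node ids, so if the load has to look up previously inserted nodes by key you would need to keep your own id map (or an index) on the side.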
> From: [email protected]
> Date: Wed, 3 Aug 2011 17:57:20 -0300
> To: [email protected]
> Subject: [Neo4j] Memory overflow while creating big graph
>
> Hi,
>
> I'm trying to create a graph with 15M nodes and 12M relationships, but after
> inserting 400K relationships the following exception is thrown:
> Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
>
> I'm using -Xmx3g and the following configuration file for the graph:
> neostore.nodestore.db.mapped_memory = 256M
> neostore.relationshipstore.db.mapped_memory = 1G
> neostore.propertystore.db.mapped_memory = 90M
> neostore.propertystore.db.index.mapped_memory = 1M
> neostore.propertystore.db.index.keys.mapped_memory = 1M
> neostore.propertystore.db.strings.mapped_memory = 768M
> neostore.propertystore.db.arrays.mapped_memory = 130M
> cache_type = weak
>
> Can anyone help me?
>
> --
> Jose Vinicius Pimenta Coletto
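If the batch inserter is not an option, one thing worth checking in the setup above: if all 15M nodes and 12M relationships are created inside a single transaction, the uncommitted transaction state has to stay on the heap until commit, and that alone can produce exactly this GC overhead error regardless of the mapped-memory settings. Committing in chunks usually keeps the heap flat; a rough, untested sketch against the 1.x embedded API (store path and batch size are placeholders):

import org.neo4j.graphdb.DynamicRelationshipType;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.RelationshipType;
import org.neo4j.graphdb.Transaction;
import org.neo4j.kernel.EmbeddedGraphDatabase;

public class BatchedCommits
{
    private static final RelationshipType RELATED =
            DynamicRelationshipType.withName( "RELATED" );
    private static final int BATCH_SIZE = 50000; // tune to taste

    public static void main( String[] args )
    {
        GraphDatabaseService db = new EmbeddedGraphDatabase( "target/graph.db" );
        Transaction tx = db.beginTx();
        try
        {
            Node previous = db.createNode();
            for ( long i = 1; i < 1000000; i++ )
            {
                Node current = db.createNode();
                previous.createRelationshipTo( current, RELATED );
                previous = current;

                if ( i % BATCH_SIZE == 0 )
                {
                    // Commit and open a fresh transaction so the uncommitted
                    // transaction state does not grow without bound.
                    tx.success();
                    tx.finish();
                    tx = db.beginTx();
                }
            }
            tx.success();
        }
        finally
        {
            tx.finish();
            db.shutdown();
        }
    }
}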

