Hello,

I am trying to import a massive network:
4M nodes, 100M correlations.

I am having memory and performance problems, and I'd like to know if I am 
doing it right:

1. Before loading the correlations, I wanted to load the nodes.

2. Set up neo4j-wrapper and neo4j.properties as written in 
http://www.neo4j.org/graphgist?d788e117129c3730a042

with the JVM heap set to 4096 MB.

With this setting, the bulk load of 4M nodes failed.

3. Raised the min-heap and max-heap to 6144 MB and ran a test with 100K nodes.
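(For reference, this is where the heap is set in a Neo4j 2.x install; a minimal sketch of conf/neo4j-wrapper.conf, assuming the 2.x file layout, with the 6144 MB values used here:)

```
# conf/neo4j-wrapper.conf (Neo4j 2.x) -- values are in MB
wrapper.java.initmemory=6144
wrapper.java.maxmemory=6144
```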

I got:
Nodes created: 98991
Properties set: 197982
Labels added: 98991
3438685 ms (roughly 57 minutes)

Almost an hour to load 100K nodes with two properties?
I thought it should be much faster.

Am I doing something wrong?
This is the importer code I used:

CREATE CONSTRAINT ON (n:MYNODES) ASSERT n.id IS UNIQUE;
CREATE INDEX ON :MYNODES(name);

USING PERIODIC COMMIT 1000
LOAD CSV WITH HEADERS FROM 'file:///blablabla.csv' AS line FIELDTERMINATOR '\t'
WITH line, toInt(line.topicId) AS id, line.name AS name LIMIT 100000
MERGE (n:MYNODES { id: id, name: name });
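(One thing worth checking: the MERGE matches on both id and name, so it cannot use the unique constraint, which only covers :MYNODES(id), and each row may fall back to a scan. A common rewrite, assuming id alone identifies a node, is to MERGE on id and set name only on creation:)

```cypher
USING PERIODIC COMMIT 1000
LOAD CSV WITH HEADERS FROM 'file:///blablabla.csv' AS line FIELDTERMINATOR '\t'
WITH line, toInt(line.topicId) AS id, line.name AS name LIMIT 100000
// MERGE on the constrained property only, so the index backing the
// unique constraint can be used for the lookup
MERGE (n:MYNODES { id: id })
ON CREATE SET n.name = name;
```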


-- 
You received this message because you are subscribed to the Google Groups 
"Neo4j" group.