Re: [Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread 'Michael Hunger' via Neo4j
Shouldn't be slow. Faster disk. Concurrent batches would help. Sent from my iPhone > On 18.06.2016 at 22:29, John Fry wrote: > > > Clark - this works. It is still slow. I guess multithreading may help > some > > > > Transaction tx = db.beginTx(); > > //try ( Tr
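For reference, a minimal sketch of what "concurrent batches" could look like against the 3.x embedded Java API: the relationship ids are split into chunks and each chunk is deleted in its own transaction on a small thread pool. The id list, batch size, and thread count here are assumptions for illustration; batches that touch the same nodes will contend on node locks, so disjoint chunks parallelize best.

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Transaction;

    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class ConcurrentRelationshipDelete {

        // Delete one chunk of relationship ids inside its own transaction.
        static void deleteBatch(GraphDatabaseService db, List<Long> relIds) {
            try (Transaction tx = db.beginTx()) {
                for (long relId : relIds) {
                    db.getRelationshipById(relId).delete();
                }
                tx.success();
            }
        }

        // Split the full id list into fixed-size chunks and run them on a thread pool.
        static void deleteConcurrently(GraphDatabaseService db, List<Long> allRelIds,
                                       int batchSize, int threads) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            for (int start = 0; start < allRelIds.size(); start += batchSize) {
                List<Long> chunk =
                        allRelIds.subList(start, Math.min(start + batchSize, allRelIds.size()));
                pool.submit(() -> deleteBatch(db, chunk));
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.HOURS);
        }
    }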

Re: [Neo4j] Null id on relationships

2016-06-18 Thread 'Michael Hunger' via Neo4j
How do you access the ids? With id(r) in Cypher or r.getId() in Java? Sent from my iPhone > On 18.06.2016 at 20:25, John Fry wrote: > > Hello All, > > what could be the cause of having relationships in a *.db with the id set as > null? > > When I create the relationship, via the batc
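In case it helps anyone hitting the same question, a small sketch of the two access paths Michael mentions, against the 3.x embedded API; KNOWS is an assumed relationship type and the query is only illustrative:

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Relationship;
    import org.neo4j.graphdb.Result;
    import org.neo4j.graphdb.Transaction;

    public class RelationshipIdCheck {

        static void printIds(GraphDatabaseService db) {
            try (Transaction tx = db.beginTx()) {
                // Cypher: id(r) yields the internal long id of each stored relationship.
                Result result = db.execute("MATCH ()-[r:KNOWS]->() RETURN id(r) AS relId LIMIT 5");
                while (result.hasNext()) {
                    System.out.println("cypher id: " + result.next().get("relId"));
                }
                // Java API: getId() on a Relationship object returns the same long id.
                for (Relationship r : db.getAllRelationships()) {
                    System.out.println("java id: " + r.getId());
                    break; // the first one is enough for the illustration
                }
                tx.success();
            }
        }
    }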

Re: [Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread Clark Richey
Yes. That's a lot to delete; doing it in parallel will definitely help. Sent from my iPhone > On Jun 18, 2016, at 17:29, John Fry wrote: > > > Clark - this works. It is still slow. I guess multithreading may help > some > > > > Transaction tx = db.beginTx(); > > //try ( T

Re: [Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread John Fry
Clark - this works. It is still slow. I guess multithreading may help some. Transaction tx = db.beginTx(); //try ( Transaction tx = db.beginTx() ) { for (int i=0; i5) { txc=0; tx.success(); tx.close(); tx = db.beginTx(); } }
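The angle brackets in the pasted loop were evidently swallowed in transit, so here is a hedged reconstruction of the pattern the snippet appears to implement: delete, count, and roll the transaction over every N operations. The relsToDelete list and the batch size are assumed values, not John's actual ones.

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Relationship;
    import org.neo4j.graphdb.Transaction;

    import java.util.List;

    public class BatchedRelationshipDelete {

        // Delete relationships in chunks, committing and reopening the transaction
        // every BATCH_SIZE deletes so the uncommitted state never grows unbounded.
        static void deleteInBatches(GraphDatabaseService db, List<Relationship> relsToDelete) {
            final int BATCH_SIZE = 50_000; // assumed value; tune to the available heap
            Transaction tx = db.beginTx();
            try {
                int txc = 0;
                for (Relationship rel : relsToDelete) {
                    rel.delete();
                    if (++txc >= BATCH_SIZE) {
                        txc = 0;
                        tx.success();
                        tx.close();
                        tx = db.beginTx(); // fresh transaction for the next batch
                    }
                }
                tx.success(); // commit whatever is left in the final partial batch
            } finally {
                tx.close();
            }
        }
    }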

Re: [Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread Clark Richey
Don't nest them. Just create a counter and every X deletes, commit the transaction and open a new one. Sent from my iPhone > On Jun 18, 2016, at 17:03, John Fry wrote: > > Thanks Clark - is there any good/recommended way to nest the commits? > > Thx JF > >> On Saturday, June 18, 2016 at 1:43:19 P

Re: [Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread John Fry
Thanks Clark - is there any good/recommended way to nest the commits? Thx JF On Saturday, June 18, 2016 at 1:43:19 PM UTC-7, Clark Richey wrote: > > You need to periodically commit. Holding that many transactions in memory > isn't efficient. > > Sent from my iPhone > > On Jun 18, 2016, at 16:

Re: [Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread Clark Richey
You need to periodically commit. Holding that many uncommitted deletes in memory isn't efficient. Sent from my iPhone > On Jun 18, 2016, at 16:41, John Fry wrote: > > Hello All, > > I have a graph of about 200M relationships and often I need to delete a > large amount of them. > For the proxy c

[Neo4j] performance when deleting large numbers of nodes

2016-06-18 Thread John Fry
Hello All, I have a graph of about 200M relationships and often I need to delete a large number of them. For the proxy code below I am seeing huge memory usage and memory thrashing when deleting about 15M relationships. When it hits tx.close() I see all CPU cores start working at close to 100%
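The "proxy code" itself is cut off in the digest, but the symptoms described (heap pressure, thrashing, and a long CPU spike at tx.close()) match deleting everything inside a single transaction. A hypothetical sketch of that shape, purely to show where the cost lands, for contrast with the batched versions further up the thread:

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Relationship;
    import org.neo4j.graphdb.Transaction;

    import java.util.List;

    public class SingleTransactionDelete {

        // Anti-pattern for large deletes: all ~15M removals accumulate in one
        // transaction, so the entire change set sits in memory until commit time.
        static void deleteAllAtOnce(GraphDatabaseService db, List<Relationship> relsToDelete) {
            try (Transaction tx = db.beginTx()) {
                for (Relationship rel : relsToDelete) {
                    rel.delete();
                }
                tx.success();
            } // close() commits here, which is where the memory and CPU spike shows up
        }
    }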

[Neo4j] Null id on relationships

2016-06-18 Thread John Fry
Hello All, what could be the cause of having relationships in a *.db with the id set as null? When I create the relationship, via the batch inserter, I assume that it creates an ID - why wouldn't it? (I carefully check that src/start & dst/end nodes exist). When I later fetch relationships som
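For what it's worth, the 3.x batch inserter does hand back the new relationship's id directly from createRelationship, so one way to rule the inserter in or out is to log that return value at insert time. A minimal sketch; the store directory and the KNOWS type are assumptions:

    import org.neo4j.graphdb.RelationshipType;
    import org.neo4j.unsafe.batchinsert.BatchInserter;
    import org.neo4j.unsafe.batchinsert.BatchInserters;

    import java.io.File;
    import java.io.IOException;
    import java.util.Collections;
    import java.util.Map;

    public class BatchInsertIdCheck {

        public static void main(String[] args) throws IOException {
            // Assumed store location; any empty directory works for a throwaway test.
            BatchInserter inserter = BatchInserters.inserter(new File("target/test.db"));
            try {
                Map<String, Object> noProps = Collections.emptyMap();
                long start = inserter.createNode(noProps);
                long end = inserter.createNode(noProps);

                // createRelationship returns the internal id assigned to the new relationship.
                long relId = inserter.createRelationship(
                        start, end, RelationshipType.withName("KNOWS"), noProps);
                System.out.println("relationship id = " + relId);
            } finally {
                inserter.shutdown(); // flushes the store files; required before a normal reopen
            }
        }
    }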