Try these 

http://jexp.de/blog/2017/03/5-tips-tricks-for-fast-batched-updates-of-graph-structures-with-neo4j-and-cypher/
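The core pattern from that post is sending rows in chunks through a single parameterized UNWIND statement instead of one request per node. A minimal sketch using the official `neo4j` Python driver; the URI, credentials, label, and property names are all placeholders, and the batch size is just illustrative:

```python
def batches(rows, size=10_000):
    """Slice a list of row dicts into chunks small enough for one UNWIND call."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

# Illustrative statement: one round trip creates/updates a whole batch.
CYPHER = """
UNWIND $batch AS row
MERGE (n:Node {id: row.id})
SET n.name = row.name
"""

def load(uri, auth, rows):
    """Push `rows` (a list of dicts) to Neo4j in batched UNWIND statements."""
    from neo4j import GraphDatabase  # lazy import: requires the official driver

    driver = GraphDatabase.driver(uri, auth=auth)
    with driver.session() as session:
        for batch in batches(rows):
            session.run(CYPHER, batch=batch)
    driver.close()
```

Keeping the batch a parameter (rather than string-building the Cypher) lets the server reuse the query plan across batches.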

Sent from my iPhone

> On 07.06.2017 at 01:22, Matt Luongo <[email protected]> wrote:
> 
> It's been nearly 5 years, so I no longer have this handy (and doubt it 
> would work with the latest Neo4j, regardless) - sorry!
> 
> --
> Matt Luongo
> Software Developer
> about.me/luongo
> 
>> On Tue, Jun 6, 2017 at 3:00 AM, Ralph <[email protected]> wrote:
>> Hi,
>> 
>> I need something that will load faster than REST batching... Please share 
>> the gist
>> 
>> 
>>> On Tuesday, July 31, 2012 at 8:24:53 PM UTC+5:30, Matt Luongo wrote:
>>> If the data is an initial import for a new database, I think the batch 
>>> inserter library is the way to go- otherwise Javier's client batching is a 
>>> great solution.
>>> 
>>> I've also used custom Gremlin over REST to keep the JSON serialization 
>>> overhead low- if you need something faster than REST batching, let me know 
>>> and I'll throw together a gist :)
>>> 
>>> - Matt
>>> 
>>>> On Tuesday, July 31, 2012 10:17:35 AM UTC-4, versae wrote:
>>>> Oops, the reading of the CSV lines should be inside the 'with' block.
>>>> 
>>>>> On Tuesday, July 31, 2012, Javier de la Rosa wrote:
>>>>> One option could be to use neo4j-rest-client and make transactions for
>>>>> nodes and relationships:
>>>>> 
>>>>> >>> import csv
>>>>> >>> from neo4jrestclient.client import GraphDatabase
>>>>> >>> gdb = GraphDatabase("http://localhost:7474/db/data/")
>>>>> >>> nodes_reader = csv.reader(open('nodes.csv', 'rb'))
>>>>> >>> for prop1, prop2, prop3 in nodes_reader:
>>>>> ...     with gdb.transaction():
>>>>> ...         gdb.nodes.create(prop1=prop1, prop2=prop2, prop3=prop3)
>>>>> 
>>>>> And the same for relationships. Or you can even split the transactions
>>>>> in several if there are too many nodes.
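A minimal sketch of the batch-commit variant Javier describes (one transaction per chunk of rows rather than per node), assuming neo4j-rest-client's `gdb.transaction()` context manager from the snippet above; `BATCH_SIZE` and the three-column CSV layout are illustrative:

```python
import csv

BATCH_SIZE = 500  # illustrative; tune to your data

def batched(rows, size=BATCH_SIZE):
    """Group an iterable of rows into lists of at most `size` items."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def load_nodes(gdb, path="nodes.csv"):
    """Create one node per CSV row, committing once per batch.

    `gdb` is a neo4jrestclient GraphDatabase, e.g.
    GraphDatabase("http://localhost:7474/db/data/").
    """
    with open(path) as f:
        for batch in batched(csv.reader(f)):
            with gdb.transaction():  # one commit per BATCH_SIZE rows
                for prop1, prop2, prop3 in batch:
                    gdb.nodes.create(prop1=prop1, prop2=prop2, prop3=prop3)
```

Committing every few hundred rows keeps each transaction small while avoiding a network round trip and commit per node.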
>>>>> 
>>>>> On Tue, Jul 31, 2012 at 8:07 AM, Fedor Nikitin <[email protected]> 
>>>>> wrote:
>>>>> > Hi,
>>>>> >
>>>>> > I have a big dataset and I want to insert it into a clean Neo4j 
>>>>> > database quickly, so that the user waits a reasonable time before 
>>>>> > starting to work on the database.
>>>>> > My dataset is supposed to be stored as several CSV files (any other 
>>>>> > format is also OK).
>>>>> >
>>>>> > Could someone recommend the best way to do it?
>>>>> >
>>>>> > I am using Python + Neo4j.
>>>>> >
>>>>> > I tried to load the data node by node, but that is not working for me:
>>>>> > it takes too long for my data to be loaded. I am looking for any way to
>>>>> > speed it up.
>>>>> >
>>>>> > Thanks,
>>>>> > Fedor
>>>>> >
>>>>> 
>>>>> 
>>>>> 
>>>>> --
>>>>> Javier de la Rosa
>>>>> http://versae.es
>>>> 
>>>> 
>>>> -- 
>>>> Javier de la Rosa
>>>> http://versae.es
>> 
>> -- 
>> You received this message because you are subscribed to a topic in the 
>> Google Groups "Neo4j" group.
>> To unsubscribe from this topic, visit 
>> https://groups.google.com/d/topic/neo4j/36HXPB8VKPk/unsubscribe.
>> To unsubscribe from this group and all its topics, send an email to 
>> [email protected].
>> For more options, visit https://groups.google.com/d/optout.
> 
> -- 
> You received this message because you are subscribed to the Google Groups 
> "Neo4j" group.
> To unsubscribe from this group and stop receiving emails from it, send an 
> email to [email protected].
> For more options, visit https://groups.google.com/d/optout.

