Recently I came across a situation where I need to bulk import data 
from a CSV file of about 10 MiB (with arangoJS in a NodeJS project)... 

And it works fine using Bulk Import..
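
For reference, this is roughly how I am calling it now (the collection name 
"myDocs" and the connection details are just placeholders, not my real code):

    const { Database } = require("arangojs");

    const db = new Database({ url: "http://localhost:8529" });
    const collection = db.collection("myDocs"); // placeholder collection name

    async function importAll(docs) {
      // docs = the whole array of documents parsed from the 10 MiB CSV,
      // sent to the server in one single bulk import request
      const result = await collection.import(docs);
      console.log(result.created, result.errors);
    }
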
But my colleague said not to import that much data at once...

Because that will negatively affect the process/performance..

So he said, "Import 100 records at a time"...

So I want to know: is there a safe limit for Bulk Import, or can I use it to 
import as much data as I want?

And if 100 at a time is the safest way, is there an easy way to do this with 
an array of docs and AQL?
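
Something like this rough sketch is what I have in mind (the chunk size, the 
"myDocs" collection name and the query itself are just my guesses, untested):

    const { Database, aql } = require("arangojs");

    const db = new Database({ url: "http://localhost:8529" });

    async function importInBatches(docs, batchSize = 100) {
      // Slice the parsed docs into chunks of 100 and run one AQL
      // INSERT query per chunk, passing the chunk as a bind parameter.
      for (let i = 0; i < docs.length; i += batchSize) {
        const chunk = docs.slice(i, i + batchSize);
        await db.query(aql`
          FOR doc IN ${chunk}
            INSERT doc INTO myDocs
        `);
      }
    }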
