Perhaps you could use a Docker container instead, which would allow you to configure the open file limit?
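A minimal sketch of that idea, assuming the official neo4j Docker image and Docker's --ulimit flag (the limit value, ports, and mount path below are just placeholders):

    docker run --ulimit nofile=40000:40000 \
        --publish=7474:7474 --publish=7687:7687 \
        --volume=$HOME/neo4j/data:/data \
        neo4j:3.0

The file-descriptor limit inside the container is set by the Docker daemon rather than by your login shell, so this can get you past a 10240 hard limit you cannot raise yourself, provided you are allowed to run Docker on that server.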
For storing the lookup from label -> list of nodes, Neo4j uses an index under the hood.

Michael

On Wed, Oct 5, 2016 at 5:24 AM, Mohammad Hossain Namaki <[email protected]> wrote:

> Hi,
> I've imported a huge dataset into Neo4j: 33M nodes and 144M relationships. Thanks to the Neo4j makers, "neo4j-importer" was very efficient.
> However, I'm getting "Too many open files" errors:
>
> ...
>
> Caused by: java.nio.file.FileSystemException: /fastscratch/mnamaki/idsForExp/idsAttackDB/schema/label/lucene/labelStore/1: Too many open files
>
> I've read some questions and answers about this, but I'm not the administrator of the system I run my Java code on. I'm using Neo4j 3.0 via the Java API.
>
> So I cannot increase the hard limit on open files, which is 10240 on the Linux server I have access to.
>
> Is there any way to turn this feature of Neo4j off? I didn't "index" anything explicitly; I just used a command to import the nodes and relationships.
>
> Or is there any way I can handle this without admin privileges?
>
> How can I tell how many open files this dataset requires?
>
> Thanks.
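To illustrate the point above: that label -> nodes index is what answers label-based lookups such as findNodes() in the embedded Java API. A rough sketch, assuming the Neo4j 3.0 embedded API; the store path and the "Person" label are only placeholders:

import java.io.File;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Label;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.ResourceIterator;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class LabelLookupExample {
    public static void main(String[] args) {
        // Placeholder path: point this at the store produced by the import.
        GraphDatabaseService db = new GraphDatabaseFactory()
                .newEmbeddedDatabase(new File("/path/to/graph.db"));
        try (Transaction tx = db.beginTx()) {
            // findNodes(Label) is served by the label index (the
            // schema/label/lucene files named in the stack trace above),
            // rather than by scanning every node in the store.
            try (ResourceIterator<Node> nodes = db.findNodes(Label.label("Person"))) {
                long count = 0;
                while (nodes.hasNext()) {
                    nodes.next();
                    count++;
                }
                System.out.println("Nodes with label Person: " + count);
            }
            tx.success();
        } finally {
            db.shutdown();
        }
    }
}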
