[Neo4j] Re: Too many open files on a huge dataset with 110 labels

2016-10-16 Thread Mohammad Hossain Namaki
In total, there are 110 distinct labels across all nodes, so each label has a large 
number of corresponding nodes in the data graph.
The problem turned out to be in my own code.

On Tuesday, October 4, 2016 at 8:24:52 PM UTC-7, Mohammad Hossain Namaki 
wrote:
>
> Hi,
> I've imported a huge dataset into Neo4j: 33M nodes and 144M 
> relationships. Thanks to the Neo4j makers, "neo4j-importer" was very efficient. 
> However, I'm now getting "Too many open files" errors:
>
>
> ...
>
> Caused by: java.nio.file.FileSystemException: 
> /fastscratch/mnamaki/idsForExp/idsAttackDB/schema/label/lucene/labelStore/1: 
> Too many open files
>
>
> I've read some questions and answers about this. However, I'm not the 
> administrator of the system on which I run my Java code. I'm using 
> Neo4j 3.0 through the Java API. 
>
> So I cannot increase the open-files hard limit, which is 10240 on the 
> Linux server I have access to.
>
>
> Is there any way I can turn this feature of Neo4j off? I didn't 
> "index" anything explicitly; I just used a command to import the nodes and 
> relationships.
>
> Or
>
> Is there any way I can handle this without admin privileges?
>
>
> How can I find out how many open files this dataset requires?
>
>
> Thanks.
>
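
To the question above about how many open files the dataset needs: one way to
watch this empirically, without admin privileges, is to count the file
descriptors the JVM currently holds. A minimal sketch, assuming a Linux system
with /proc mounted (the class name and approach are illustrative, not part of
the Neo4j API):

import java.io.File;

public class OpenFileCount {

    /** Counts the file descriptors currently open in this JVM process. */
    public static int openFileDescriptors() {
        // Each entry in /proc/self/fd is one open descriptor of this process.
        String[] fds = new File("/proc/self/fd").list();
        return fds == null ? -1 : fds.length;
    }

    public static void main(String[] args) {
        System.out.println("Open file descriptors: " + openFileDescriptors());
    }
}

Calling this periodically while the import or the queries run shows whether the
count creeps toward the 10240 hard limit, which helps separate a Neo4j issue
from descriptors leaked by application code.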



[Neo4j] Re: Too many open files on a huge dataset with 110 labels

2016-10-16 Thread 'John Singer' via Neo4j
When you say you have 110 labels, do you mean 110 in total, or do you have 
individual nodes with that many labels? I'm not sure this is the cause of 
your problem, but you never know.
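
One way to check whether those 110 labels are spread across different nodes or
stacked on individual nodes is to ask the store itself. A minimal sketch against
the embedded Java API of Neo4j 3.0, under the assumption that the imported store
directory is available locally (the path below is a placeholder):

import java.io.File;

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Result;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class LabelStats {
    public static void main(String[] args) {
        // Placeholder path: point this at the directory produced by the import.
        GraphDatabaseService db = new GraphDatabaseFactory()
                .newEmbeddedDatabase(new File("/path/to/graph.db"));
        try (Transaction tx = db.beginTx();
             Result result = db.execute(
                     "MATCH (n) RETURN max(size(labels(n))) AS maxLabelsPerNode")) {
            // Largest number of labels attached to any single node.
            while (result.hasNext()) {
                System.out.println(result.next());
            }
            tx.success();
        }
        db.shutdown();
    }
}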




Re: [Neo4j] Re: Too many open files on a huge dataset with 110 labels

2016-10-07 Thread 'Michael Hunger' via Neo4j
No worries. I was already wondering :)

On Fri, Oct 7, 2016 at 8:40 AM, Mohammad Hossain Namaki  wrote:

> :(
> I'm so sorry. It seems there was a bug in my code that didn't close
> open files in some cases. However, since I had seen this problem reported in
> forums, I thought it was coming from Neo4j.
> I apologize for any inconvenience.
>



Re: [Neo4j] Re: Too many open files on a huge dataset with 110 labels

2016-10-07 Thread 'Michael Hunger' via Neo4j
Also, I'm not sure whether the ulimit is per user or per process, so there might
also be a per-user limit.
Can you check whether you have any other file-consuming applications open, like
an IDE?
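
On the per-process vs. per-user question: the limit the JVM actually hits can be
read from /proc/self/limits, which lists the soft and hard "Max open files"
values applied to that specific process. A minimal sketch, again assuming Linux
(per-user defaults typically live in /etc/security/limits.conf, which only an
administrator can change):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ShowNofileLimit {
    public static void main(String[] args) throws IOException {
        // Prints the soft and hard "Max open files" limits of this JVM process,
        // i.e. the values the "Too many open files" error is checked against.
        Files.lines(Paths.get("/proc/self/limits"))
             .filter(line -> line.startsWith("Max open files"))
             .forEach(System.out::println);
    }
}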




Re: [Neo4j] Re: Too many open files on a huge dataset with 110 labels

2016-10-07 Thread 'Michael Hunger' via Neo4j
Are you sure you're closing your own files? Do you open any other files in
your code?




[Neo4j] Re: Too many open files on a huge dataset with 110 labels

2016-10-07 Thread Mohammad Hossain Namaki
I removed the label folder and ran it again while watching the folder: it 
first created a lot of files, then removed all of them and created just 
14 items, including write.lock and segments_1.
But I got the error again at this line:

FileOutputStream fos = new FileOutputStream(fout);

I'll handle this by partitioning the dataset into several smaller ones. However, I 
wanted to let you know about the problem I've encountered.
I'm happy to do whatever is needed to reproduce it.

Thanks.
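
As noted at the top of the thread, the real cause turned out to be application
code that did not close its files. A minimal sketch of the usual fix: wrap the
FileOutputStream from the line above in try-with-resources so the descriptor is
released even when an exception is thrown ('fout' and the write method are
illustrative stand-ins, not the original code):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class WriteWithClose {
    // try-with-resources closes the stream on every code path, so descriptors
    // are not leaked and the process stays under the 10240 open-files limit.
    static void write(File fout, byte[] data) throws IOException {
        try (FileOutputStream fos = new FileOutputStream(fout)) {
            fos.write(data);
        } // fos is closed here automatically, even if write() throws
    }
}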
