Re: Too many open files, why changing ulimit not effecting?

2016-02-10 Thread Michael Diamant
If you are using systemd, you will need to specify the limit in the service
file.  I had run into this problem and discovered the solution from the
following references:
* https://bugzilla.redhat.com/show_bug.cgi?id=754285#c1
* http://serverfault.com/a/678861
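
For example, a minimal sketch of what that looks like, assuming a
systemd-managed Spark worker unit named spark-worker.service (the unit name
and the limit value here are illustrative):

    # /etc/systemd/system/spark-worker.service.d/limits.conf
    [Service]
    LimitNOFILE=100000

    # reload systemd and restart the unit so the new limit takes effect
    sudo systemctl daemon-reload
    sudo systemctl restart spark-worker.service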

On Fri, Feb 5, 2016 at 1:18 PM, Nirav Patel  wrote:

> For CentOS there's also /etc/security/limits.d/90-nproc.conf that may
> need modifications.
>
> Services that you expect to use the new limits need to be restarted. The
> simplest thing to do is to reboot the machine.
>
> On Fri, Feb 5, 2016 at 3:59 AM, Ted Yu  wrote:
>
>> bq. and *"session required pam_limits.so"*.
>>
>> What was the second file you modified ?
>>
>> Did you make the change on all the nodes ?
>>
>> Please see the verification step in
>> https://easyengine.io/tutorials/linux/increase-open-files-limit/
>>
>> On Fri, Feb 5, 2016 at 1:42 AM, Mohamed Nadjib MAMI wrote:
>>
>>> Hello all,
>>>
>>> I'm getting the famous *java.io.FileNotFoundException: ... (Too many
>>> open files)* exception. What seems to have helped other people hasn't
>>> worked for me. I tried to set the ulimit via the command line *"ulimit
>>> -n"*, then I tried to add the following lines to the
>>> *"/etc/security/limits.conf"* file:
>>>
>>> * - nofile 100
>>> root soft nofile 100
>>> root hard nofile 100
>>> hduser soft nofile 100
>>> hduser hard nofile 100
>>>
>>> ...then I added the line *"session required pam_limits.so"* to the two
>>> files *"/etc/pam.d/common-session"* and *"session required
>>> pam_limits.so"*. Then I logged out and logged back in. First I tried only
>>> the first line (* - nofile 100), then added the 2nd and the 3rd
>>> (root...), then added the last two lines (hduser...), with no effect.
>>> Weirdly enough, when I check with the command *"ulimit -n"*, it returns
>>> the correct value of 100.
>>>
>>> I then added *"ulimit -n 100"* to *"spark-env.sh"* in the master
>>> and in each of my workers, no effect.
>>>
>>> What else could it be besides changing the ulimit setting? If it's only
>>> that, what could cause Spark to ignore it?
>>>
>>> I'd appreciate any help. Thanks in advance.
>>>
>>> --
>>> *PhD Student - EIS Group - Bonn University, Germany.*
>>> *+49 1575 8482232*
>>>
>>>
>>
>
>
>


Re: Too many open files, why changing ulimit not effecting?

2016-02-05 Thread Nirav Patel
For CentOS there's also /etc/security/limits.d/90-nproc.conf that may need
modifications.
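
For reference, on CentOS 6 that file ships with a low nproc (max
processes/threads) cap for non-root users; a sketch of the kind of change
that may be needed (the values are illustrative):

    # /etc/security/limits.d/90-nproc.conf
    *          soft    nproc     65536
    root       soft    nproc     unlimited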

Services that you expect to use the new limits need to be restarted. The
simplest thing to do is to reboot the machine.
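
A quick way to confirm that a restarted daemon actually picked up the new
limit is to look at its /proc entry; a sketch, assuming a standalone Spark
worker process (the pgrep pattern is illustrative):

    # find the worker JVM and show the open-files limit it is running with
    pid=$(pgrep -f org.apache.spark.deploy.worker.Worker | head -n1)
    grep 'Max open files' /proc/$pid/limits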

On Fri, Feb 5, 2016 at 3:59 AM, Ted Yu  wrote:

> bq. and *"session required pam_limits.so"*.
>
> What was the second file you modified ?
>
> Did you make the change on all the nodes ?
>
> Please see the verification step in
> https://easyengine.io/tutorials/linux/increase-open-files-limit/
>
> On Fri, Feb 5, 2016 at 1:42 AM, Mohamed Nadjib MAMI 
> wrote:
>
>> Hello all,
>>
>> I'm getting the famous *java.io.FileNotFoundException: ... (Too many
>> open files)* exception. What seems to have helped other people hasn't
>> worked for me. I tried to set the ulimit via the command line *"ulimit
>> -n"*, then I tried to add the following lines to the
>> *"/etc/security/limits.conf"* file:
>>
>> * - nofile 100
>> root soft nofile 100
>> root hard nofile 100
>> hduser soft nofile 100
>> hduser hard nofile 100
>>
>> ...then I added the line *"session required pam_limits.so"* to the two
>> files *"/etc/pam.d/common-session"* and *"session required
>> pam_limits.so"*. Then I logged out and logged back in. First I tried only
>> the first line (* - nofile 100), then added the 2nd and the 3rd
>> (root...), then added the last two lines (hduser...), with no effect.
>> Weirdly enough, when I check with the command *"ulimit -n"*, it returns
>> the correct value of 100.
>>
>> I then added *"ulimit -n 100"* to *"spark-env.sh"* in the master and
>> in each of my workers, no effect.
>>
>> What else could it be besides changing the ulimit setting? If it's only
>> that, what could cause Spark to ignore it?
>>
>> I'd appreciate any help. Thanks in advance.
>>
>> --
>> *PhD Student - EIS Group - Bonn University, Germany.*
>> *+49 1575 8482232*
>>
>>
>




Too many open files, why changing ulimit not effecting?

2016-02-05 Thread Mohamed Nadjib MAMI

Hello all,

I'm getting the famous *java.io.FileNotFoundException: ... (Too many
open files)* exception. What seems to have helped other people hasn't
worked for me. I tried to set the ulimit via the command line *"ulimit
-n"*, then I tried to add the following lines to the
*"/etc/security/limits.conf"* file:

* - nofile 100
root soft nofile 100
root hard nofile 100
hduser soft nofile 100
hduser hard nofile 100

...then I added the line *"session required pam_limits.so"* to the two
files *"/etc/pam.d/common-session"* and *"session required
pam_limits.so"*. Then I logged out and logged back in. First I tried only
the first line (* - nofile 100), then added the 2nd and the 3rd
(root...), then added the last two lines (hduser...), with no effect.
Weirdly enough, when I check with the command *"ulimit -n"*, it returns
the correct value of 100.


I then added *"ulimit -n 100"* to *"spark-env.sh"* in the master and
in each of my workers, no effect.


What else could it be besides changing the ulimit setting? If it's only
that, what could cause Spark to ignore it?


I'd appreciate any help. Thanks in advance.

--
*PhD Student - EIS Group - Bonn University, Germany.*
*+49 1575 8482232*



Re: Too many open files, why changing ulimit not effecting?

2016-02-05 Thread Ted Yu
bq. and *"session required pam_limits.so"*.

What was the second file you modified ?

Did you make the change on all the nodes ?

Please see the verification step in
https://easyengine.io/tutorials/linux/increase-open-files-limit/
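
It's worth checking the limit both for the current shell and for the user
that actually runs the Spark processes; a short sketch (the hduser account
is taken from the original mail and may differ on your nodes):

    ulimit -Sn                     # soft limit for the current shell
    ulimit -Hn                     # hard limit for the current shell
    su - hduser -c 'ulimit -n'     # limit seen by a fresh login shell for the Spark user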

On Fri, Feb 5, 2016 at 1:42 AM, Mohamed Nadjib MAMI 
wrote:

> Hello all,
>
> I'm getting the famous *java.io.FileNotFoundException: ... (Too many open
> files)* exception. What seems to have helped other people hasn't worked for
> me. I tried to set the ulimit via the command line *"ulimit -n"*, then I
> tried to add the following lines to the *"/etc/security/limits.conf"* file:
>
> * - nofile 100
> root soft nofile 100
> root hard nofile 100
> hduser soft nofile 100
> hduser hard nofile 100
>
> ...then I added the line *"session required pam_limits.so"* to the two
> files *"/etc/pam.d/common-session"* and *"session required pam_limits.so"*.
> Then I logged out and logged back in. First I tried only the first line
> (* - nofile 100), then added the 2nd and the 3rd (root...), then added
> the last two lines (hduser...), with no effect. Weirdly enough, when I check
> with the command *"ulimit -n"*, it returns the correct value of 100.
>
> I then added *"ulimit -n 100"* to *"spark-env.sh"* in the master and
> in each of my workers, no effect.
>
> What else could it be besides changing the ulimit setting? If it's only
> that, what could cause Spark to ignore it?
>
> I'd appreciate any help. Thanks in advance.
>
> --
> *PhD Student - EIS Group - Bonn University, Germany.*
> *+49 1575 8482232*
>
>