> Thanks,
> Mike
>
> From: Mayur Rustagi
> Reply-To: "user@spark.incubator.apache.org" <user@spark.incubator.apache.org>
> Date: Thursday, February 13, 2014 12:58 PM
> To: "user@spark.incubator.apache.org"
> Subject: Re: [External] Re: Too many open files
>
> Easiest is to add ulimit -n 18000 to your conf/spark-env.sh
> Restart the cluster.. make sure you change the
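For anyone finding this thread later, the suggested change looks roughly like the sketch below. The value 18000 is just the number suggested above, and the conf/spark-env.sh path assumes a standard Spark layout of that era; adjust both to your deployment.

```shell
# conf/spark-env.sh -- raise the soft open-file limit for the shell
# that launches the Spark daemons (value taken from the advice above).
ulimit -n 18000
```

Note that `ulimit -n` can only raise the soft limit up to the current hard limit; if the hard limit itself is too low, it has to be raised first (on Linux typically via /etc/security/limits.conf, as root) before this line takes effect.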
> From: Mayur Rustagi
> Reply-To: "user@spark.incubator.apache.org" <user@spark.incubator.apache.org>
> Date: Thursday, February 13, 2014 12:34 PM
> To: "user@spark.incubator.apache.org"
> Subject: [External] Re: Too many open files
>
> The limit could be on any of the machines (including the master). Do you
> have ganglia setup?
>
> Mayur Rustagi
> Ph: +919632149971

it = unlimited.
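Since the limit could be on any machine in the cluster, it is worth checking each host individually. A quick sketch of the usual diagnostics (the `<pid>` below is a placeholder, not a value from the thread):

```shell
# Soft limit: what a newly started process actually gets.
ulimit -Sn

# Hard limit: the ceiling the soft limit can be raised to without root.
ulimit -Hn

# On Linux, count the file descriptors a running process (e.g. a Spark
# worker) currently holds open; <pid> is a placeholder for its process id:
#   ls /proc/<pid>/fd | wc -l
```

Running this on every node, master included, narrows down which machine is actually hitting "too many open files".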