We are running Spark on Mesos and are hitting a "too many open files" error.

I have already set the hard nofile limit in /etc/security/limits.conf:
`* hard nofile 65536`
and tried to raise the soft nofile limit in spark-env.sh, but it does not work:
`ulimit -n 65536`
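
For what it's worth, a quick way to see which limits an executor will actually inherit is to look at the running Mesos agent process (a sketch; `<agent-pid>` is a placeholder for the agent's PID on the box):

```
# limits of the running mesos agent; executors forked by it inherit these,
# not the values from /etc/security/limits.conf (which only applies to login sessions)
grep "open files" /proc/<agent-pid>/limits
```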

stderr when the executor starts: `spark-env.sh: line 6: ulimit: open files:
cannot modify limit: Operation not permitted`

The Mesos docs say that POSIX rlimits are specified as part of a
task's ContainerInfo, so does raising the rlimit with the ulimit command in
spark-env.sh simply not work?
https://mesos.apache.org/documentation/latest/isolators/posix-rlimits/
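
If I read that page correctly, the `posix/rlimits` isolator also has to be enabled on the agent before per-task rlimits take effect; something like the following on the agent command line (a sketch based on that doc, exact flags may vary by deployment):

```
# enable the POSIX rlimits isolator when starting the agent
mesos-agent --isolation=posix/rlimits ...
```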

I think starting the Mesos agent with a higher nofile limit would solve the
problem, since the executors should inherit the agent's limits, but perhaps
it should be set with a Spark property instead?
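
On our boxes the agent is managed by systemd, so what I had in mind is a drop-in override along these lines (a sketch for our setup; the unit name mesos-slave.service is an assumption and may differ):

```
# /etc/systemd/system/mesos-slave.service.d/limits.conf
[Service]
LimitNOFILE=65536
```

followed by `systemctl daemon-reload` and restarting the agent.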

-- 
*camper42 (Fengyu Cao)*
Douban, Inc.

Mobile: +86 15691996359
E-mail:  camper.x...@gmail.com
