On Mon, 20.02.17 16:44, Rao Vz (raoa...@gmail.com) wrote:

> Hi, Guys
> 
> We have an Apache Spark cluster of 3 nodes: one is both master and a slave,
> the other two are slaves. We start the Spark worker with "systemctl start
> spark-worker". When running our apps, it sometimes (but not always) fails
> with a "java.lang.OutOfMemoryError: unable to create new native thread"
> error in the Spark worker logs.

I figure the error is misleading and not about memory at all: the JVM
fails to spawn a new thread because the service hit systemd's default
TasksMax= limit (threads count as tasks). You need to bump TasksMax= for
the unit, or turn the limit off entirely by setting it to infinity.
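
A minimal sketch of a drop-in that does this (assuming the unit really is
called spark-worker.service; the file name and the concrete value are just
examples):

  # /etc/systemd/system/spark-worker.service.d/tasksmax.conf
  # Raise the per-unit task (i.e. thread/process) limit.
  # Put a number here instead of "infinity" if you still want a cap.
  [Service]
  TasksMax=infinity

  # Apply it:
  systemctl daemon-reload
  systemctl restart spark-worker

You can check the effective value afterwards with
"systemctl show -p TasksMax spark-worker".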

Lennart

-- 
Lennart Poettering, Red Hat
_______________________________________________
systemd-devel mailing list
systemd-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/systemd-devel
