Changed TasksMax= to infinity and the error seems to be gone. Will keep
monitoring for a few days.
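
For anyone hitting the same thing: a minimal sketch of the change
(assuming a drop-in created with "systemctl edit spark-worker"; editing
the unit file directly works too):

    # /etc/systemd/system/spark-worker.service.d/override.conf
    [Service]
    TasksMax=infinity

Then reload and restart:

    systemctl daemon-reload
    systemctl restart spark-worker

The effective limit can be checked with
"systemctl show -p TasksMax spark-worker".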

Thanks guys!

Rao

On Tue, Feb 21, 2017 at 3:42 AM, Lennart Poettering <lenn...@poettering.net>
wrote:

> On Mon, 20.02.17 16:44, Rao Vz (raoa...@gmail.com) wrote:
>
> > Hi, Guys
> >
> > We have an Apache Spark cluster of 3 nodes: one is both master and
> > slave, the other two are slaves. When the Spark worker is started
> > with "systemctl start spark-worker" and we run our apps, it sometimes
> > (but not always) produces "java.lang.OutOfMemoryError: unable to
> > create new native thread" in the Spark worker logs.
>
> I figure the error is misleading and is not about memory at all: you
> need to bump the default TasksMax= setting, or turn the limit off
> entirely by setting it to infinity.
>
> Lennart
>
> --
> Lennart Poettering, Red Hat
>
