I'm not sure about making the application fully independent of whether
spark-submit is still alive, but you can have spark-submit run in a new
session on Linux using setsid <http://unix.stackexchange.com/a/28877/70630>.

That way even if you terminate your SSH session, spark-submit will keep
running independently. Of course, if you terminate the host running
spark-submit, you will still have problems.
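A minimal sketch of what that launch could look like (the jar path, class
name, and log file name below are placeholders, not from your setup):

```shell
#!/bin/sh
# Start spark-submit in a new session with setsid so it is no longer a child
# of the SSH login session; redirect stdio and background it so closing the
# terminal does not send it SIGHUP. Paths/class are illustrative placeholders.
setsid spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  /path/to/my-app.jar \
  > spark-submit.log 2>&1 < /dev/null &
```

You can verify the detachment by comparing session IDs: a process started
under setsid reports a different SID (from `ps -o sid=`) than your shell.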

On Thu, Sep 18, 2014 at 4:19 AM, Tobias Pfeiffer <t...@preferred.jp> wrote:

> Hi,
>
> I am wondering: Is it possible to run spark-submit in a mode where it will
> start an application on a YARN cluster (i.e., driver and executors run on
> the cluster) and then forget about it in the sense that the Spark
> application is completely independent from the host that ran the
> spark-submit command and will not be affected if that controlling machine
> shuts down etc.? I was using spark-submit with YARN in cluster mode, but
> spark-submit stayed in the foreground and as far as I understood, it
> terminated the application on the cluster when spark-submit was Ctrl+C'ed.
>
> Thanks
> Tobias
>
