Hello there

I have the same requirement.

I submit a streaming job in yarn-cluster mode.

If I want to shut down this endless YARN application, I have to find the
application ID myself and run "yarn application -kill <app_id>" to kill
the application.

Therefore, if my client program could get the application ID back at
submission time, it would be easy to kill the YARN application from the
client program itself.

Kyle



2015-06-24 13:02 GMT+08:00 canan chen <ccn...@gmail.com>:

> I don't think there is YARN-related stuff to access in Spark. Spark doesn't
> depend on YARN.
>
> BTW, why do you want the yarn application id ?
>
> On Mon, Jun 22, 2015 at 11:45 PM, roy <rp...@njit.edu> wrote:
>
>> Hi,
>>
>>   Is there a way to get Yarn application ID inside spark application, when
>> running spark Job on YARN ?
>>
>> Thanks
>>
>
