Or you can use Livy to submit Spark jobs:

http://livy.io/



Linyuxin <linyu...@huawei.com> wrote on Mon, Dec 26, 2016 at 10:32 AM:

> Thanks.
>
>
>
> *From:* Naveen [mailto:hadoopst...@gmail.com]
> *Sent:* December 25, 2016 0:33
> *To:* Linyuxin <linyu...@huawei.com>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: Re: submit spark task on yarn asynchronously via java?
>
>
>
> Hi,
>
> Please use the SparkLauncher API class and invoke it asynchronously using
> Futures.
>
> Using SparkLauncher, you can specify the main class name, application
> resource, arguments to be passed to the driver, deploy mode, etc.
>
> I would suggest using Scala's Future, if Scala code is possible.
>
>
>
> https://spark.apache.org/docs/1.5.1/api/java/org/apache/spark/launcher/SparkLauncher.html
> https://docs.oracle.com/javase/7/docs/api/java/util/concurrent/Future.html
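> For example, a minimal sketch in Java wrapping SparkLauncher in a Future
> (the jar path and main class below are placeholders; exact setters may
> differ slightly between Spark versions):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import org.apache.spark.launcher.SparkLauncher;

public class AsyncSparkSubmit {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Launch spark-submit on a background thread; the Future resolves to
        // the launcher process's exit code once the application finishes.
        Future<Integer> exitCode = pool.submit(() -> {
            Process spark = new SparkLauncher()
                .setAppResource("/path/to/my-app.jar")  // placeholder jar path
                .setMainClass("com.example.MyApp")      // placeholder main class
                .setMaster("yarn-cluster")
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .launch();
            return spark.waitFor();
        });

        // ... the caller is free to do other work here ...

        System.out.println("spark-submit exited with " + exitCode.get());
        pool.shutdown();
    }
}
```

> Note that launch() only tracks the local spark-submit process; in
> yarn-cluster mode, monitoring the application itself means polling YARN
> (e.g. via its REST API) with the application id.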
>
>
> On Fri, Dec 23, 2016 at 7:10 AM, Linyuxin <linyu...@huawei.com> wrote:
>
> Hi,
>
> Could Anybody help?
>
>
>
> *From:* Linyuxin
> *Sent:* December 22, 2016 14:18
> *To:* user <user@spark.apache.org>
> *Subject:* submit spark task on yarn asynchronously via java?
>
>
>
> Hi All,
>
>
>
> Version:
>
> Spark 1.5.1
>
> Hadoop 2.7.2
>
>
>
> Is there any way to submit and monitor spark task on yarn via java
> asynchronously?
>
>
>
>
>
>
>
