From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, November 26, 2014 10:03 PM
To: Naveen Kumar Pokala
Cc: user@spark.apache.org
Subject: Re: Spark Job submit
How about?
- Create a SparkContext
- setMaster as *yarn-cluster*
- Create a JavaSparkContext with the above SparkContext

And that will submit it to the yarn cluster.

Thanks
Best Regards

On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala
npok...@spcapitaliq.com wrote:

Hi,
Is there a way to submit a Spark job to a Hadoop YARN cluster from Java code?
-Naveen
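Akhil's steps, written out as a minimal Java sketch (the class and app name are placeholders; it assumes the Spark 1.x spark-core dependency on the classpath and HADOOP_CONF_DIR pointing at the cluster configuration, and note Sandy's caveat further down the thread about yarn-cluster vs yarn-client):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SubmitFromJava {
    public static void main(String[] args) {
        // setMaster lives on SparkConf; "yarn-cluster" was the Spark 1.x master string
        SparkConf conf = new SparkConf()
                .setAppName("submit-from-java")  // placeholder app name
                .setMaster("yarn-cluster");      // per the reply below, "yarn-client" is
                                                 // what actually works when set in code
        JavaSparkContext jsc = new JavaSparkContext(conf);
        // ... build RDDs and run jobs on jsc ...
        jsc.stop();
    }
}
```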
I think that actually would not work - yarn-cluster mode expects a specific
deployment path that uses SparkSubmit. Setting master as yarn-client should
work.
-Sandy
On Wed, Nov 26, 2014 at 8:32 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:

How about?
- Create a SparkContext
- setMaster as *yarn-cluster*
- Create a JavaSparkContext with the above SparkContext
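One way to drive the SparkSubmit deployment path Sandy mentions from Java, without depending on Spark classes at all, is to shell out to the spark-submit script. A sketch under that assumption (the main class, jar path, and master are placeholders; spark-submit must be on the PATH to actually launch):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class YarnSubmit {
    // Assemble a spark-submit command line; every argument here is a placeholder.
    static List<String> buildSubmitCommand(String master, String mainClass, String appJar) {
        return new ArrayList<>(Arrays.asList(
                "spark-submit",
                "--master", master,
                "--class", mainClass,
                appJar));
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildSubmitCommand(
                "yarn-cluster", "com.example.MyApp", "/path/to/app.jar");
        // prints: spark-submit --master yarn-cluster --class com.example.MyApp /path/to/app.jar
        System.out.println(String.join(" ", cmd));
        // To actually launch (requires spark-submit on PATH and HADOOP_CONF_DIR set):
        // int exit = new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```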
Sample code if you have any.
-Naveen
*From:* Akhil Das [mailto:ak...@sigmoidanalytics.com]
*Sent:* Wednesday, November 26, 2014 10:03 PM
*To:* Naveen Kumar Pokala
*Cc:* user@spark.apache.org
*Subject:* Re: Spark Job submit
How about?
- Create a SparkContext
- setMaster as *yarn-cluster*
- Create a JavaSparkContext with the above SparkContext
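For completeness, the standard non-programmatic route is the spark-submit CLI itself; a typical Spark 1.x YARN invocation looks like the following (class name, jar path, and resource sizes are placeholders):

```shell
# HADOOP_CONF_DIR must point at the cluster's Hadoop configuration.
spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  --num-executors 4 \
  --executor-memory 2g \
  /path/to/app.jar
```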