spark-submit --master yarn-cluster

Look at the docs for more details.
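
For instance, a typical invocation might look like the following. This is a hypothetical sketch: the class name, JAR path, and resource settings are placeholders, and on Spark 1.x `--master yarn-cluster` is shorthand for running the driver inside the YARN cluster.

```shell
# Hypothetical example: submit an application JAR to YARN in cluster mode.
# com.example.MyJob and the JAR path are placeholders for your own app.
spark-submit \
  --master yarn-cluster \
  --class com.example.MyJob \
  --num-executors 4 \
  --executor-memory 2g \
  /path/to/my-job-assembly.jar arg1 arg2
```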
On Dec 17, 2015 5:00 PM, "Forest Fang" <forest.f...@outlook.com> wrote:

> Maybe I'm not understanding your question correctly, but would it be
> possible for you to piece together your job submission information as if
> you were invoking spark-submit? If so, you could just call
> org.apache.spark.deploy.SparkSubmit and pass your regular spark-submit
> arguments.
>
> This is how I do it with my sbt plugin which allows you to codify a
> spark-submit command in sbt build so the JAR gets automatically rebuilt and
> potentially redeployed every time you submit a Spark job using a custom sbt
> task:
> https://github.com/saurfang/sbt-spark-submit/blob/master/src/main/scala/sbtsparksubmit/SparkSubmitPlugin.scala#L85
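>
> A minimal sketch of that approach, assuming the Spark assembly is on your
> classpath (the class and JAR names below are hypothetical placeholders):

```scala
// Sketch: invoke SparkSubmit programmatically with the same arguments
// you would pass on the command line. Note that SparkSubmit.main may
// call System.exit on failure, so run it in a separate process or JVM
// if that matters to you.
import org.apache.spark.deploy.SparkSubmit

object ProgrammaticSubmit {
  def main(args: Array[String]): Unit = {
    SparkSubmit.main(Array(
      "--master", "yarn-cluster",
      "--class", "com.example.MyJob",   // assumed application entry point
      "--name", "my-job",
      "/path/to/my-job-assembly.jar"    // assumed application JAR
    ))
  }
}
```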
>
>
> ------------------------------
> Subject: Re: How to submit spark job to YARN from scala code
> From: ste...@hortonworks.com
> CC: user@spark.apache.org
> Date: Thu, 17 Dec 2015 19:45:16 +0000
>
>
> On 17 Dec 2015, at 16:50, Saiph Kappa <saiph.ka...@gmail.com> wrote:
>
> Hi,
>
> Since it is not currently possible to submit a Spark job to a Spark
> cluster running in standalone mode (the cluster deploy mode cannot
> currently be specified from within the code), can I do it with YARN?
>
> I tried to do something like this (but in scala):
>
> «
>
> ... // Client object - main method
> System.setProperty("SPARK_YARN_MODE", "true")
> val sparkConf = new SparkConf()
> try {
>   val args = new ClientArguments(argStrings, sparkConf)
>   new Client(args, sparkConf).run()
> } catch {
>   case e: Exception => {
>     Console.err.println(e.getMessage)
>     System.exit(1)
>   }
> }
> System.exit(0)
>
> » in http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/
>
> However, it is not possible to create a new instance of Client, since
> org.apache.spark.deploy.yarn.Client is private to the spark package.
>
>
> The standard way to work around a problem like this is to place your code
> in a package which has access. File a JIRA asking for a public API too,
> one that doesn't require you to set system properties as a way of passing
> parameters down.
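>
> A sketch of that workaround, declaring a wrapper inside the
> org.apache.spark.deploy.yarn package so it can see the package-private
> Client (the object name here is hypothetical):

```scala
// Sketch, assuming Spark 1.x on the classpath: code compiled into this
// package gains access to the package-private yarn Client.
package org.apache.spark.deploy.yarn

import org.apache.spark.SparkConf

object YarnClientLauncher {
  def submit(argStrings: Array[String]): Unit = {
    // SPARK_YARN_MODE is still passed via a system property here, which
    // is exactly what a public API should make unnecessary.
    System.setProperty("SPARK_YARN_MODE", "true")
    val sparkConf = new SparkConf()
    val args = new ClientArguments(argStrings, sparkConf)
    new Client(args, sparkConf).run()
  }
}
```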
>
>
> Is there any way I can submit Spark jobs from code in cluster mode
> without using the spark-submit script?
>
> Thanks.
>
>
>
