Hi Jeff,

Thanks for your info! I am developing a workflow system based on Oozie, but it 
only supports Java and MapReduce jobs right now, so I first want to run Spark 
jobs in local mode from the workflow system, and then extend the workflow 
system to run Spark jobs on YARN.
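
For concreteness, here is a rough sketch of the kind of job I mean (the class, 
app, and jar names are placeholders). The master is hard-coded to local mode, 
so the job can be started with plain java -cp as long as the Spark jars are on 
the classpath:

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalJob {
      def main(args: Array[String]): Unit = {
        // Hard-code local mode so no cluster manager is needed;
        // the workflow system can then launch this with "java -cp".
        val conf = new SparkConf()
          .setAppName("workflow-local-job")
          .setMaster("local[*]")
        val sc = new SparkContext(conf)
        val sum = sc.parallelize(1 to 100).reduce(_ + _)
        println("sum = " + sum)
        sc.stop()
      }
    }

    // Launched roughly like this (jar names are placeholders):
    //   java -cp myjob.jar:spark-assembly.jar LocalJob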

Best wishes,
Fei

 
> On Mar 29, 2016, at 3:47 AM, Jeff Zhang <zjf...@gmail.com> wrote:
> 
> Yes, you can. But that is essentially what spark-submit does for you, and 
> spark-submit actually does more than that.
> You can refer to 
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
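> 
> If you want to start jobs programmatically without hand-rolling the 
> classpath yourself, one option is the SparkLauncher API (the paths below are 
> placeholders); it forks a spark-submit process under the hood:
> 
>     import org.apache.spark.launcher.SparkLauncher
> 
>     object Launch {
>       def main(args: Array[String]): Unit = {
>         // SparkLauncher forks a spark-submit process, so the
>         // classpath and config handling are done for you.
>         val app = new SparkLauncher()
>           .setSparkHome("/path/to/spark")        // placeholder path
>           .setAppResource("/path/to/myjob.jar")  // placeholder path
>           .setMainClass("LocalJob")
>           .setMaster("local[*]")
>           .launch()
>         app.waitFor()
>       }
>     }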
> 
> What's your purpose for using "java -cp"? For local development, the IDE 
> should be sufficient.
> 
> On Tue, Mar 29, 2016 at 12:26 PM, Fei Hu <hufe...@gmail.com> wrote:
> Hi,
> 
> I am wondering how to run a Spark job with the plain java command, e.g.: 
> java -cp spark.jar mainclass. When running/debugging a Spark program in 
> IntelliJ IDEA, the IDE uses the java command to run the Spark main class, so 
> I think it should also be possible to run a Spark job with the java command 
> in addition to spark-submit.
> 
> Thanks in advance,
> Fei
> 
> -- 
> Best Regards
> 
> Jeff Zhang
