Can this be done?  Can I just spin up a SparkContext programmatically, point 
it at my YARN cluster, and have it work like spark-submit?  Doesn't (at least) 
the application JAR need to be distributed to the workers via HDFS or the like 
for the jobs to run?
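If it can, I'd expect it to look something like the sketch below: build a 
SparkConf that names the master and lists the application jar, so the driver 
ships the jar to the executors itself (standalone mode assumed; the master URL 
and jar path here are placeholders, and yarn-cluster mode has extra constraints):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: programmatic submission without spark-submit.
// setJars makes the driver serve the listed jar to the executors over
// Spark's internal file server, so standalone mode needs no HDFS upload.
val conf = new SparkConf()
  .setAppName("NetworkCount")
  .setMaster("spark://abc.test.com:7077")   // placeholder master URL
  .setJars(Seq("target/simple-project-1.0-jar-with-dependencies.jar"))
val sc = new SparkContext(conf)
```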

mn

> On Oct 28, 2014, at 2:29 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> 
> How about running it directly?
> 
>     import org.apache.spark.streaming.{Seconds, StreamingContext}
> 
>     val ssc = new StreamingContext("local[2]", "Network WordCount", Seconds(5),
>       "/home/akhld/mobi/localclusterxx/spark-1")
> 
> 
>     val lines = ssc.socketTextStream("localhost", 12345)
> 
>     val words = lines.flatMap(_.split(" "))
>     val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
>     wordCounts.print()
> 
>     ssc.start()
>     ssc.awaitTermination()
> 
> Thanks
> Best Regards
> 
> On Tue, Oct 28, 2014 at 1:50 PM, sivarani <whitefeathers...@gmail.com> wrote:
> Hi,
> 
> I am submitting my Spark application as follows:
> 
> bin/spark-submit --class "NetworkCount" --master spark://abc.test.com:7077 \
>   try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar
> 
> But is there any other way to submit a Spark application through code?
> 
> For example, I am checking a condition, and if it is true I want to run the
> Spark application:
> 
> if (isConditionTrue) {
>    runSpark("NetworkCount", "masterurl", "jar")
> }
> 
> I am aware we can set the jar and master URL on the SparkContext, but how do I
> run it from code automatically when the condition becomes true, without
> actually using spark-submit?
> 
> Is it possible?
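> Concretely, here is a sketch of the kind of helper I mean ("runSpark" is a
> hypothetical wrapper, not an existing Spark API), assuming standalone mode:
> 
> ```scala
> import org.apache.spark.{SparkConf, SparkContext}
> 
> // Hypothetical wrapper: run the job in-process instead of via spark-submit.
> def runSpark(appName: String, masterUrl: String, jarPath: String): Unit = {
>   val conf = new SparkConf()
>     .setAppName(appName)
>     .setMaster(masterUrl)
>     .setJars(Seq(jarPath))   // ship the application jar to the executors
>   val sc = new SparkContext(conf)
>   try {
>     // ... the actual job, e.g. sc.textFile("...").count()
>   } finally {
>     sc.stop()                // always release the application's resources
>   }
> }
> 
> val isConditionTrue = true   // stand-in for the real check
> if (isConditionTrue) {
>   runSpark("NetworkCount", "spark://abc.test.com:7077",
>     "simple-project-1.0-jar-with-dependencies.jar")
> }
> ```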
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Submiting-Spark-application-through-code-tp17452.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
