Cool.
On 29 Jun 2015 21:10, "郭谦" <buptguoq...@gmail.com> wrote:

> Akhil Das,
>
> You've given me a new idea for solving the problem.
>
> Vova provided me with a way to solve the problem just before:
>
> Vova Shelgunov<vvs...@gmail.com>
>
> Sample code for submitting a job from any other Java app, e.g. a servlet:
>
> http://pastebin.com/X1S28ivJ
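>
> (A rough sketch, not Vova's actual code since the pastebin isn't reproduced
> here: one way to launch a Spark application from another JVM process is the
> SparkLauncher API shipped with Spark 1.4 (org.apache.spark.launcher). The
> Spark home, jar path, main class, and memory setting below are placeholders,
> and the launcher still drives spark-submit under the hood, but your
> application code never has to invoke the script itself.)
>
>     import org.apache.spark.launcher.SparkLauncher
>
>     object SubmitFromApp {
>       def main(args: Array[String]): Unit = {
>         // Launch the packaged Spark application as a child process.
>         val sparkProcess = new SparkLauncher()
>           .setSparkHome("/opt/spark")                     // placeholder install dir
>           .setAppResource("/path/to/yourprojectjar_2.10-1.0.jar")
>           .setMainClass("com.example.YourSparkApp")       // placeholder main class
>           .setMaster("yarn-cluster")
>           .setConf("spark.executor.memory", "2g")
>           .launch()
>
>         // Block until the submitted application exits and report its status.
>         val exitCode = sparkProcess.waitFor()
>         println(s"Spark application finished with exit code $exitCode")
>       }
>     }
>
> A servlet could run the same launcher code from its request handler instead
> of a main method.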
>
> I appreciate your help.
>
> Thanks
> Best Regards
>
> 2015-06-29 15:24 GMT+08:00, Akhil Das <ak...@sigmoidanalytics.com>:
> > You can create a SparkContext in your program and run it as a standalone
> > application without using spark-submit.
> >
> > Here's something that will get you started:
> >
> >     import org.apache.spark.{SparkConf, SparkContext}
> >
> >     // Create the SparkContext
> >     val sconf = new SparkConf()
> >       .setMaster("spark://spark-ak-master:7077")
> >       .setAppName("Test")
> >       .set("spark.cores.max", "12")
> >       .set("spark.executor.memory", "12g")
> >
> >     val sc = new SparkContext(sconf)
> >
> >     // This needs to be added, or else it will throw class-not-found exceptions
> >     sc.addJar("target/scala-2.10/yourprojectjar_2.10-1.0.jar")
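> >
> > (As a small hypothetical continuation of the snippet above, you would then
> > run actions against the cluster and stop the context when finished:)
> >
> >     // Run a trivial job on the cluster, then release its resources.
> >     val count = sc.parallelize(1 to 1000).count()
> >     println("count = " + count)
> >     sc.stop()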
> >
> > Thanks
> > Best Regards
> >
> > On Sun, Jun 28, 2015 at 1:26 PM, 郭谦 <buptguoq...@gmail.com> wrote:
> >
> >> Hi,
> >>
> >> I'm a junior Spark user from China.
> >>
> >> I have a question about submitting Spark jobs: I want to submit a job
> >> from code.
> >>
> >> In other words, "How can I submit a Spark job to a YARN cluster from
> >> within a Java program, without using spark-submit?"
> >>
> >>
> >>        I've learnt from the official site
> >> http://spark.apache.org/docs/latest/submitting-applications.html
> >>
> >> that using the bin/spark-submit script to submit a job to a cluster is easy.
> >>
> >>
> >>        The script does a lot of complex work, such as setting up
> >> the classpath with Spark and its dependencies.
> >>
> >> If I don't use the script, I have to handle all of that complexity
> >> myself, which is really frustrating.
> >>
> >>
> >>        I have searched for this problem on Google, but the answers don't
> >> quite fit my case.
> >>
> >>
> >>        In Hadoop development, I know that after setting up the
> >> Configuration, the Job, and the resources,
> >>
> >> we can submit a Hadoop job in code like this:
> >>
> >> job.waitForCompletion(true)
> >>
> >> It is convenient for users to submit jobs programmatically.
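> >>
> >> (For comparison, a minimal sketch of that Hadoop pattern; the job name is
> >> a placeholder and the mapper/reducer and path setup is elided:)
> >>
> >>     import org.apache.hadoop.conf.Configuration
> >>     import org.apache.hadoop.mapreduce.Job
> >>
> >>     val conf = new Configuration()
> >>     val job  = Job.getInstance(conf, "example-job")
> >>     // ... set mapper/reducer classes, input and output paths ...
> >>     // Blocks until the job finishes; 'true' prints progress to the console.
> >>     val succeeded = job.waitForCompletion(true)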
> >>
> >>
> >> I want to know whether there is a plan (maybe for Spark 1.5+?) to provide
> >> users with a variety of ways to submit jobs, as Hadoop does.
> >>
> >> Similarly for monitoring: in the recent Spark release (1.4.0), we can
> >> already get status information about Spark applications through the REST API.
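> >>
> >> (For example, listing the running applications is a single HTTP GET; the
> >> host below is a placeholder, and the driver UI serves the monitoring API
> >> on port 4040 by default:)
> >>
> >>     import scala.io.Source
> >>
> >>     // Fetch the JSON list of applications from the monitoring REST API.
> >>     val apps = Source.fromURL("http://driver-host:4040/api/v1/applications").mkString
> >>     println(apps)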
> >>
> >>
> >> Thanks & Regards
> >>
> >> GUO QIAN
> >
>
