Threads like these are great candidates to be part of the "Contributors
guide". I will create a JIRA to update the guide with material from past
threads like these.

Sujeet


On Mon, May 19, 2014 at 7:10 PM, Patrick Wendell <pwend...@gmail.com> wrote:

> Whenever we publish a release candidate, we create a temporary Maven
> repository that hosts the artifacts. We do this precisely for the case
> you are running into (where a user wants to build an application
> against it to test).
>
> You can build against the release candidate by just adding that
> repository to your sbt build, then linking against "spark-core"
> version "1.0.0". For rc9, the repository is in the vote e-mail:
>
>
> http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-0-0-rc9-td6629.html
>
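> For example, a minimal build.sbt sketch (the staging repository name and
> URL below are placeholders; substitute the actual URL from the vote
> e-mail):
>
>   // staging repository that hosts the RC artifacts (placeholder URL)
>   resolvers += ("Spark 1.0.0 RC staging" at
>     "https://repository.apache.org/content/repositories/orgapachespark-XXXX/")
>
>   // link against the RC version of spark-core
>   libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
>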
> On Mon, May 19, 2014 at 7:03 PM, Mark Hamstra <m...@clearstorydata.com>
> wrote:
> > That's the crude way to do it.  If you run `sbt/sbt publishLocal`, then
> > you can resolve the artifact from your local cache in the same way that
> > you would resolve it if it were deployed to a remote repository.  That's
> > just the build step.  Actually running the application will require the
> > necessary jars to be accessible to the cluster nodes.
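> >
> > For example, roughly (the version string below is an assumption; it
> > should match whatever version the local Spark build publishes):
> >
> >   // in the Spark checkout, publish to the local Ivy cache:
> >   //   sbt/sbt publishLocal
> >   // then, in your application's build.sbt, sbt resolves the locally
> >   // published artifact from ~/.ivy2/local automatically:
> >   libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"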
> >
> >
> > On Mon, May 19, 2014 at 7:04 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> >
> >> Well, you have to put spark-assembly-*.jar into the lib directory of
> >> your application.
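> >>
> >> For example, in an sbt project (this setting just spells out the sbt
> >> default; jars dropped into lib/ are already picked up as unmanaged
> >> dependencies):
> >>
> >>   // sbt 0.13 syntax; lib/ is the default location for unmanaged jars
> >>   unmanagedBase := baseDirectory.value / "lib"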
> >>
> >> Best,
> >>
> >> --
> >> Nan Zhu
> >>
> >>
> >> On Monday, May 19, 2014 at 9:48 PM, nit wrote:
> >>
> >> > I am not very comfortable with sbt. I want to build a standalone
> >> > application using Spark 1.0 RC9. I can build an sbt assembly for my
> >> > application with Spark 0.9.1, and I think in that case Spark is pulled
> >> > from the Akka Repository?
> >> >
> >> > Now, if I want to use 1.0 RC9 for my application, what is the process?
> >> > (FYI, I was able to build spark-1.0 via sbt/sbt assembly and I can see
> >> > the spark-assembly jar; I think I will have to copy that jar somewhere
> >> > and update build.sbt?)
> >> >
> >> > PS: I am not sure if this is the right place for this question, but
> >> > since 1.0 is still an RC, I felt this may be the appropriate forum.
> >> >
> >> > Thanks!
> >> >
> >> >
> >> >
> >> > --
> >> > View this message in context:
> >> > http://apache-spark-developers-list.1001551.n3.nabble.com/spark-1-0-standalone-application-tp6698.html
> >> > Sent from the Apache Spark Developers List mailing list archive at
> >> > Nabble.com.
> >> >
> >> >
> >>
> >>
> >>
>
