Hey Arun,

Since your build depends on unpublished Spark artifacts (1.2.0-SNAPSHOT),
you'll need to build Spark first and "publish-local" so your application
build can find those SNAPSHOTs in your local repository.

Just append "publish-local" to the sbt command you use to build Spark.
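For example (a rough sketch, assuming you're using the sbt/sbt script in the
Spark checkout; "publishLocal" is the camelCase equivalent on newer sbt
versions, and any build profile flags depend on your environment):

    sbt/sbt assembly publish-local

That publishes the 1.2.0-SNAPSHOT artifacts to your local Ivy repository
(~/.ivy2/local), which sbt checks by default, so you shouldn't need to add
an extra resolver to your build.sbt. Your dependencies should then reference
the SNAPSHOT version, e.g.:

    "org.apache.spark" %% "spark-core" % "1.2.0-SNAPSHOT",
    "org.apache.spark" %% "spark-sql" % "1.2.0-SNAPSHOT",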

-Pat



On Wed, Oct 8, 2014 at 5:35 PM, Arun Luthra <arun.lut...@gmail.com> wrote:

> I built Spark 1.2.0 successfully, but was unable to build my Spark program
> against it with sbt assembly. In my build.sbt, I tried:
>
>     "org.apache.spark" %% "spark-sql" % "1.2.0",
>     "org.apache.spark" %% "spark-core" % "1.2.0",
>
> and
>
>     "org.apache.spark" %% "spark-sql" % "1.2.0-SNAPSHOT",
>     "org.apache.spark" %% "spark-core" % "1.2.0-SNAPSHOT",
>
> but I get errors like:
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn] ::          UNRESOLVED DEPENDENCIES         ::
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn] :: org.apache.spark#spark-sql_2.10;1.2.0: not found
> [warn] :: org.apache.spark#spark-core_2.10;1.2.0: not found
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>
> sbt.ResolveException: unresolved dependency:
> org.apache.spark#spark-sql_2.10;1.2.0: not found
> unresolved dependency: org.apache.spark#spark-core_2.10;1.2.0: not found
> ...
> [error] (*:update) sbt.ResolveException: unresolved dependency:
> org.apache.spark#spark-sql_2.10;1.2.0: not found
> [error] unresolved dependency: org.apache.spark#spark-core_2.10;1.2.0: not found
>
> Do I need to configure my build.sbt to point to my local Spark 1.2.0
> repository? How?
>
> Thanks,
> - Arun
>
