I built Spark 1.2.0 successfully, but was unable to build my Spark program
under 1.2.0 with sbt assembly and my build.sbt file. In its
libraryDependencies I tried:
    "org.apache.spark" %% "spark-sql" % "1.2.0",
    "org.apache.spark" %% "spark-core" % "1.2.0",

and

    "org.apache.spark" %% "spark-sql" % "1.2.0-SNAPSHOT",
    "org.apache.spark" %% "spark-core" % "1.2.0-SNAPSHOT",

but I get errors like:
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-sql_2.10;1.2.0: not found
[warn] :: org.apache.spark#spark-core_2.10;1.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::

sbt.ResolveException: unresolved dependency:
org.apache.spark#spark-sql_2.10;1.2.0: not found
unresolved dependency: org.apache.spark#spark-core_2.10;1.2.0: not found
...
[error] (*:update) sbt.ResolveException: unresolved dependency:
org.apache.spark#spark-sql_2.10;1.2.0: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.10;1.2.0: not
found
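
For reference, the rest of my build.sbt is roughly the following (the
project name, version, and exact Scala patch version are placeholders;
the 2.10 part matches the _2.10 suffix sbt is resolving against):

    name := "my-spark-app"

    version := "0.1-SNAPSHOT"

    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.2.0",
      "org.apache.spark" %% "spark-sql"  % "1.2.0"
    )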

Do I need to configure my build.sbt to point to my locally built Spark
1.2.0 repository? If so, how?
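
My guess is that I first need to publish my local Spark build (e.g. with
sbt publishLocal or mvn install in the Spark source tree) and then add a
resolver along these lines, but I'm not sure that is right:

    // Guess: make sbt look in the local Maven and Ivy repositories,
    // where a locally published Spark 1.2.0 build would end up.
    resolvers ++= Seq(
      Resolver.mavenLocal,    // ~/.m2/repository (after mvn install)
      Resolver.defaultLocal   // ~/.ivy2/local (after sbt publishLocal)
    )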

Thanks,
- Arun
