This error message is actually spurious: the Maven build no longer requires SCALA_HOME, but the check was still in place. This was fixed recently in master:
https://github.com/apache/spark/commit/d8c005d5371f81a2a06c5d27c7021e1ae43d7193

I can backport that fix into branch-1.0 so it will be in 1.0.1 as well. For
other people running into this: you can export SCALA_HOME to any value and
the build will work.

- Patrick

On Sat, May 31, 2014 at 8:34 PM, Colin McCabe <[email protected]> wrote:
> Spark currently supports two build systems, sbt and Maven. sbt will
> download the correct version of Scala, but with Maven you need to supply
> it yourself and set SCALA_HOME.
>
> It sounds like the instructions need to be updated -- perhaps create a
> JIRA?
>
> best,
> Colin
>
>
> On Sat, May 31, 2014 at 7:06 PM, Soren Macbeth <[email protected]> wrote:
>
>> Hello,
>>
>> Following the instructions for building Spark 1.0.0, I encountered the
>> following error:
>>
>> [ERROR] Failed to execute goal
>> org.apache.maven.plugins:maven-antrun-plugin:1.7:run (default) on project
>> spark-core_2.10: An Ant BuildException has occured: Please set the
>> SCALA_HOME (or SCALA_LIBRARY_PATH if scala is on the path) environment
>> variables and retry.
>> [ERROR] around Ant part ...<fail message="Please set the SCALA_HOME (or
>> SCALA_LIBRARY_PATH if scala is on the path) environment variables and
>> retry.">... @ 6:126 in
>> /Users/soren/src/spark-1.0.0/core/target/antrun/build-main.xml
>>
>> Nowhere in the documentation does it mention that Scala needs to be
>> installed or that either of these env vars must be set, nor what version
>> should be installed. Setting these env vars wasn't required for 0.9.1
>> with sbt.
>>
>> I was able to get past it by downloading the Scala 2.10.4 binary package
>> to a temp dir and setting SCALA_HOME to that dir.
>>
>> Ideally, it would be nice not to require people to have a standalone
>> Scala installation, but at a minimum this requirement should be
>> documented in the build instructions, no?
>>
>> -Soren
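For anyone hitting this before the fix lands, the workaround described above
boils down to a couple of shell commands. This is a minimal sketch: the Scala
download URL and paths are illustrative, and per Patrick's note the antrun
check only requires that SCALA_HOME be set, not that it point at a real
installation:

    # Quick workaround: the check only tests that SCALA_HOME is set,
    # so any value satisfies it.
    export SCALA_HOME=/tmp
    mvn -DskipTests clean package

    # Or, as Soren did, point SCALA_HOME at an actual Scala 2.10.4
    # binary package (download URL shown here is illustrative):
    curl -O https://www.scala-lang.org/files/archive/scala-2.10.4.tgz
    tar -xzf scala-2.10.4.tgz -C /tmp
    export SCALA_HOME=/tmp/scala-2.10.4
    mvn -DskipTests clean package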
