After cloning today's version of spark-master, I ran the following command:
S:\spark-master>sbt ./build/sbt -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0
-Phive -Phive-thriftserver clean package
with the intention of building both the source and test projects and
generating the corresponding .jar files.
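For comparison, my understanding of the Spark "Building Spark" docs (a guess on my part, not something I have verified on Windows) is that build/sbt is itself the launcher script, so it would be run directly rather than passed as an argument to a separately installed sbt:

```shell
# Sketch of the invocation as I read the Spark build docs; the profiles and
# Hadoop version match my attempt above. build/sbt is a bash script, so on
# Windows it would presumably need a bash environment (e.g. Git Bash/Cygwin).
./build/sbt -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0 -Phive -Phive-thriftserver clean package
```

I may well have mangled this on Windows, which could explain the parse errors below.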

The script started normally, but ultimately failed with the following log
excerpt:
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m;
support was removed in 8.0
[info] Loading project definition from S:\spark-master\project
[info] Resolving key references (16939 settings) ...
[info] Set current project to spark-parent (in build file:/S:/spark-master/)
[error] Expected letter
[error] Expected symbol
[error] Expected '!'
[error] Expected '+'
[error] Expected '++'
[error] Expected '^'
[error] Expected '^^'
[error] Expected 'debug'
[error] Expected 'info'
[error] Expected 'warn'
[error] Expected 'error'
[error] Expected ';'
[error] Expected end of input.
[error] Expected 'early('
[error] Expected '-'
[error] Expected '--'
[error] Expected 'show'
[error] Expected 'all'
[error] Expected '*'
[error] Expected '{'
[error] Expected project ID
[error] Expected configuration
[error] Expected key
[error] ./build/sbt
[error] ^ 


I tried to follow the instructions found at
http://www.sparktutorials.net/building-apache-spark-on-your-local-machine as
best I could, but I don't know how to interpret the error or where to begin
troubleshooting.

I'm using Eclipse as my IDE, so both Scala and Java seem to be set up
properly. After running out of options, I simply ran
S:\spark-master>sbt compile, which failed with these errors (I don't know
whether this is relevant):
[error] (core/compile:managedResources) java.io.IOException: Cannot run
program "bash": CreateProcess error=2, The system cannot find the file
specified
[error] (network-common/compile:compileIncremental) java.io.IOException:
Cannot run program "S:\Program Files\Java\bin\javac" (in directory
"S:\spark-master"): CreateProcess error=2, The system cannot find the file
specified
[error] (tags/compile:compileIncremental) java.io.IOException: Cannot run
program "S:\Program Files\Java\bin\javac" (in directory "S:\spark-master"):
CreateProcess error=2, The system cannot find the file specified

Note that javac is actually located in S:\Program Files\Java\jdk1.8.0_77\bin,
not at the path the build is trying to run.
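If it's relevant: my guess is that the build derives the javac path from JAVA_HOME, which on my machine would need to point at the JDK root. Something like this untested sketch (Windows cmd, paths from my machine):

```shell
:: Point JAVA_HOME at the JDK root so that %JAVA_HOME%\bin\javac resolves to
:: the real compiler, and put that bin directory first on PATH.
set "JAVA_HOME=S:\Program Files\Java\jdk1.8.0_77"
set "PATH=%JAVA_HOME%\bin;%PATH%"
```

This might also explain why the build looked for javac in S:\Program Files\Java\bin, but I haven't confirmed that.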

So, I would appreciate help building and packaging the source and test
components of Spark.

--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
