I am trying to bring up a standalone cluster on Windows 8.1/Windows Server 2013
and I am having trouble getting the make-distribution.sh script to complete
successfully. It seems to change a bunch of permissions under /dist and then
fails when it tries to write there. I assume this is not expected behavior.
Any thoughts are greatly appreciated.
Here is my workflow (and results):
Building/Making a Spark distribution
1. Install Cygwin (64-bit)
2. Install JDK 7u45
a. Add JDK to PATH (i.e. set PATH=%PATH%;C:\Program Files\Java\jdk1.7.0_45\bin)
3. Install Git 1.8.5.1
4. Download Spark 0.8.0-incubating
5. Right-click on downloaded spark file, go to properties, and click on
"Unblock"
6. Extract to d:\spark
11. Open a DOS command prompt as administrator
12. c:\> cd /d d:\spark
13. d:\spark> make-distribution.sh
Welcome to Git (version 1.8.4-preview20130916)
Run 'git help git' to display the help index.
Run 'git help <command>' to display help for specific commands.
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Making distribution for Spark Could in /d/spark/dist...
Hadoop version set to 1.0.4
YARN disabled
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
rm: cannot change to directory `/d/spark/dist/bin': Permission denied
rm: cannot change to directory `/d/spark/dist/python': Permission denied
rm: cannot remove directory `/d/spark/dist': Directory not empty
cp: cannot stat `/d/spark/assembly/target/scala*/*assembly*hadoop*.jar': No such file or directory
cp: cannot stat `/d/spark/conf/*.template': No such file or directory
cp: cannot create regular file `/d/spark/dist/bin/compute-classpath.cmd': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/compute-classpath.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/slaves.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/spark-config.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/spark-daemon.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/spark-daemons.sh': Permission denied
.
.
.
cp: cannot create regular file `/d/spark/dist/python/pyspark/__init__.py': Permission denied
cp: cannot create regular file `/d/spark/dist/python/run-tests': Permission denied
cp: cannot create regular file `/d/spark/dist/python/test_support/hello.txt': Permission denied
cp: cannot create regular file `/d/spark/dist/python/test_support/userlib-0.1-py2.7.egg': Permission denied
cp: cannot create regular file `/d/spark/dist/python/test_support/userlibrary.py': Permission denied
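One workaround I'm considering for the permission errors above: the earlier failed run seems to have left /dist with modes Cygwin can no longer write through, so restoring owner permissions from a Cygwin shell and deleting the directory before re-running might help (the path is from my setup above; this is an untested guess on my part):

```shell
# From a Cygwin shell: restore owner read/write/traverse on the broken
# dist/ tree, then remove it so make-distribution.sh can recreate it.
DIST=/d/spark/dist          # path from my setup
if [ -d "$DIST" ]; then
  chmod -R u+rwX "$DIST"    # capital X: add execute/traverse only on directories
  rm -rf "$DIST"
fi
```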
14. d:\spark> sbt\sbt assembly
...
[info] Compiling 10 Scala sources to D:\spark\repl\target\scala-2.9.3\classes...
[info] Compiling 39 Scala sources and 13 Java sources to D:\spark\examples\target\scala-2.9.3\classes...
sbt appears to be exiting abnormally.
The log file for this session is at C:\Users\adribona\AppData\Local\Temp\sbt7403459132156612750.log
java.lang.OutOfMemoryError: PermGen space
at java.util.concurrent.FutureTask.report(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at sbt.ConcurrentRestrictions$$anon$4.take(ConcurrentRestrictions.scala:196)
at sbt.Execute.next$1(Execute.scala:85)
at sbt.Execute.processAll(Execute.scala:88)
at sbt.Execute.runKeep(Execute.scala:68)
at sbt.EvaluateTask$.run$1(EvaluateTask.scala:162)
at sbt.EvaluateTask$.runTask(EvaluateTask.scala:177)
at sbt.Aggregation$$anonfun$4.apply(Aggregation.scala:46)
at sbt.Aggregation$$anonfun$4.apply(Aggregation.scala:44)
at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:137)
at sbt.Aggregation$.runTasksWithResult(Aggregation.scala:44)
at sbt.Aggregation$.runTasks(Aggregation.scala:59)
at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:31)
at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:30)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
at sbt.Command$.process(Command.scala:90)
at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(MainLoop.scala:71)
at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(MainLoop.scala:71)
at sbt.State$$anon$2.process(State.scala:171)
at sbt.MainLoop$$anonfun$next$1.apply(MainLoop.scala:71)
at sbt.MainLoop$$anonfun$next$1.apply(MainLoop.scala:71)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.MainLoop$.next(MainLoop.scala:71)
at sbt.MainLoop$.run(MainLoop.scala:64)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:53)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:50)
at sbt.Using.apply(Using.scala:25)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:50)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:33)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:17)
Error during sbt execution: java.lang.OutOfMemoryError: PermGen space
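A side note on the OutOfMemoryError above: PermGen is sized separately from the main Java heap, so -Xmx by itself does not normally raise it. On JDK 7 the permanent generation is capped by -XX:MaxPermSize, and _JAVA_OPTIONS is picked up by every JVM that starts, so something like the following might be a more targeted setting than what I actually tried below (I have only tested the plain -Xmx variant):

```shell
rem DOS prompt: every JVM that sbt spawns picks these options up via _JAVA_OPTIONS
set _JAVA_OPTIONS=-Xmx512m -XX:MaxPermSize=256m
```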
15. d:\spark> set _JAVA_OPTIONS="-Xmx512M"
16. d:\spark> sbt\sbt assembly
...
[info] SHA-1: 60c513d9b066c9e7a9b51d6282a8e4f9f55b62e6
[info] Packaging D:\spark\examples\target\scala-2.9.3\spark-examples-assembly-0.8.0-incubating.jar
...
[info] Done packaging.
[success] Total time: 319 s, completed Dec 4, 2013 12:44:18 PM
17. d:\spark> make-distribution.sh
...
[warn] Strategy 'concat' was applied to a file
[warn] Strategy 'discard' was applied to 2 files
[warn] Strategy 'first' was applied to 212 files
[info] Checking every *.class/*.jar file's SHA-1.
[info] Assembly up to date:
D:\spark\assembly\target\scala-2.9.3\spark-assembly-0.8.0-incubating-hadoop1.0.4.jar
[success] Total time: 312 s, completed Dec 4, 2013 1:02:02 PM
rm: cannot change to directory `/d/spark/dist/bin': Permission denied
rm: cannot change to directory `/d/spark/dist/python': Permission denied
rm: cannot remove directory `/d/spark/dist': Directory not empty
cp: cannot stat `/d/spark/conf/*.template': No such file or directory
cp: cannot create regular file `/d/spark/dist/bin/compute-classpath.cmd': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/compute-classpath.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/slaves.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/spark-config.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/spark-daemon.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/spark-daemons.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/start-all.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/start-master.sh': Permission denied
cp: cannot create regular file `/d/spark/dist/bin/start-slave.sh': Permission denied
.
.
.
cp: cannot create regular file `/d/spark/dist/python/pyspark/worker.py': Permission denied
cp: cannot create regular file `/d/spark/dist/python/pyspark/__init__.py': Permission denied
cp: cannot create regular file `/d/spark/dist/python/run-tests': Permission denied
cp: cannot create regular file `/d/spark/dist/python/test_support/hello.txt': Permission denied
cp: cannot create regular file `/d/spark/dist/python/test_support/userlib-0.1-py2.7.egg': Permission denied
cp: cannot create regular file `/d/spark/dist/python/test_support/userlibrary.py': Permission denied
18. d:\spark> cd dist\bin
Access is denied.
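For this final "Access is denied" (even from an administrator prompt), my working theory is that the failed runs left NTFS ACLs on d:\spark\dist that deny access outright, so Windows' own tools may be needed to take ownership and reset the ACLs before the directory is usable again (another untested guess; path from my setup):

```shell
rem Elevated DOS prompt: take ownership recursively, then reset ACLs to inherited defaults
takeown /f d:\spark\dist /r /d y
icacls d:\spark\dist /reset /t
```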
--Adrian