./bin/compute-classpath.sh fails with the following error:

$> jar -tf \
     assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar \
     nonexistent/class/path
java.util.zip.ZipException: invalid CEN header (bad signature)
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:132)
        at java.util.zip.ZipFile.<init>(ZipFile.java:93)
        at sun.tools.jar.Main.list(Main.java:997)
        at sun.tools.jar.Main.run(Main.java:242)
        at sun.tools.jar.Main.main(Main.java:1167)
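
For context, the failing command above is the Java-compatibility check that
compute-classpath.sh runs against the assembly jar. Pulled out of the script,
it boils down to roughly the following (paraphrased from my copy; ASSEMBLY_JAR
and JAR_CMD stand in for the values the script computes, so the exact variable
names may differ in yours):

    # Standalone sketch of the check, runnable against any assembly jar.
    ASSEMBLY_JAR=assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
    JAR_CMD=jar

    # The script lists a nonexistent path inside the jar; as I understand it,
    # the intent is to catch a Java 6 'jar' tool reading a Java 7-built
    # assembly, which is the classic cause of "invalid CEN header".
    jar_error_check=$("$JAR_CMD" -tf "$ASSEMBLY_JAR" nonexistent/class/path 2>&1)
    if [[ "$jar_error_check" =~ "invalid CEN header" ]]; then
        echo "check fails: $jar_error_check"
    else
        echo "check passes"
    fi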

However, I both compiled the distribution and am running Spark with Java 1.7:
$ java -version
        java version "1.7.0_75"
        OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-1~trusty1)
        OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
on a system running Ubuntu:
$ uname -srpov
Linux 3.13.0-44-generic #73-Ubuntu SMP Tue Dec 16 00:22:43 UTC 2014 x86_64 GNU/Linux

This problem was reproduced on Arch Linux:

$ uname -srpo
Linux 3.18.5-1-ARCH x86_64 GNU/Linux
with:
$ java -version
java version "1.7.0_75"
OpenJDK Runtime Environment (IcedTea 2.5.4) (Arch Linux build 7.u75_2.5.4-1-x86_64)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)

In both of these cases, the problem is not a Java version mismatch; neither
system even has a Java 6 installation. The check in compute-classpath.sh
therefore looks like a false positive to me.
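
If it helps rule things out, the archive can also be inspected independently
of the 'jar' tool; something along these lines (just a suggestion, using
Info-ZIP's unzip because version 6.x understands zip64 central directories)
should show whether the central directory is really corrupt:

    # If unzip prints a normal listing and totals line, the central directory
    # is intact and the "invalid CEN header" message points at the tool
    # rather than at the jar itself.
    unzip -l assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar | tail -n 2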

When I comment out the relevant lines in compute-classpath.sh, the
start-{master,slaves,...}.sh scripts all run fine, and I have no problem
launching applications.
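
Rather than deleting the check outright, one option would be to run it only
when the JVM on the PATH is actually Java 6, since that seems to be the
situation it is meant to catch. This is only a sketch of the idea, not a
tested patch; the version parsing is my own, and JAR_CMD/ASSEMBLY_JAR again
stand in for what compute-classpath.sh computes:

    # Only probe for the "invalid CEN header" failure on a Java 6 (or older)
    # runtime; on Java 7+ the probe is skipped entirely.
    RUNNER="${JAVA_HOME:+$JAVA_HOME/bin/}java"
    java_release=$("$RUNNER" -version 2>&1 | awk -F '"' '/version/ {print $2}')
    if [[ "$java_release" == 1.5* || "$java_release" == 1.6* ]]; then
        jar_error_check=$("$JAR_CMD" -tf "$ASSEMBLY_JAR" nonexistent/class/path 2>&1)
        if [[ "$jar_error_check" =~ "invalid CEN header" ]]; then
            echo "Spark assembly appears to have been built with Java 7+ but is" 1>&2
            echo "being run with Java $java_release; please run Spark with Java 7+." 1>&2
            exit 1
        fi
    fi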

Could someone please offer some insight into this issue?

Thanks,
Mike
