I'm trying to use OpenJDK 7 with Spark 1.3.0 and noticed that the
compute-classpath.sh script is not adding the datanucleus jars to the
classpath, because it assumes the jar command lives at $JAVA_HOME/bin/jar,
which does not exist for OpenJDK.  Has anybody else run into this issue?
Would it be possible to use the unzip command instead?

The fact that $JAVA_HOME/bin/jar is missing also breaks the check that
ensures Spark was built with a Java version compatible with the one being
used to launch Spark.  The unzip tool of course wouldn't work for that
check as written, but there's probably another easy alternative to
$JAVA_HOME/bin/jar.

~ Jonathan Kelly
