Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/627#discussion_r12289304
  
    --- Diff: bin/compute-classpath.sh ---
    @@ -54,6 +60,14 @@ else
       else
         ASSEMBLY_JAR=`ls "$ASSEMBLY_DIR"/spark-assembly*hadoop*.jar`
       fi
     +  jar_error_check=$($JAR_CMD -tf $ASSEMBLY_JAR org/apache/spark/SparkContext 2>&1)
    +  if [[ "$jar_error_check" =~ "invalid CEN header" ]]; then
    +    echo "Loading Spark jar with '$JAR_CMD' failed. "
    +    echo "This is likely because Spark was compiled with Java 7 and run "
    +    echo "with Java 6. (see SPARK-1703). Please use Java 7 to run Spark "
    +    echo "or build Spark with Java 6."
    +    exit 1
    +  fi
    --- End diff --
    
    I realize this is already merged, but it looks like the jar error check is
    only run on the main assembly jar, not on the deps assembly jar. It might
    be good to run the check in both cases.

