Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/5085#discussion_r26985093
--- Diff: bin/spark-class ---
@@ -40,36 +40,24 @@ else
fi
fi
-# Look for the launcher. In non-release mode, add the compiled classes directly to the classpath
-# instead of looking for a jar file.
-SPARK_LAUNCHER_CP=
-if [ -f $SPARK_HOME/RELEASE ]; then
- LAUNCHER_DIR="$SPARK_HOME/lib"
- num_jars="$(ls -1 "$LAUNCHER_DIR" | grep "^spark-launcher.*\.jar$" | wc -l)"
- if [ "$num_jars" -eq "0" -a -z "$SPARK_LAUNCHER_CP" ]; then
- echo "Failed to find Spark launcher in $LAUNCHER_DIR." 1>&2
- echo "You need to build Spark before running this program." 1>&2
- exit 1
- fi
-
- LAUNCHER_JARS="$(ls -1 "$LAUNCHER_DIR" | grep "^spark-launcher.*\.jar$" || true)"
- if [ "$num_jars" -gt "1" ]; then
- echo "Found multiple Spark launcher jars in $LAUNCHER_DIR:" 1>&2
- echo "$LAUNCHER_JARS" 1>&2
- echo "Please remove all but one jar." 1>&2
- exit 1
- fi
-
- SPARK_LAUNCHER_CP="${LAUNCHER_DIR}/${LAUNCHER_JARS}"
-else
- LAUNCHER_DIR="$SPARK_HOME/launcher/target/scala-$SPARK_SCALA_VERSION"
- if [ ! -d "$LAUNCHER_DIR/classes" ]; then
- echo "Failed to find Spark launcher classes in $LAUNCHER_DIR." 1>&2
- echo "You need to build Spark before running this program." 1>&2
- exit 1
- fi
- SPARK_LAUNCHER_CP="$LAUNCHER_DIR/classes"
+# Find assembly jar
+SPARK_ASSEMBLY_JAR=
+ASSEMBLY_DIR="$SPARK_HOME/lib"
--- End diff ---
Where are you looking for the assembly under
`assembly/target/scala-$SPARK_SCALA_VERSION`?
That's needed to avoid breaking dev builds.
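As a minimal sketch of the fallback the reviewer is asking about (not the actual patch): keep the release-layout lookup under `lib/`, but fall back to the dev build output directory when no `RELEASE` marker is present. The `SPARK_HOME` and `SPARK_SCALA_VERSION` defaults below are placeholders for illustration only.

```shell
#!/bin/sh
# Hypothetical sketch: choose the assembly directory based on whether this
# is a release install or a dev checkout. Defaults are assumptions.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"
SPARK_SCALA_VERSION="${SPARK_SCALA_VERSION:-2.10}"

if [ -f "$SPARK_HOME/RELEASE" ]; then
  # Release install: jars are packaged under lib/.
  ASSEMBLY_DIR="$SPARK_HOME/lib"
else
  # Dev build: use the sbt/maven output directory instead.
  ASSEMBLY_DIR="$SPARK_HOME/assembly/target/scala-$SPARK_SCALA_VERSION"
fi

echo "$ASSEMBLY_DIR"
```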