Github user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/704#discussion_r12462341
--- Diff: bin/run-example ---
@@ -49,46 +31,31 @@ fi
if [[ -z $SPARK_EXAMPLES_JAR ]]; then
echo "Failed to find Spark examples assembly in $FWDIR/lib or $FWDIR/examples/target" >&2
- echo "You need to build Spark with sbt/sbt assembly before running this program" >&2
+ echo "You need to build Spark before running this program" >&2
exit 1
fi
+SPARK_EXAMPLES_JAR_REL=${SPARK_EXAMPLES_JAR#$FWDIR/}
-# Since the examples JAR ideally shouldn't include spark-core (that dependency should be
-# "provided"), also add our standard Spark classpath, built using compute-classpath.sh.
-CLASSPATH=`$FWDIR/bin/compute-classpath.sh`
-CLASSPATH="$SPARK_EXAMPLES_JAR:$CLASSPATH"
-
-if $cygwin; then
- CLASSPATH=`cygpath -wp $CLASSPATH`
- export SPARK_EXAMPLES_JAR=`cygpath -w $SPARK_EXAMPLES_JAR`
-fi
-
-# Find java binary
-if [ -n "${JAVA_HOME}" ]; then
- RUNNER="${JAVA_HOME}/bin/java"
-else
- if [ `command -v java` ]; then
- RUNNER="java"
- else
- echo "JAVA_HOME is not set" >&2
- exit 1
- fi
-fi
+EXAMPLE_CLASS="<example-class>"
+EXAMPLE_ARGS="[<example args>]"
+EXAMPLE_MASTER=${MASTER:-"<master>"}
-# Set JAVA_OPTS to be able to load native libraries and to set heap size
-JAVA_OPTS="$SPARK_JAVA_OPTS"
-# Load extra JAVA_OPTS from conf/java-opts, if it exists
-if [ -e "$FWDIR/conf/java-opts" ] ; then
- JAVA_OPTS="$JAVA_OPTS `cat $FWDIR/conf/java-opts`"
+if [ -n "$1" ]; then
+ EXAMPLE_CLASS="$1"
+ shift
fi
-export JAVA_OPTS
-if [ "$SPARK_PRINT_LAUNCH_COMMAND" == "1" ]; then
- echo -n "Spark Command: "
- echo "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
- echo "========================================"
- echo
+if [ -n "$1" ]; then
+ EXAMPLE_ARGS="$@"
fi
-exec "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
+echo "NOTE: This script has been replaced with ./bin/spark-submit. Please run:" >&2
+echo
+echo "./bin/spark-submit \\" >&2
--- End diff --
Yes, I completely agree. We don't want the user to have to type out this more complicated stuff with library path and all. Just:

    bin/run-example <example params>

In fact, now that all the examples are inside the spark.examples package, we can try to make it even simpler. To run SparkPi, one should be able to just say:

    ./bin/run-example SparkPi

That would be very simple!
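The short-name idea above could be sketched as a tiny shell helper: if the argument contains no dot, prepend a default package prefix; otherwise treat it as already fully qualified. This is only an illustration, the function name and the `EXAMPLES_PACKAGE` value are assumptions, not code from the actual run-example script.

```shell
#!/bin/sh
# Hypothetical package prefix for the bundled examples (an assumption
# for illustration, not necessarily Spark's real layout).
EXAMPLES_PACKAGE="org.apache.spark.examples"

# qualify_example_class NAME
# Prints NAME unchanged if it already contains a dot (fully qualified),
# otherwise prints it prefixed with $EXAMPLES_PACKAGE.
qualify_example_class() {
  case "$1" in
    *.*) echo "$1" ;;                    # already fully qualified
    *)   echo "$EXAMPLES_PACKAGE.$1" ;;  # short name: prepend the prefix
  esac
}

qualify_example_class "SparkPi"      # prints org.apache.spark.examples.SparkPi
qualify_example_class "my.pkg.Demo"  # prints my.pkg.Demo
```

With something like this, `./bin/run-example SparkPi` and `./bin/run-example some.other.pkg.Example` could both resolve to a class name to hand to the launcher.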
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---