Yes, here it is. I set it up like this:

export STANDALONE_SPARK_MASTER_HOST=`hostname`
export SPARK_MASTER_IP=$STANDALONE_SPARK_MASTER_HOST
### Let's run everything with the JVM runtime instead of Scala
export SPARK_LAUNCH_WITH_SCALA=10.2
export SPARK_LIBRARY_PATH=${SPARK_HOME}/lib
export SCALA_LIBRARY_PATH=${SPARK_HOME}/lib
export SPARK_MASTER_WEBUI_PORT=18080
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_PORT=7078
export SPARK_WORKER_WEBUI_PORT=18081
export SPARK_WORKER_DIR=/var/run/spark/work
export SPARK_LOG_DIR=/var/log/spark

if [ -n "$HADOOP_HOME" ]; then
  export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:${HADOOP_HOME}/lib/native
fi

### Comment out the two lines above and uncomment the following if
### you want to run with the Scala version that is included with the package
#export SCALA_HOME=${SCALA_HOME:-/usr/lib/spark/scala}
#export PATH=$PATH:$SCALA_HOME/bin

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/help-tp4901p5001.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
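One thing worth checking with a config like this is that the four port variables don't collide with each other, since the master, worker, and both web UIs each need a distinct port. A minimal sketch of such a check (the port values are copied from the config above; everything else here is illustrative, not part of the original setup):

```shell
#!/bin/sh
# Port assignments from the spark-env.sh snippet above.
SPARK_MASTER_PORT=7077
SPARK_WORKER_PORT=7078
SPARK_MASTER_WEBUI_PORT=18080
SPARK_WORKER_WEBUI_PORT=18081

# Print each port on its own line, then look for duplicates.
dups=$(printf '%s\n' "$SPARK_MASTER_PORT" "$SPARK_WORKER_PORT" \
                     "$SPARK_MASTER_WEBUI_PORT" "$SPARK_WORKER_WEBUI_PORT" \
       | sort | uniq -d)

if [ -n "$dups" ]; then
  echo "duplicate ports: $dups"
else
  echo "ports OK"
fi
```

With the values above this prints "ports OK"; if two variables were accidentally set to the same port, it would name the duplicate instead.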