Revert change to spark-class

Also adds a comment about how to configure the environment for FaultToleranceTest.

Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/749233b8
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/749233b8
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/749233b8

Branch: refs/heads/master
Commit: 749233b869da188920d8d72af7b82e586993d17c
Parents: 1cd57cd
Author: Aaron Davidson <aa...@databricks.com>
Authored: Tue Oct 8 11:41:52 2013 -0700
Committer: Aaron Davidson <aa...@databricks.com>
Committed: Tue Oct 8 11:41:52 2013 -0700

----------------------------------------------------------------------
 .../scala/org/apache/spark/deploy/FaultToleranceTest.scala  | 9 +++++++--
 spark-class                                                 | 2 +-
 2 files changed, 8 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/749233b8/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala b/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
index 8bac62b..668032a 100644
--- a/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
@@ -36,11 +36,16 @@ import org.apache.spark.deploy.master.RecoveryState
 
 /**
 * This suite tests the fault tolerance of the Spark standalone scheduler, mainly the Master.
+ * In order to mimic a real distributed cluster more closely, Docker is used.
  * Execute using
  * ./spark-class org.apache.spark.deploy.FaultToleranceTest
  *
- * In order to mimic a real distributed cluster more closely, Docker is used.
- * Unfortunately, this dependency means that the suite cannot be run automatically without a
+ * Make sure that the environment includes the following properties in SPARK_DAEMON_JAVA_OPTS:
+ *   - spark.deploy.recoveryMode=ZOOKEEPER
+ *   - spark.deploy.zookeeper.url=172.17.42.1:2181
+ * Note that 172.17.42.1 is the default Docker IP for the host and 2181 is the default ZK port.
+ *
+ * Unfortunately, due to the Docker dependency, this suite cannot be run automatically without a
 * working installation of Docker. In addition to having Docker, the following are assumed:
 *   - Docker can run without sudo (see http://docs.docker.io/en/latest/use/basics/)
 *   - The docker images tagged spark-test-master and spark-test-worker are built from the

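For context, here is a minimal sketch (not part of this commit) of how the suite might be launched once the environment is configured as the updated comment describes; the ZooKeeper address simply reuses the defaults noted above and may differ on your machine:

  export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
    -Dspark.deploy.zookeeper.url=172.17.42.1:2181"
  ./spark-class org.apache.spark.deploy.FaultToleranceTest
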
http://git-wip-us.apache.org/repos/asf/incubator-spark/blob/749233b8/spark-class
----------------------------------------------------------------------
diff --git a/spark-class b/spark-class
index f678d5e..e111ef6 100755
--- a/spark-class
+++ b/spark-class
@@ -41,7 +41,7 @@ if [ "$1" = "org.apache.spark.deploy.master.Master" -o "$1" = "org.apache.spark.
   SPARK_MEM=${SPARK_DAEMON_MEMORY:-512m}
  SPARK_DAEMON_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS -Dspark.akka.logLifecycleEvents=true"
   # Do not overwrite SPARK_JAVA_OPTS environment variable in this script
-  OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_DAEMON_JAVA_OPTS"   # Empty by default
+  OUR_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS"   # Empty by default
 else
   OUR_JAVA_OPTS="$SPARK_JAVA_OPTS"
 fi

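As a usage note (an illustration, not part of the commit): with this revert, options set in SPARK_JAVA_OPTS no longer leak into daemon JVMs, so the two variables can be set independently, e.g.:

  # Master and Worker read only SPARK_DAEMON_JAVA_OPTS;
  # SPARK_JAVA_OPTS applies to the other classes launched via spark-class.
  export SPARK_JAVA_OPTS="-Dspark.local.dir=/tmp/spark"
  export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER"
  ./spark-class org.apache.spark.deploy.master.Master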