Repository: spark
Updated Branches:
  refs/heads/master f7beae6da -> 231f97329


[SPARK-17318][TESTS] Fix ReplSuite replicating blocks of object with class defined in repl

## What changes were proposed in this pull request?

This test has been failing frequently in recent builds: 
http://spark-tests.appspot.com/tests/org.apache.spark.repl.ReplSuite/replicating%20blocks%20of%20object%20with%20class%20defined%20in%20repl

This PR changes the persist level to `MEMORY_AND_DISK_2` so that cached blocks spill to disk under memory pressure instead of being evicted and lost.
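
For illustration only, here is a minimal standalone sketch of what the test snippet exercises (not part of this commit; the object name, the top-level `Foo`, and the `local-cluster` master setting are assumptions chosen so the `_2` replication factor can actually be satisfied by two executors):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Top-level case class standing in for the class the real test defines in the REPL.
case class Foo(i: Int)

object PersistLevelSketch {
  def main(args: Array[String]): Unit = {
    // local-cluster[2,1,1024]: test-only master with 2 executors (1 core, 1024 MB each),
    // so the two block replicas can live on different executors.
    val conf = new SparkConf()
      .setAppName("persist-level-sketch")
      .setMaster("local-cluster[2,1,1024]")
    val sc = new SparkContext(conf)

    // MEMORY_AND_DISK_2 keeps two replicas and spills to disk under memory pressure,
    // whereas MEMORY_ONLY_2 simply drops blocks that no longer fit in memory.
    val ret = sc.parallelize((1 to 100).map(Foo), 10)
      .persist(StorageLevel.MEMORY_AND_DISK_2)
    ret.count()  // materialize the cached, replicated blocks

    // Same expression as in the test: total cached blocks for ret across executors.
    val blocks = sc.getExecutorStorageStatus.map(s => s.rddBlocksById(ret.id).size).sum
    println(s"cached blocks for ret: $blocks")

    sc.stop()
  }
}
```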

## How was this patch tested?

Jenkins unit tests.

Author: Shixiong Zhu <shixi...@databricks.com>

Closes #14884 from zsxwing/SPARK-17318.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/231f9732
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/231f9732
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/231f9732

Branch: refs/heads/master
Commit: 231f973295129dca976f2e4a8222a63318d4aafe
Parents: f7beae6
Author: Shixiong Zhu <shixi...@databricks.com>
Authored: Tue Aug 30 20:04:52 2016 -0700
Committer: Shixiong Zhu <shixi...@databricks.com>
Committed: Tue Aug 30 20:04:52 2016 -0700

----------------------------------------------------------------------
 .../src/test/scala/org/apache/spark/repl/ReplSuite.scala           | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/231f9732/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
----------------------------------------------------------------------
diff --git a/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala b/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
index 06b09f3..f1284b1 100644
--- a/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
+++ b/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
@@ -401,7 +401,7 @@ class ReplSuite extends SparkFunSuite {
       """
         |import org.apache.spark.storage.StorageLevel._
         |case class Foo(i: Int)
-        |val ret = sc.parallelize((1 to 100).map(Foo), 10).persist(MEMORY_ONLY_2)
+        |val ret = sc.parallelize((1 to 100).map(Foo), 10).persist(MEMORY_AND_DISK_2)
         |ret.count()
         |sc.getExecutorStorageStatus.map(s => s.rddBlocksById(ret.id).size).sum
       """.stripMargin)

