GitHub user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/incubator-spark/pull/612#discussion_r9925460
  
    --- Diff: core/src/test/scala/org/apache/spark/util/collection/ExternalAppendOnlyMapSuite.scala ---
    @@ -83,6 +83,28 @@ class ExternalAppendOnlyMapSuite extends FunSuite with BeforeAndAfter with Local
           (3, Set[Int](30))))
       }
     
    +  test("insert with collision on hashCode Int.MaxValue") {
    +    val conf = new SparkConf(false)
    +    sc = new SparkContext("local", "test", conf)
    +
    --- End diff ---
    
    Looks like this test currently does not induce spilling. I would set up the 
memory constraints as follows:
    
    val conf = new SparkConf()
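    // Use a tiny shuffle memory fraction so even a modest number of inserts spills to disk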
    conf.set("spark.shuffle.memoryFraction", "0.001")
    sc = new SparkContext("local-cluster[1,1,512]", "test", conf)
    
    (Note that in this test it is crucial that SparkConf be constructed with no
    arguments; this is a workaround for the hacky way we currently pass
    environment variables into the tests.)
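
    For reference, here is a rough sketch of how the full test body might then
    look. This is purely illustrative: it assumes the suite's existing
    createCombiner / mergeValue / mergeCombiners helpers and imports (e.g.
    ArrayBuffer), and the insert count is just a guess at something large
    enough to overflow the tiny memory fraction.

    test("insert with collision on hashCode Int.MaxValue") {
      val conf = new SparkConf()
      conf.set("spark.shuffle.memoryFraction", "0.001")
      sc = new SparkContext("local-cluster[1,1,512]", "test", conf)

      // Sketch only: the helper names and insert count below are illustrative.
      val map = new ExternalAppendOnlyMap[Int, Int, ArrayBuffer[Int]](
        createCombiner, mergeValue, mergeCombiners)

      // Insert enough entries to exceed the tiny memory fraction and force a
      // spill, plus a key whose hashCode is Int.MaxValue.
      (1 to 100000).foreach { i => map.insert(i, i) }
      map.insert(Int.MaxValue, Int.MaxValue)

      // Iterating should read back the spilled data without errors.
      val it = map.iterator
      while (it.hasNext) {
        it.next()
      }
    }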

