Github user tedyu commented on the pull request:

    https://github.com/apache/spark/pull/9384#issuecomment-152868318
  
    bq. what does this have to do with memory
    The answer is in `SaveStageAndTaskInfo`. See the following:
    ```scala
      /**
       * A simple listener that saves all task infos and task metrics.
       */
      private class SaveStageAndTaskInfo extends SparkListener {
        val stageInfos = mutable.Map[StageInfo, Seq[(TaskInfo, TaskMetrics)]]()
        var taskInfoMetrics = mutable.Buffer[(TaskInfo, TaskMetrics)]()
    ```
    bq. Why is it valid to reduce the number of slices in this test?
    
    Let me dig into the first check-in of this test to see why 64 slices were 
used.
    As recent test runs show, the test's invariant still holds with 16 slices.
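    To illustrate the memory angle, here is a minimal sketch (all names here are illustrative, not Spark's actual code): since the listener appends one (info, metrics) pair per completed task, the number of retained pairs scales linearly with the number of slices, so cutting 64 slices down to 16 shrinks the listener's footprint proportionally.

    ```scala
    import scala.collection.mutable

    // Hypothetical sketch: each completed task contributes one (info, metrics)
    // pair, so the buffer the listener retains grows linearly with the number
    // of slices in the stage.
    object SliceMemorySketch {
      private val taskInfoMetrics = mutable.Buffer[(String, String)]()

      // Simulate one stage with `numSlices` tasks; return how many pairs are retained.
      def runStage(numSlices: Int): Int = {
        taskInfoMetrics.clear()
        for (i <- 1 to numSlices)
          taskInfoMetrics += ((s"task-$i-info", s"task-$i-metrics"))
        taskInfoMetrics.size
      }

      def main(args: Array[String]): Unit = {
        println(runStage(64)) // 64 pairs retained with 64 slices
        println(runStage(16)) // 16 pairs retained with 16 slices
      }
    }
    ```

    The invariant the test checks does not depend on the slice count itself, which is why the reduction is safe while still exercising the same code path.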

