Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6648#discussion_r32022015
  
    --- Diff: core/src/test/scala/org/apache/spark/shuffle/ShuffleSuite.scala ---
    @@ -315,8 +320,115 @@ abstract class ShuffleSuite extends SparkFunSuite with Matchers with LocalSparkC
         assert(metrics.bytesWritten === metrics.byresRead)
         assert(metrics.bytesWritten > 0)
       }
    +
    +  def multipleAttemptConfs: Seq[(String, SparkConf)] = Seq("basic" -> conf)
    +
    +  multipleAttemptConfs.foreach { case (name, multipleAttemptConf) =>
    +    test("multiple attempts for one task: conf = " + name) {
    +      sc = new SparkContext("local", "test", multipleAttemptConf)
    --- End diff ---
    
    This is the most important new test.  It makes sure that multiple shuffle writers for the same task can execute concurrently, and that the output of each attempt can be read back.
    
    `multipleAttemptConfs` is just there so we can exercise different code paths with different confs, in particular the path taken by the `Unsafe` shuffle.
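    
    Roughly, the pattern looks like this (a sketch only -- the class name, the plain ScalaTest `FunSuite` base, and the assertion in the body are placeholders rather than the real test code): each `(name, conf)` pair registers its own test case, so a subclass can override `multipleAttemptConfs` to run the same body against another shuffle implementation.
    
    ```scala
    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.FunSuite
    
    // Sketch of the parameterized-conf pattern from the diff above.
    class MultipleConfExampleSuite extends FunSuite {
    
      // A subclass exercising a different shuffle implementation can override
      // this to add more (name, conf) pairs; each pair becomes its own test.
      def multipleAttemptConfs: Seq[(String, SparkConf)] =
        Seq("basic" -> new SparkConf(loadDefaults = false))
    
      multipleAttemptConfs.foreach { case (name, multipleAttemptConf) =>
        test("multiple attempts for one task: conf = " + name) {
          val sc = new SparkContext("local", "test", multipleAttemptConf)
          try {
            // Placeholder body: the real test drives two concurrent shuffle
            // writers for the same task and reads back the output of each.
            assert(sc.parallelize(1 to 10).map(x => (x % 2, x)).groupByKey().count() === 2)
          } finally {
            sc.stop()
          }
        }
      }
    }
    ```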

