Github user rdblue commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11242#discussion_r59062676
  
    --- Diff: core/src/main/scala/org/apache/spark/rdd/UnionRDD.scala ---
    @@ -62,8 +62,14 @@ class UnionRDD[T: ClassTag](
         var rdds: Seq[RDD[T]])
       extends RDD[T](sc, Nil) {  // Nil since we implement getDependencies
     
    +  // visible for testing
    +  private[spark] val isPartitionEvalParallel: Boolean =
    +    rdds.length > conf.getInt("spark.rdd.parallelListingThreshold", 10)
    --- End diff ---
    
    Concurrency bugs are hard to trigger and I disagree that other Spark usage 
would have exposed them already. I think this is the main point where we 
disagree.
    
    Assuming that the user has tracked the problem down, deploying the fix may 
be the hardest part. Many customers can't simply make changes and push a new 
version without sending code through QA or some other validation. A runtime 
config change is much easier.
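    For reference, the behavior being discussed is just a length check against a configurable threshold. The sketch below mimics that logic in a standalone form: `SimpleConf` is a hypothetical stand-in for Spark's `SparkConf` (the real class isn't used here), but the key name and default of 10 match the diff, and the last call illustrates how a runtime config change alone flips the behavior.
    
    ```scala
    object ThresholdSketch {
      // Hypothetical stand-in for SparkConf: just a string-keyed settings map.
      final case class SimpleConf(settings: Map[String, String]) {
        def getInt(key: String, default: Int): Int =
          settings.get(key).map(_.toInt).getOrElse(default)
      }
    
      // Mirrors the gate in the diff: go parallel only when the number of
      // RDDs exceeds the configured threshold (default 10).
      def isPartitionEvalParallel(numRdds: Int, conf: SimpleConf): Boolean =
        numRdds > conf.getInt("spark.rdd.parallelListingThreshold", 10)
    
      def main(args: Array[String]): Unit = {
        val defaultConf = SimpleConf(Map.empty)
        println(isPartitionEvalParallel(5, defaultConf))   // below default threshold
        println(isPartitionEvalParallel(11, defaultConf))  // above default threshold
        // A runtime config change disables the parallel path without a redeploy:
        val tuned = SimpleConf(Map("spark.rdd.parallelListingThreshold" -> "100"))
        println(isPartitionEvalParallel(11, tuned))
      }
    }
    ```
    
    Raising the threshold at submit time (e.g. via `--conf`) is the kind of no-redeploy mitigation the comment argues for.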


