GitHub user jiangxb1987 opened a pull request:

    https://github.com/apache/spark/pull/20091

    [SPARK-22465] Update the number of partitions of default partitioner when 
defaultParallelism is set

    ## What changes were proposed in this pull request?
    
    #20002 proposed a way to safely check the default partitioner; however, if 
`spark.default.parallelism` is set, the defaultParallelism could still be 
smaller than the proper number of partitions for the upstream RDDs. This PR 
extends that approach to handle the case where `spark.default.parallelism` 
is set.
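
    The decision being extended can be sketched roughly as follows. This is an 
illustrative standalone function, not Spark's actual `Partitioner.defaultPartitioner` 
code; the names `choosePartitions`, `existingPartitionerParts`, and the 
order-of-magnitude threshold are assumptions made for the sketch:

    ```scala
    // Illustrative sketch (not Spark internals): reuse an existing upstream
    // partitioner only when its partition count is not dramatically smaller
    // than the largest upstream partition count; otherwise size a fresh
    // partitioner from defaultParallelism when it is set, falling back to
    // the max upstream partition count.
    def choosePartitions(upstreamCounts: Seq[Int],
                         existingPartitionerParts: Option[Int],
                         defaultParallelism: Option[Int]): Int = {
      val maxUpstream = upstreamCounts.max
      existingPartitionerParts match {
        // treat the existing partitioner as eligible if it is within an
        // order of magnitude of the largest upstream partition count
        case Some(n) if n >= maxUpstream / 10 => n
        case _ => defaultParallelism.getOrElse(maxUpstream)
      }
    }
    ```

    For example, an existing 200-partition partitioner over a 1000-partition 
upstream RDD would be reused, while a 5-partition one would be rejected in 
favor of the configured parallelism.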
    
    ## How was this patch tested?
    
    Add corresponding test cases in `PairRDDFunctionsSuite` and 
`PartitioningSuite`.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jiangxb1987/spark partitioner

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20091.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20091
    
----
commit 4751463530be31b6cc69b1f5b13d5b409149bdab
Author: Xingbo Jiang <xingbo.jiang@...>
Date:   2017-12-27T14:10:41Z

    default partitioner when defaultParallelism is set

----


---
