Github user jiangxb1987 commented on the issue:

    https://github.com/apache/spark/pull/20091
  
    @mridulm Actually you make a good point that we should be extremely careful 
when making changes related to an exposed interface, so I'll narrow down the 
scope of this PR to:
    ```
    If the safety check fails and `spark.default.parallelism` is set, and the 
value of `defaultParallelism` is smaller than the number of partitions of the 
existing partitioner, then we should still use the existing partitioner.
    ```
    Does this make more sense?
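
    To make the narrowed rule concrete, here is a rough sketch of the condition in Scala. The helper name `shouldKeepExistingPartitioner` and the parameters `existingPartitioner` / `safetyCheckPasses` are illustrative stand-ins, not the actual `Partitioner.defaultPartitioner` code in this PR:
    ```
    import org.apache.spark.{Partitioner, SparkContext}

    // Sketch of the proposed rule, under the assumption that the caller has
    // already found the existing partitioner and evaluated the safety check.
    def shouldKeepExistingPartitioner(
        sc: SparkContext,
        existingPartitioner: Partitioner,
        safetyCheckPasses: Boolean): Boolean = {
      val defaultParallelismIsSet = sc.getConf.contains("spark.default.parallelism")
      // If the safety check fails but spark.default.parallelism is set and is
      // smaller than the existing partitioner's partition count, still reuse
      // the existing partitioner instead of repartitioning to fewer partitions.
      !safetyCheckPasses && defaultParallelismIsSet &&
        sc.defaultParallelism < existingPartitioner.numPartitions
    }
    ```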

