[ https://issues.apache.org/jira/browse/SPARK-43586?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17725185#comment-17725185 ]
Snoot.io commented on SPARK-43586:
----------------------------------
User 'LuciferYang' has created a pull request for this issue:
https://github.com/apache/spark/pull/41230
> There will be many invalid tasks when `Range.numSlices` > `Range.numElements`
> -----------------------------------------------------------------------------
>
> Key: SPARK-43586
> URL: https://issues.apache.org/jira/browse/SPARK-43586
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.5.0
> Reporter: Yang Jie
> Priority: Minor
> Attachments: image-2023-05-19-13-01-19-589.png
>
>
> For example, start a Spark shell with `--master "local[100]"`, then run
> `spark.range(10).map(_ + 1).reduce(_ + _)`. The job will contain 100 tasks,
> even though the Range holds only 10 elements:
> !image-2023-05-19-13-01-19-589.png|width=733,height=203!
>
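For reference, a minimal spark-shell sketch (assuming a session started with `--master "local[100]"`, as in the description above) that reproduces the behaviour and inspects the partition count behind the extra tasks; the name `ds` is just illustrative:

```scala
// Assumed session: spark-shell launched with --master "local[100]".
// spark.range(10) takes its numSlices from the default parallelism (100 here),
// so the physical plan produces one partition -- and therefore one task -- per
// slice, most of them empty.
val ds = spark.range(10)              // Range with numElements = 10
println(ds.rdd.getNumPartitions)      // expected to print 100 under local[100]
println(ds.map(_ + 1).reduce(_ + _))  // 55, computed across 100 tasks, ~90 of them empty
```

The linked pull request addresses how `Range.numSlices` relates to `Range.numElements` in this case; see https://github.com/apache/spark/pull/41230 for the actual change.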