[
https://issues.apache.org/jira/browse/HIVE-16799?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Xuefu Zhang updated HIVE-16799:
-------------------------------
Resolution: Fixed
Fix Version/s: 3.0.0
Release Note: Document the new configuration.
Status: Resolved (was: Patch Available)
Committed to master. Thanks to Rui for the review.
> Control the max number of tasks for a stage in a Spark job
> ----------------------------------------------------------
>
> Key: HIVE-16799
> URL: https://issues.apache.org/jira/browse/HIVE-16799
> Project: Hive
> Issue Type: Improvement
> Reporter: Xuefu Zhang
> Assignee: Xuefu Zhang
> Labels: TODOC3.0
> Fix For: 3.0.0
>
> Attachments: HIVE-16799.1.patch, HIVE-16799.patch
>
>
> HIVE-16552 gives admins an option to control the maximum number of tasks a
> Spark job may have. However, this may not be sufficient, as it tends to
> penalize jobs that have many stages while favoring jobs that have fewer
> stages. Ideally, we should also limit the number of tasks in a stage, which
> is closer to capping the maximum number of mappers or reducers in an MR job.
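For illustration, the two limits could be set at the session level roughly as follows. The property name hive.spark.stage.max.tasks is an assumption based on this issue's intent (hive.spark.job.max.tasks comes from HIVE-16552); verify both names and defaults against the committed patch before relying on them.

    -- hypothetical session-level settings; -1 conventionally means no limit
    SET hive.spark.job.max.tasks=10000;    -- per-job cap (HIVE-16552)
    SET hive.spark.stage.max.tasks=2000;   -- per-stage cap (this issue)

When a stage would exceed the per-stage cap, the expectation is that the job fails fast rather than overwhelming the cluster, mirroring how mapper/reducer limits behave in MR.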
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)