[ 
https://issues.apache.org/jira/browse/HIVE-8029?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Brock Noland updated HIVE-8029:
-------------------------------
       Resolution: Fixed
    Fix Version/s: spark-branch
           Status: Resolved  (was: Patch Available)

Thank you, Chengxiang! I have committed this to the spark branch. Note that in the 
commit message I mistakenly credited Rui, since I had just been reviewing HIVE-8017. 
I apologize for the mistake, but since the JIRA is assigned to you, you will still 
receive the appropriate credit for the patch.

> Remove reducers number configure in SparkTask [Spark Branch]
> ------------------------------------------------------------
>
>                 Key: HIVE-8029
>                 URL: https://issues.apache.org/jira/browse/HIVE-8029
>             Project: Hive
>          Issue Type: Improvement
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M4
>             Fix For: spark-branch
>
>         Attachments: HIVE-8029.1-spark.patch
>
>
> We do not need duplicated logic to configure the number of reducers in SparkTask, 
> as SetSparkReduceParallelism always sets the number of reducers during the compile phase.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
