[ https://issues.apache.org/jira/browse/HIVE-8029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14127055#comment-14127055 ]
Brock Noland commented on HIVE-8029:
------------------------------------

I don't think nullscan is related, as it has been failing in other runs as well. I created HIVE-8032 to fix that.

> Remove reducer-number configuration in SparkTask [Spark Branch]
> ---------------------------------------------------------------
>
>                 Key: HIVE-8029
>                 URL: https://issues.apache.org/jira/browse/HIVE-8029
>             Project: Hive
>          Issue Type: Improvement
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M4
>         Attachments: HIVE-8029.1-spark.patch
>
>
> We do not need duplicated logic to configure the number of reducers in SparkTask, since SetSparkReduceParallelism always sets the reducer count during the compiler phase.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
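The pattern behind the issue description can be sketched in miniature. This is a hypothetical illustration, not actual Hive code: the class, method names, and config handling below are invented for the example; only the idea (set the reducer count once in the compiler phase, and have the task read that value instead of recomputing it) comes from the issue.

```java
import java.util.HashMap;
import java.util.Map;

public class ReducerConfigSketch {
    // Illustrative config key; Hadoop historically used "mapred.reduce.tasks".
    static final String REDUCERS_KEY = "mapred.reduce.tasks";

    // Compiler phase (analogous to SetSparkReduceParallelism):
    // decides and records the reducer count exactly once.
    static void setReduceParallelism(Map<String, String> conf, int reducers) {
        conf.put(REDUCERS_KEY, Integer.toString(reducers));
    }

    // Execution phase (analogous to SparkTask): trusts the value set at
    // compile time rather than re-deriving it with duplicated logic.
    static int reducersForTask(Map<String, String> conf) {
        return Integer.parseInt(conf.get(REDUCERS_KEY));
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        setReduceParallelism(conf, 8);
        System.out.println(reducersForTask(conf)); // prints 8
    }
}
```

Keeping a single writer for the setting means the task cannot silently disagree with the plan the compiler produced, which is the duplication the patch removes.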