[
https://issues.apache.org/jira/browse/HIVE-13066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15158323#comment-15158323
]
Xuefu Zhang commented on HIVE-13066:
------------------------------------
Hi [~lirui], thanks for working on this. The change seems easy to understand.
However, the original comment above the line you changed seems a little
concerning. Do you see any problem with the attempt number being global to
the Spark context? Were you able to reproduce the issue? If all is good, we
might want to update the comment to reflect what actually happens. Thanks.
> Hive on Spark gives incorrect results when speculation is on
> ------------------------------------------------------------
>
> Key: HIVE-13066
> URL: https://issues.apache.org/jira/browse/HIVE-13066
> Project: Hive
> Issue Type: Bug
> Components: Spark
> Reporter: Rui Li
> Assignee: Rui Li
> Attachments: HIVE-13066.1.patch
>
>
> The issue was reported by users. One possible reason is that we always
> append 0 as the attempt ID for each task, so Hive is unable to distinguish
> between speculative tasks and the original ones.
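For illustration only, a minimal sketch of why a hardcoded attempt ID is a problem when speculation is on. The function and names below are hypothetical, not Hive's actual code; the point is just that two attempts of the same task become indistinguishable when the attempt number is always 0:

```python
# Hypothetical sketch: how task output names behave with and without
# a real attempt ID. Spark speculation may run an original attempt
# and a speculative copy of the same task concurrently.

def output_name(task_id: int, attempt: int) -> str:
    # Illustrative naming scheme, not Hive's actual format.
    return f"task_{task_id:06d}_{attempt}"

# Buggy behavior described in the issue: the attempt ID is always 0,
# so the original attempt and its speculative copy collide on one name
# and their outputs cannot be told apart (risking duplicated results).
buggy = {output_name(42, 0), output_name(42, 0)}
assert len(buggy) == 1  # both attempts map to the same output name

# With real attempt numbers, the two attempts stay distinguishable,
# and a commit step can keep exactly one of them.
fixed = {output_name(42, 0), output_name(42, 1)}
assert len(fixed) == 2  # distinct names per attempt
```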
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)