[
https://issues.apache.org/jira/browse/HIVE-12650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15128361#comment-15128361
]
Xuefu Zhang commented on HIVE-12650:
------------------------------------
[~lirui], thanks for your analysis. Yeah, I saw that the actual elapsed time is
very short, while the message says it timed out after 150s, which is very confusing.
[~vanzin], could you please explain a bit how the two timeouts are used?
Also, what timeout value does spark-submit use if the application cannot be
submitted?
[[email protected]], could you please reproduce the problem and provide
more info, such as hive.log?
Thanks, folks!
> Increase default value of hive.spark.client.server.connect.timeout to exceed
> spark.yarn.am.waitTime
> ----------------------------------------------------------------------------------------------------
>
> Key: HIVE-12650
> URL: https://issues.apache.org/jira/browse/HIVE-12650
> Project: Hive
> Issue Type: Bug
> Affects Versions: 1.1.1, 1.2.1
> Reporter: JoneZhang
> Assignee: Xuefu Zhang
>
> I think hive.spark.client.server.connect.timeout should be set greater than
> spark.yarn.am.waitTime. The default value for
> spark.yarn.am.waitTime is 100s, and the default value for
> hive.spark.client.server.connect.timeout is 90s, so the Hive client can time out
> while the Spark application master is still within its wait period. We can
> increase it to a larger value such as 120s.
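
For reference, a minimal sketch of the change the description proposes, assuming
the value is raised in hive-site.xml (120000ms is just the 120s figure suggested
above, written in the same millisecond form as the 90000ms default; treat it as
illustrative, not a recommended production value):

    <!-- sketch only: keep the Hive-side timeout larger than spark.yarn.am.waitTime (100s) -->
    <property>
      <name>hive.spark.client.server.connect.timeout</name>
      <value>120000ms</value>  <!-- 120s; the default is 90s -->
    </property>

    <!-- spark.yarn.am.waitTime itself is a Spark setting (default 100s); if it is
         raised, e.g. in spark-defaults.conf, the Hive timeout above should be
         raised along with it so it stays the larger of the two. -->

The same property should also be settable per session from the Hive CLI with a
"set" command before the first Spark query, though hive-site.xml is the usual place
for a cluster-wide default.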