[ https://issues.apache.org/jira/browse/HIVE-12650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15129761#comment-15129761 ]

Xuefu Zhang commented on HIVE-12650:
------------------------------------

I see. I think that's what [[email protected]] experienced as well. Killing 
spark-submit doesn't cancel the AM request. When the AM is finally launched, it 
tries to connect back to Hive and is refused, so it quickly errors out. 
(However, on the Spark side, the message saying "spark context initialization 
times out in xxx seconds" is very confusing.) I'm not sure if we can do 
anything here.

Nevertheless, it seems spark.yarn.am.waitTime isn't relevant after all.

> Increase default value of hive.spark.client.server.connect.timeout to exceed 
> spark.yarn.am.waitTime
> ----------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-12650
>                 URL: https://issues.apache.org/jira/browse/HIVE-12650
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 1.1.1, 1.2.1
>            Reporter: JoneZhang
>            Assignee: Xuefu Zhang
>
> I think hive.spark.client.server.connect.timeout should be set greater than 
> spark.yarn.am.waitTime. The default value of spark.yarn.am.waitTime is 100s, 
> while the default value of hive.spark.client.server.connect.timeout is only 
> 90s, so Hive can give up waiting before the Spark AM has even had a chance to 
> connect back. We can increase it to a larger value such as 120s.
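For reference, a minimal hive-site.xml sketch of the suggested change 
(assuming the property accepts a time value with a unit suffix, as in recent 
Hive releases; adjust the value and units to what your version expects):

    <property>
      <name>hive.spark.client.server.connect.timeout</name>
      <!-- Raise above spark.yarn.am.waitTime (100s by default) so Hive keeps
           waiting long enough for the YARN AM to connect back.
           120000ms = 120s; the "ms" suffix is an assumption here. -->
      <value>120000ms</value>
    </property>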



