[ 
https://issues.apache.org/jira/browse/SPARK-6135?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-6135:
-----------------------------
    Comment: was deleted

(was: Sorry I didn't make it clear. 
SPARK_LOCAL_HOSTNAME can solve this problem, so I think it needs to be 
documented in the official documentation.
As for my problem: in some companies people may use IP addresses more 
than hostnames, so a machine's hostname (especially for VMs) may look 
like "sjs_1_2". I call this hostname invalid because it cannot be used 
to connect remotely (e.g. "ping sjs_1_2" fails). But Spark reads this 
hostname and uses it, so we get an error like "Failed to connect to 
driver at sjs_1_2:xxxx". So I think a hostname check is needed; a valid 
hostname may end with "org", "com", "edu", etc. )
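The workaround named above goes in spark-env.sh; a minimal sketch, where the IP address is a placeholder for the client machine's real address:

```shell
# spark-env.sh -- force Spark to advertise an address the cluster can
# reach, instead of an unresolvable hostname like "sjs_1_2".
# (192.168.1.2 is a placeholder; use the client machine's actual IP.)
export SPARK_LOCAL_HOSTNAME=192.168.1.2
```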

> No check for illegal hostnames when running Spark on YARN.
> ----------------------------------------------------------
>
>                 Key: SPARK-6135
>                 URL: https://issues.apache.org/jira/browse/SPARK-6135
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.2.1
>         Environment: spark on yarn cluster. 
>            Reporter: Xia Hu
>              Labels: spark-submit, yarn-client
>         Attachments: check_hostname.patch
>
>
> I submit Spark applications to a YARN cluster from a Spark client machine, 
> and that is how I found this problem. In yarn-client mode the driver runs 
> on the client, so the YARN cluster needs to connect back to the driver. But 
> Spark uses InetAddress to read the client machine's hostname without 
> checking whether that hostname is legal or usable. In my case, some client 
> machines have hostnames like "sjs_1_2", and the application fails because 
> it cannot connect to the driver at "sjs_1_2".
> I suggest there should be a check for whether the hostname is legal, and 
> if not, the IP address should be used instead. 
> For this problem I found that the env variable "SPARK_LOCAL_HOSTNAME" 
> can be used: if I set "SPARK_LOCAL_HOSTNAME" to the IP address in 
> spark-env.sh, the problem is solved. But it seems this setting isn't 
> mentioned in any documentation; I only found it while reading the code. 
> Still, I think a check for illegal hostnames is needed.
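The check the report proposes could be sketched as below. This is a hypothetical helper, not Spark's actual code; the regex follows RFC 1123 hostname rules, under which underscores (as in "sjs_1_2") are not allowed:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.regex.Pattern;

public class HostnameCheck {
    // RFC 1123 hostname: dot-separated labels of letters, digits and
    // hyphens; a label may not start or end with a hyphen, and
    // underscores are not permitted.
    private static final Pattern VALID_HOST = Pattern.compile(
        "^[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?"
        + "(\\.[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?)*$");

    public static boolean isValidHostname(String name) {
        return VALID_HOST.matcher(name).matches();
    }

    // Hypothetical fallback, as the report suggests: advertise the
    // hostname only when it is legal, otherwise use the IP address.
    public static String localHostOrIp() throws UnknownHostException {
        InetAddress addr = InetAddress.getLocalHost();
        String name = addr.getHostName();
        return isValidHostname(name) ? name : addr.getHostAddress();
    }

    public static void main(String[] args) throws UnknownHostException {
        System.out.println(isValidHostname("sjs_1_2"));   // false: underscore
        System.out.println(localHostOrIp());
    }
}
```

With such a check in place, a hostname like "sjs_1_2" would fail validation and the driver would advertise its IP instead, avoiding the "Failed to connect to driver" error.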



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
