Hi, everyone! I deployed Spark on a YARN cluster. I exported
SPARK_MASTER_IP with an IP address, and made sure that all the Spark
configuration files under SPARK_HOME/conf/* and all the Hadoop
configuration files under HADOOP_HOME/etc/* use IP values. I can
successfully submit a Spark job with the bin/spark-submit shell script,
but the tracking URL comes back in hostname format, like
"tracking URL: http://hadoop-01:8088/proxy/application_1447305035557_0003/".
My question is: how can I get the tracking URL in IP format instead, like
"tracking URL:
http://192.168.0.100:8088/proxy/application_1447305035557_0003/"?

P.S.
I checked the Linux configuration files and found that /etc/hosts contains
the entry "192.168.0.100 hadoop-01", but if I remove this entry, submitting
a Spark job fails.
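For reference, here is a yarn-site.xml snippet I suspect may be related. Binding the ResourceManager web UI address explicitly to the IP is my guess, not something I have verified to change the tracking URL:

```xml
<!-- yarn-site.xml (sketch, assumption on my part):
     the tracking URL on port 8088 appears to come from the
     ResourceManager web application address, so setting it
     to the IP explicitly might make the URL use the IP. -->
<property>
  <name>yarn.resourcemanager.webapp.address</name>
  <value>192.168.0.100:8088</value>
</property>
```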



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/how-to-get-the-tracking-URL-with-ip-address-instead-of-hostname-in-yarn-cluster-model-tp25387.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
