Hi, I have successfully created a Spark cluster in OpenStack. Now I want to create a Spark cluster that spans different OpenStack sites.
In OpenStack, an instance only knows its private IP (e.g. 10.x.y.z); it is not aware of the public (floating) IP assigned to it. I tried `export SPARK_MASTER_IP=xxx.xxx.xxx.xxx` with the public IP, but the log shows Spark cannot bind to it. As a result, port 7077 is listening on the private IP address only, and refuses connections from other networks:

    $ netstat -tupln
    (Not all processes could be identified, non-owned process info
     will not be shown, you would have to be root to see it all.)
    Active Internet connections (only servers)
    Proto Recv-Q Send-Q Local Address        Foreign Address  State   PID/Program name
    tcp   0      0      0.0.0.0:21           0.0.0.0:*        LISTEN  -
    tcp   0      0      0.0.0.0:50070        0.0.0.0:*        LISTEN  12921/java
    tcp   0      0      0.0.0.0:22           0.0.0.0:*        LISTEN  -
    tcp   0      0      10.103.0.67:9000     0.0.0.0:*        LISTEN  12921/java
    tcp   0      0      0.0.0.0:50090        0.0.0.0:*        LISTEN  13163/java
    tcp6  0      0      :::22                :::*             LISTEN  -
    tcp6  0      0      10.103.0.67:7077     :::*             LISTEN  25488/java
    tcp6  0      0      :::8080              :::*             LISTEN  25488/java
    tcp6  0      0      127.255.255.1:6066   :::*             LISTEN  25488/java
    udp   0      0      0.0.0.0:64062        0.0.0.0:*                -
    udp   0      0      0.0.0.0:68           0.0.0.0:*                -
    udp6  0      0      :::59067             :::*

Is it possible to make Spark listen on 0.0.0.0:7077? I want the slaves to connect to the master via its public IP in OpenStack.

Thanks for your kind help,
Max

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-possible-to-create-spark-cluster-in-different-network-tp24524.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
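For reference, a sketch of one workaround that is sometimes suggested for this kind of NAT setup: set the master's bind address in conf/spark-env.sh rather than exporting it in the calling shell. Whether binding to 0.0.0.0 works, and which variable name applies (SPARK_MASTER_IP in Spark 1.x, SPARK_MASTER_HOST in later versions), depends on the Spark version; the values below are placeholders, not tested settings.

    # conf/spark-env.sh on the master node (assumption: Spark 1.x naming)
    # Bind the master to all interfaces instead of the private 10.x address.
    export SPARK_MASTER_IP=0.0.0.0
    # On newer Spark versions the equivalent variable would be:
    # export SPARK_MASTER_HOST=0.0.0.0

    # Workers in the other network would then connect using the master's
    # floating/public IP (placeholder shown, substitute your own):
    #   ./sbin/start-slave.sh spark://<public-ip>:7077

Note that even if the bind succeeds, the master advertises the address it was started with to the workers, so cross-site clusters may still need the floating IP to be reachable and the security-group rules to allow port 7077.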