You can add an internal-IP-to-public-hostname mapping in your /etc/hosts
file; if your port forwarding is set up correctly, it shouldn't be a problem
after that.
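For illustration, such an entry might look like the following (both the
address and the hostname below are placeholders, not values from this
thread):

```
# /etc/hosts on each node: internal IP first, then the public hostname
10.0.0.5    ec2-54-0-0-1.compute-1.amazonaws.com
```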
Thanks
Best Regards
On Tue, Mar 31, 2015 at 9:18 AM, anny9699 anny9...@gmail.com wrote:
Hi,
For security reasons, we added a server between
Hi Akhil,
I tried editing the /etc/hosts on the master and on the workers, but it
doesn't seem to be working for me.
I tried adding a "hostname internal-ip" entry and it didn't work. I then
tried "internal-ip hostname" and it didn't work either. I guess I should
also edit the spark-env.sh file?
Thanks!
Did you try setting the SPARK_MASTER_IP parameter in spark-env.sh?
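For reference, a minimal sketch of what that setting might look like in
conf/spark-env.sh (the address below is a placeholder for the master's
internal IP):

```
# conf/spark-env.sh on the master (10.0.0.5 is a placeholder)
export SPARK_MASTER_IP=10.0.0.5
```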
On 31.3.2015. 19:19, Anny Chen wrote:
When you say you added "internal-ip hostname", were you able to ping any
of these from the machine?
You could try setting SPARK_LOCAL_IP on all machines. But make sure you
will actually be able to bind to the host/IP specified there.
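As a rough check of the "able to bind" condition, you can attempt a bind
from the shell via python3 before starting the worker. The 127.0.0.1 below
is just a known-good loopback placeholder; substitute the value you intend
to put in SPARK_LOCAL_IP:

```shell
# Try to bind a throwaway TCP socket to the candidate address.
# Substitute your intended SPARK_LOCAL_IP value for 127.0.0.1.
out=$(python3 - <<'EOF'
import socket

addr = "127.0.0.1"  # candidate SPARK_LOCAL_IP value (placeholder)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.bind((addr, 0))  # port 0: let the OS pick any free port
    print("bindable")
except OSError:
    print("not bindable")
finally:
    s.close()
EOF
)
echo "$out"
```

If this prints "not bindable", the address is not assigned to any local
interface and Spark will fail to bind to it as well.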
Thanks
Best Regards
On Tue, Mar 31, 2015 at 10:49 PM, Anny Chen wrote:
Thanks Petar and Akhil for the suggestion.
Actually I changed SPARK_MASTER_IP to the internal IP, deleted the
export SPARK_PUBLIC_DNS=xx line in spark-env.sh, and also edited
/etc/hosts as Akhil suggested, and now it is working! However I don't
know which change actually makes it