Thank you Akhil, will try this out. We are able to access the machines using the public IP, and even the private one, as they are on our subnet.
Thanks
Ankur

On Oct 9, 2014 12:41 AM, "Akhil Das" <ak...@sigmoidanalytics.com> wrote:

> You must be having those hostnames in your /etc/hosts file; if you are
> not able to access it using the hostnames, then you won't be able to
> access it with the IP address either, I believe.
> What are you trying to do here? Running your Eclipse locally and
> connecting to your EC2 cluster?
>
> Thanks
> Best Regards
>
> On Tue, Oct 7, 2014 at 3:36 AM, Ankur Srivastava <
> ankur.srivast...@gmail.com> wrote:
>
>> Hi,
>>
>> I have started a Spark cluster on EC2 using the Spark Standalone cluster
>> manager, but Spark is trying to identify the workers using
>> hostnames which are not accessible publicly.
>>
>> So when I try to submit jobs from Eclipse it fails. Is there some
>> way Spark can use IP addresses instead of hostnames?
>>
>> I have used IP addresses in the slaves file.
>>
>> Thanks
>> Ankur
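For reference, one common way to make a standalone worker bind to and advertise an IP instead of its internal hostname is to set SPARK_LOCAL_IP (and optionally SPARK_PUBLIC_DNS) in conf/spark-env.sh on each node. This is only a sketch; the addresses below are placeholders for your own cluster's IPs:

```shell
# conf/spark-env.sh on each worker node (addresses are placeholders)
export SPARK_LOCAL_IP=10.0.1.12       # private IP the Spark services bind to
export SPARK_PUBLIC_DNS=54.10.20.30   # address advertised to clients / in the web UI
```

When submitting from a remote driver (e.g. Eclipse), it may also help to set spark.driver.host on the driver's SparkConf to an address the workers can reach back to.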