Are you sitting behind a firewall and accessing a remote master machine? In
that case, have a look at
http://spark.apache.org/docs/latest/configuration.html#networking; you
might want to set a few properties such as spark.driver.host and
spark.driver.port.
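
For example, you could set them when launching the shell (a rough sketch;
the port numbers are arbitrary placeholders, and the host must be an address
of your machine that the master and workers can reach back to):

MASTER=spark://hadoopm0:7077 spark-shell \
  --conf spark.driver.host=<your-reachable-ip> \
  --conf spark.driver.port=51000 \
  --conf spark.blockManager.port=51001

Then make sure your firewall allows inbound connections on those ports. The
same properties can also go in conf/spark-defaults.conf.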

Thanks
Best Regards

On Mon, Aug 3, 2015 at 7:46 AM, Angel Angel <areyouange...@gmail.com> wrote:

> Hello Sir,
>
> I have installed Spark.
>
> The local spark-shell is working fine.
>
> But whenever I try the master configuration, I get errors.
>
> When I run this command:
>
> MASTER=spark://hadoopm0:7077 spark-shell
>
> I get errors like:
>
> 15/07/27 21:17:26 INFO AppClient$ClientActor: Connecting to master
> spark://hadoopm0:7077...
>
> 15/07/27 21:17:46 ERROR SparkDeploySchedulerBackend: Application has been
> killed. Reason: All masters are unresponsive! Giving up.
>
> 15/07/27 21:17:46 WARN SparkDeploySchedulerBackend: Application ID is not
> initialized yet.
>
> 15/07/27 21:17:46 ERROR TaskSchedulerImpl: Exiting due to error from
> cluster scheduler: All masters are unresponsive! Giving up.
>
> I have also attached a screenshot of the Master UI.
>
> I have also tested with the telnet command:
>
> It shows that hadoopm0 accepts the connection.
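>
> (i.e., a check along the lines of "telnet hadoopm0 7077", using the master
> port from the command above.)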
>
> Can you please give me some references or documentation on how to solve
> this issue?
>
> Thanks in advance.
>
> Thank you,
>
>
