Can you paste the contents of your spark-env.sh file? It would also be good
to have a look at your /etc/hosts file. The "Cannot bind to the given ip
address" error can often be resolved by using the hostname instead of the IP
address. Also make sure the configuration (conf directory) has the same
contents across all nodes of your cluster.
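
For example, a minimal sketch (the hostname and addresses below are
placeholders; substitute your own machines):

    # conf/spark-env.sh on every node:
    # bind the master to a resolvable hostname rather than a raw IP
    # (the variable is SPARK_MASTER_IP on Spark 1.x; newer releases
    # use SPARK_MASTER_HOST)
    SPARK_MASTER_IP=spark-master

    # /etc/hosts on every node, mapping that hostname consistently:
    192.168.1.10   spark-master
    192.168.1.11   spark-worker-1

With that in place, ./sbin/start-master.sh should pick up the hostname from
spark-env.sh without needing any --ip flag on the command line.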

Thanks
Best Regards

On Mon, Oct 26, 2015 at 10:48 AM, Rohith P <rparameshw...@couponsinc.com>
wrote:

> No, the ./sbin/start-master.sh --ip option did not work... It is still the
> same error
