Re: Unable to run applications on spark in standalone cluster mode

2015-11-02 Thread Rohith P
The contents of spark-env.sh are:
SPARK_MASTER_IP=marvin.spark.ins-01
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=8080
SPARK_WORKER_WEBUI_PORT=8081
SPARK_WORKER_INSTANCES=1
SPARK_LOCAL_IP=marvin.spark.ins-01



The contents of /etc/hosts are:
172.28.161.33   marvin.base.ins-01  ip-172-28-161-33

172.28.161.200  marvin.cassandra.ins-01 ip-172-28-161-200
172.28.161.201  marvin.cassandra.ins-02 ip-172-28-161-201

172.28.161.138  marvin.spark.ins-01 ip-172-28-161-138
172.28.161.139  marvin.spark.ins-02 ip-172-28-161-139
172.28.161.140  marvin.spark.ins-03 ip-172-28-161-140
172.28.161.141  marvin.spark.ins-04 ip-172-28-161-141
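
The spark hostnames resolve through the entries above; on any node this can
be double-checked with standard Linux tooling, e.g.:

    getent hosts marvin.spark.ins-01
    # should print: 172.28.161.138  marvin.spark.ins-01 ip-172-28-161-138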








Re: Unable to run applications on spark in standalone cluster mode

2015-11-01 Thread Akhil Das
Can you paste the contents of your spark-env.sh file? It would also be good
to have a look at the /etc/hosts file. The "cannot bind to the given IP
address" error can often be resolved by putting the hostname instead of the
IP address. Also make sure the configuration (conf directory) has the same
contents across your cluster.
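
Something along these lines (a sketch -- the hostname is whatever your
/etc/hosts defines for the master, and the worker names, ssh/rsync access,
and $SPARK_HOME being set on every node are all assumed):

    # conf/spark-env.sh -- bind the master to a resolvable hostname
    # instead of a raw IP
    SPARK_MASTER_IP=your-master-hostname

    # then push the same conf directory to every worker, e.g.:
    for h in worker1 worker2 worker3; do
      rsync -av $SPARK_HOME/conf/ $h:$SPARK_HOME/conf/
    done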

Thanks
Best Regards

On Mon, Oct 26, 2015 at 10:48 AM, Rohith P wrote:

> No.. the ./sbin/start-master.sh --ip option did not work... It is still the
> same error


Re: Unable to run applications on spark in standalone cluster mode

2015-10-25 Thread Rohith P
No.. the ./sbin/start-master.sh --ip option did not work... It is still the
same error







Re: Unable to run applications on spark in standalone cluster mode

2015-10-19 Thread Jean-Baptiste Onofré

Hi Rohith,

Do you have multiple interfaces on the machine hosting the master?

If so, can you try to force it to the public interface using:

sbin/start-master.sh --ip xxx.xxx.xxx.xxx
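
(To see which interfaces and addresses the machine has -- assuming a Linux
host -- something like:

    ip addr show    # or: ifconfig -a

and pick the address the workers can actually reach.)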

Regards
JB

On 10/19/2015 02:05 PM, Rohith Parameshwara wrote:

Hi all,

I am doing some experiments on a Spark standalone cluster
setup and I am facing the following issue:

I have a 4-node cluster setup. As per
http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually
I tried to start the cluster with the scripts, but the slaves did not
start and gave a permission denied error... My conf/slaves had a list of
the IP addresses of the slaves... But then I was able to start the worker
nodes by going to each slave machine and running the start-slave.sh script
with the master IP as parameter... The web UI showed all 3 worker nodes
running.

The application, when submitted to the cluster, shows up on all the worker
nodes in the web UI... But on the command line where the spark-submit
command was run, it gives this error periodically and continuously:

ERROR NettyTransport: failed to bind to /172.28.161.138:7077, shutting
down Netty transport

15/10/19 11:59:38 WARN Utils: Service 'sparkDriver' could not bind on
port 7077. Attempting port 7078.

15/10/19 11:59:38 WARN AppClient$ClientActor: Could not connect to
akka.tcp://sparkMaster@localhost:7077: akka.remote.InvalidAssociation:
Invalid address: akka.tcp://sparkMaster@localhost:7077...

My conf/spark-env.sh has:

SPARK_MASTER_IP=172.28.161.138

SPARK_MASTER_PORT=7077

SPARK_MASTER_WEBUI_PORT=8080

SPARK_WORKER_WEBUI_PORT=8081

SPARK_WORKER_INSTANCES=1

And I have also put this on all the slave nodes too...

The applications run fine in --master local mode, but with --master
spark://masterip:7077 they are not working...

Any help would be appreciated... Thanks in advance.



--
Jean-Baptiste Onofré
jbono...@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com




Unable to run applications on spark in standalone cluster mode

2015-10-19 Thread Rohith Parameshwara
Hi all,
I am doing some experiments on a Spark standalone cluster setup
and I am facing the following issue:
I have a 4-node cluster setup. As per
http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually
I tried to start the cluster with the scripts, but the slaves did not start
and gave a permission denied error... My conf/slaves had a list of the IP
addresses of the slaves... But then I was able to start the worker nodes by
going to each slave machine and running the start-slave.sh script with the
master IP as parameter... The web UI showed all 3 worker nodes running:
[inline screenshot: master web UI showing the 3 registered workers]
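
(I gather the launch scripts ssh into each host listed in conf/slaves, so
the permission denied presumably means passwordless SSH from the master to
the slaves was not set up. A minimal sketch of that setup, assuming one
common user account on all nodes and that the slaves are the other three
spark hosts listed earlier in the thread:

    # on the master, generate a key pair if none exists yet
    ssh-keygen -t rsa
    # install the public key on each host listed in conf/slaves
    ssh-copy-id 172.28.161.139
    ssh-copy-id 172.28.161.140
    ssh-copy-id 172.28.161.141
)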

The application, when submitted to the cluster, shows up on all the worker
nodes in the web UI... But on the command line where the spark-submit
command was run, it gives this error periodically and continuously:
ERROR NettyTransport: failed to bind to /172.28.161.138:7077, shutting down 
Netty transport
15/10/19 11:59:38 WARN Utils: Service 'sparkDriver' could not bind on port 
7077. Attempting port 7078.
15/10/19 11:59:38 WARN AppClient$ClientActor: Could not connect to 
akka.tcp://sparkMaster@localhost:7077: akka.remote.InvalidAssociation: Invalid 
address: akka.tcp://sparkMaster@localhost:7077...

My conf/spark-env.sh has:

SPARK_MASTER_IP=172.28.161.138
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=8080
SPARK_WORKER_WEBUI_PORT=8081
SPARK_WORKER_INSTANCES=1
And I have also put this on all the slave nodes too...

The applications run fine in --master local mode, but with --master
spark://masterip:7077 they are not working...
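
For reference, the submissions are of the form (application class and jar
names are placeholders):

    # works:
    ./bin/spark-submit --master local --class <MainClass> <app.jar>
    # fails with the errors above:
    ./bin/spark-submit --master spark://<masterip>:7077 --class <MainClass> <app.jar>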
Any help would be appreciated... Thanks in advance.