Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-07-02 Thread MEETHU MATHEW
The problem is resolved. I have added SPARK_LOCAL_IP=master on both slaves as well. When I changed this, my slaves started working.
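For reference, the setting involved normally lives in conf/spark-env.sh. A minimal hedged sketch (the path matches the install used elsewhere in this thread; the value is left as a placeholder, since choosing it correctly is exactly what this thread was about):

# hypothetical sketch: /usr/local/spark-1.0.0/conf/spark-env.sh on each slave
export SPARK_LOCAL_IP=<address this node should bind to>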
Thank you all for your suggestions
 
Thanks & Regards, 
Meethu M



Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-07-01 Thread Aaron Davidson
In your spark-env.sh, do you happen to set SPARK_PUBLIC_DNS or something of
that kind? This error suggests the worker is trying to bind a server on the
master's IP, which clearly doesn't make sense.
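A quick hedged way to check for such overrides on every node (the grep pattern and the path are assumptions, based on the install location mentioned further down the thread):

# hypothetical check: look for address-related settings in spark-env.sh on each node
grep -nE 'SPARK_(PUBLIC_DNS|LOCAL_IP|MASTER_IP)' /usr/local/spark-1.0.0/conf/spark-env.sh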




Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-07-01 Thread MEETHU MATHEW
Hi,

I did netstat -na | grep 192.168.125.174 and it's showing 192.168.125.174:7077 
LISTEN (after starting the master).

I tried to execute the following script from the slaves manually, but it ends up 
with the same exception and log. This script internally executes the java 
command.
 /usr/local/spark-1.0.0/sbin/start-slave.sh 1 spark://192.168.125.174:7077
In this case netstat does not show any connection established to master:7077.

When we manually execute the java command, the connection to the master is 
established.
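For completeness, a hedged sketch of that check on the slave side (the watch interval and the port filter are assumptions; the thread only says netstat was used):

# hypothetical: watch for connections to the master port while the worker is being started
watch -n1 'netstat -an | grep 7077'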

Thanks & Regards, 
Meethu M



Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-06-30 Thread Akhil Das
Are you sure you have this IP 192.168.125.174 bound on that machine? (netstat
-na | grep 192.168.125.174)
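A hedged sketch combining this with the ifconfig check suggested further down the thread (interface names and output format will vary):

# hypothetical: confirm the address is actually assigned to an interface, then look for listeners/connections on it
ifconfig | grep 192.168.125.174
netstat -na | grep 192.168.125.174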

Thanks
Best Regards




Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-06-30 Thread MEETHU MATHEW
Hi all,

I reinstalled Spark and rebooted the system, but I am still not able to start the 
workers. It's throwing the following exception:

Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to 
bind to: master/192.168.125.174:0

I suspect the problem is with 192.168.125.174:0. Even though the command contains 
master:7077, why is it showing 0 in the log?

java -cp 
::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
org.apache.spark.deploy.worker.Worker spark://master:7077
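Port 0 in a bind request normally just means "pick any free port", so the part that fails is the address: "Cannot assign requested address" indicates the slave is trying to bind a server socket to an IP it does not own. Since the command says spark://master:7077 while the error shows master/192.168.125.174:0, one quick hedged check (getent availability is an assumption) is what the name master resolves to on each slave:

# hypothetical: confirm how each slave resolves the name "master"
getent hosts master
grep -w master /etc/hosts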

Can somebody tell me a solution?
 
Thanks & Regards, 
Meethu M


On Friday, 27 June 2014 4:28 PM, MEETHU MATHEW  wrote:
 


Hi,
Yes, I tried setting another port also, but the same problem.
master is set in /etc/hosts
 
Thanks & Regards, 
Meethu M


On Friday, 27 June 2014 3:23 PM, Akhil Das  wrote:
 


That's strange. Did you try setting the master port to something else (using 
SPARK_MASTER_PORT)?

Also, you said you are able to start it from the java command line:

java -cp 
::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
org.apache.spark.deploy.worker.Worker spark://:master:7077


What is the master IP specified here? Do you have an entry for master in 
/etc/hosts? 


Thanks
Best Regards


On Fri, Jun 27, 2014 at 3:09 PM, MEETHU MATHEW  wrote:

Hi Akhil,
>
>
>I am running it in a LAN itself. The IP of the master is given correctly.
> 
>Thanks & Regards, 
>Meethu M
>

Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-06-27 Thread MEETHU MATHEW
Hi Akhil,

The IP is correct, and the workers start when we run it as a java command 
directly. It becomes 192.168.125.174:0 when we call it from the scripts.
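One hedged way to narrow this down is to pre-set the bind address in the same shell before calling the script (whether Spark's own environment files override an exported value is an assumption to verify):

# hypothetical: pre-set the bind address, then run the script that normally fails
export SPARK_LOCAL_IP=<this slave's own address>
/usr/local/spark-1.0.0/sbin/start-slave.sh 1 spark://192.168.125.174:7077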


 
Thanks & Regards, 
Meethu M



Re: org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-06-27 Thread Akhil Das
why is it binding to port 0? 192.168.125.174:0 :/

Check the IP address of that master machine (ifconfig); it looks like the IP
address has been changed (assuming you are running these machines on a LAN).

Thanks
Best Regards




org.jboss.netty.channel.ChannelException: Failed to bind to: master/1xx.xx..xx:0

2014-06-26 Thread MEETHU MATHEW
Hi all,

My Spark (standalone mode) was running fine till yesterday, but now I am getting 
the following exception when I run start-slaves.sh or start-all.sh:

slave3: failed to launch org.apache.spark.deploy.worker.Worker:
slave3:   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
slave3:   at java.lang.Thread.run(Thread.java:662)

The log files have the following lines:

14/06/27 11:06:30 INFO SecurityManager: Using Spark's default log4j profile: 
org/apache/spark/log4j-defaults.properties
14/06/27 11:06:30 INFO SecurityManager: Changing view acls to: hduser
14/06/27 11:06:30 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(hduser)
14/06/27 11:06:30 INFO Slf4jLogger: Slf4jLogger started
14/06/27 11:06:30 INFO Remoting: Starting remoting
Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to 
bind to: master/192.168.125.174:0
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
...
Caused by: java.net.BindException: Cannot assign requested address
...
I saw the same error reported before and have tried the following solutions.

I set the variable SPARK_LOCAL_IP and changed the SPARK_MASTER_PORT to a 
different number, but nothing is working.
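For reference, a hedged sketch of where those two settings go (conf/spark-env.sh on the relevant machine; the values shown are placeholders, not the ones that were tried):

# hypothetical: the standalone-mode knobs mentioned above, set in conf/spark-env.sh
export SPARK_LOCAL_IP=<address to bind this node to>
export SPARK_MASTER_PORT=<a port other than 7077>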

When I try to start the worker from the respective machines using the following 
java command, it runs without any exception:

java -cp 
::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
org.apache.spark.deploy.worker.Worker spark://:master:7077
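Since the same Worker class starts fine when launched directly, a hedged way to compare the two paths is to look at what the launch scripts layer on top of it (the file names below are the standard Spark 1.0 layout; their exact contents on this cluster are an assumption):

# hypothetical: inspect the environment and host list the scripts use before launching the Worker
cat /usr/local/spark-1.0.0/conf/spark-env.sh
cat /usr/local/spark-1.0.0/conf/slaves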


Can somebody please suggest a solution?
 
Thanks & Regards, 
Meethu M