Re: Getting java.net.BindException when attempting to start Spark master on EC2 node with public IP

2015-07-28 Thread Steve Loughran
Try looking at the causes and steps here:

https://wiki.apache.org/hadoop/BindException



On 28 Jul 2015, at 09:22, Wayne Song <wayne.e.s...@gmail.com> wrote:





Re: Getting java.net.BindException when attempting to start Spark master on EC2 node with public IP

2015-07-28 Thread Wayne Song
I posted this message via the Nabble web interface and included the stack
trace there, but it apparently didn't come through in the emails.  Anyway,
here's the stack trace:

15/07/27 17:04:09 ERROR NettyTransport: failed to bind to /54.xx.xx.xx:7093, shutting down Netty transport
Exception in thread "main" java.net.BindException: Failed to bind to: /54.xx.xx.xx:7093: Service 'sparkMaster' failed after 16 retries!
	at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
	at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
	at scala.util.Try$.apply(Try.scala:161)
	at scala.util.Success.map(Try.scala:206)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
	at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
	at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
	at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
	at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

I'm using Spark 1.4.0.

Binding to 0.0.0.0 works, but then workers can't connect to the Spark
master, because when you start a worker you have to give it the Spark
master URL in the form spark://<hostname>:7077.  My understanding is that
because of Akka, you have to connect using the exact hostname that the
Spark master was bound to; thus, you can't bind to 0.0.0.0 on the Spark
master machine and then connect to spark://54.xx.xx.xx:7077.
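For anyone hitting this later: the EC2-specific wrinkle is that a public IP is NAT-mapped by AWS and is never assigned to any interface on the instance itself, so no process can bind to it. A minimal Python sketch of the same failure mode (a hypothetical helper, not Spark code; 192.0.2.1, a reserved TEST-NET-1 address, stands in for the unassigned public IP):

```python
import socket

def try_bind(host, port=0):
    """Attempt to bind a TCP socket to (host, port); port 0 lets the OS
    pick any free port, so only the address itself is being tested."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return True
    except OSError:
        # On Linux this is "Cannot assign requested address" -- the same
        # condition the JVM surfaces as java.net.BindException.
        return False
    finally:
        s.close()

# Binding to the wildcard (or any locally assigned address) succeeds...
assert try_bind("0.0.0.0")
# ...but binding to an address no local interface carries fails,
# which is exactly what happens with a NAT-mapped EC2 public IP.
assert not try_bind("192.0.2.1")
```

This is why binding the private address works while the public one does not: only the private address actually exists on the instance.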

On Tue, Jul 28, 2015 at 6:15 AM, Ted Yu <yuzhih...@gmail.com> wrote:



Re: Getting java.net.BindException when attempting to start Spark master on EC2 node with public IP

2015-07-28 Thread Ted Yu
Can you show the full stack trace?

Which Spark release are you using?

Thanks



> On Jul 27, 2015, at 10:07 AM, Wayne Song <wayne.e.s...@gmail.com> wrote:

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Getting java.net.BindException when attempting to start Spark master on EC2 node with public IP

2015-07-28 Thread Akhil Das
Did you try binding to 0.0.0.0?
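If binding to 0.0.0.0 (or the private address) works, the master can still be advertised under its public name. A sketch of the relevant conf/spark-env.sh settings; the variable names come from Spark's standalone-mode configuration, but the values here are placeholders for illustration:

```shell
# conf/spark-env.sh -- illustrative values only
SPARK_MASTER_IP=10.0.xx.xx    # bind the master to the instance's private address
SPARK_PUBLIC_DNS=ec2-54-xx-xx-xx.compute-1.amazonaws.com  # name advertised to workers and the web UI
```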

Thanks
Best Regards

On Mon, Jul 27, 2015 at 10:37 PM, Wayne Song <wayne.e.s...@gmail.com> wrote:



Getting java.net.BindException when attempting to start Spark master on EC2 node with public IP

2015-07-27 Thread Wayne Song
Hello,

I am trying to start a Spark master for a standalone cluster on an EC2 node. 
The CLI command I'm using looks like this:



Note that I'm specifying the --host argument; I want my Spark master to be
listening on a specific IP address.  The host that I'm specifying (i.e.
54.xx.xx.xx) is the public IP for my EC2 node; I've confirmed that nothing
else is listening on port 7077 and that my EC2 security group has all ports
open.  I've also double-checked that the public IP is correct.

When I use --host 54.xx.xx.xx, I get the following error message:



This does not occur if I leave out the --host argument and it doesn't occur
if I use --host 10.0.xx.xx, where 10.0.xx.xx is my private EC2 IP address.

Why would Spark fail to bind to a public EC2 address?
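A process can only bind to addresses the kernel has assigned to a local interface, and on EC2 the public IP lives on Amazon's NAT layer rather than on the instance, so it never appears locally. A quick hypothetical check (not part of Spark) to see which IPv4 addresses are plausibly bindable on a node; note that resolving the hostname is a heuristic and may not enumerate every interface:

```python
import socket

def bindable_ipv4_addresses():
    """IPv4 addresses this host's kernel will likely accept in bind():
    loopback and the wildcard, plus whatever the hostname resolves to."""
    addrs = {"127.0.0.1", "0.0.0.0"}
    try:
        for info in socket.getaddrinfo(socket.gethostname(), None, socket.AF_INET):
            addrs.add(info[4][0])
    except socket.gaierror:
        pass  # hostname may not resolve; loopback/wildcard still apply
    return addrs

print(sorted(bindable_ipv4_addresses()))
```

On an EC2 node this list contains the 10.0.xx.xx private address but not the 54.xx.xx.xx public one, which is why the public bind fails.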



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Getting-java-net-BindException-when-attempting-to-start-Spark-master-on-EC2-node-with-public-IP-tp24011.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
