RE: Bind exception while running FlumeEventCount

2014-11-11 Thread Jeniba Johnson
Hi Hari

Yes, I started a Flume agent to push data to the relevant port. The Flume
configuration files are below.

Test21.conf

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.bind = localhost
a1.sources.r1.port = 2323

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
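(A quick way to confirm the Avro source above is actually accepting connections on localhost:2323 is a plain TCP probe. This sketch is generic Python, not Flume-specific, and `port_is_listening` is just an illustrative helper name.)

```python
import socket

def port_is_listening(host, port, timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Demonstrate against a throwaway listener on an ephemeral port,
    # since a real Flume agent may not be running here.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))      # port 0 -> the OS picks a free port
    srv.listen(1)
    _, port = srv.getsockname()
    print(port_is_listening("127.0.0.1", port))  # True: listener exists
    srv.close()
    # e.g. port_is_listening("localhost", 2323) would probe the Avro source
```
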

The command used is
bin/flume-ng agent -n a1 -c conf -f conf/test21.conf 
-Dflume.root.logger=INFO,console

Test12.conf

agent1.sources = seqGenSrc
agent1.sinks = avrosink
agent1.channels = memoryChannel

agent1.sources.seqGenSrc.type = exec
agent1.sources.seqGenSrc.command = tail -f /home/huser/access.log
agent1.sources.seqGenSrc.batch-size = 1

agent1.sinks.avrosink.type = avro
agent1.sinks.avrosink.hostname = localhost
agent1.sinks.avrosink.port = 2323
agent1.sinks.avrosink.batch-size = 100
agent1.sinks.avrosink.connect-timeout = 6
agent1.sinks.avrosink.request-timeout = 6

agent1.channels.memoryChannel.type = memory
agent1.channels.memoryChannel.capacity = 1000
agent1.channels.memoryChannel.transactionCapacity = 100

agent1.sources.seqGenSrc.channels = memoryChannel
agent1.sinks.avrosink.channel = memoryChannel


The command used is
bin/flume-ng agent -n agent1 -c conf -f conf/test12.conf 
-Dflume.root.logger=DEBUG,console

Even after changing the port several times, I am still facing the same issue.
Kindly look into my conf files and let me know the correct steps.


Regards,
Jeniba Johnson
From: Hari Shreedharan [mailto:hshreedha...@cloudera.com]
Sent: Tuesday, November 11, 2014 1:06 PM
To: Jeniba Johnson
Cc: dev@spark.apache.org
Subject: Re: Bind exception while running FlumeEventCount

Did you start a Flume agent to push data to the relevant port?

Thanks,
Hari


On Fri, Nov 7, 2014 at 2:05 PM, Jeniba Johnson 
jeniba.john...@lntinfotech.com wrote:

Hi,

I have installed spark-1.1.0 and apache flume 1.4 to run the streaming example
FlumeEventCount. Previously the code was working fine; now I am facing the
issues mentioned below. My Flume agent is running properly and is able to
write the file.

The command I use is

bin/run-example org.apache.spark.examples.streaming.FlumeEventCount 
172.29.17.178 65001


14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Stopping receiver with 
message: Error starting receiver 0: org.jboss.netty.channel.ChannelException: 
Failed to bind to: /172.29.17.178:65001
14/11/07 23:19:23 INFO flume.FlumeReceiver: Flume receiver stopped
14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Called receiver onStop
14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Deregistering receiver 0
14/11/07 23:19:23 ERROR scheduler.ReceiverTracker: Deregistered receiver for 
stream 0: Error starting receiver 0 - org.jboss.netty.channel.ChannelException: 
Failed to bind to: /172.29.17.178:65001
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
at org.apache.avro.ipc.NettyServer.init(NettyServer.java:106)
at org.apache.avro.ipc.NettyServer.init(NettyServer.java:119)
at org.apache.avro.ipc.NettyServer.init(NettyServer.java:74)
at org.apache.avro.ipc.NettyServer.init(NettyServer.java:68)
at 
org.apache.spark.streaming.flume.FlumeReceiver.initServer(FlumeInputDStream.scala:164)
at 
org.apache.spark.streaming.flume.FlumeReceiver.onStart(FlumeInputDStream.scala:171)
at 
org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
at 
org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
at 
org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
at 
org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
at 
org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
at 
org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
at org.apache.spark.scheduler.Task.run(Task.scala:54)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:344)
at sun.nio.ch.Net.bind(Net.java:336)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:199)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)

Re: Bind exception while running FlumeEventCount

2014-11-10 Thread Hari Shreedharan
Looks like that port is not available because another app is using that port. 
Can you take a look at netstat -a and use a port that is free?
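(The underlying failure, "Address already in use", can be reproduced directly: binding a second socket to a port that already has a listener raises the same errno that Netty wraps in the trace above. A minimal sketch with plain Python sockets, not Flume:)

```python
import errno
import socket

# Grab a port by binding and listening on it, then try to bind a second
# socket to the same address -- the second bind fails exactly like the
# Avro/Netty server does in the stack trace above.
first = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
first.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
first.listen(1)
host, port = first.getsockname()

second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind((host, port))
except OSError as e:
    print(e.errno == errno.EADDRINUSE)  # True: "Address already in use"
finally:
    second.close()
    first.close()
```
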


Thanks,
Hari

On Fri, Nov 7, 2014 at 2:05 PM, Jeniba Johnson
jeniba.john...@lntinfotech.com wrote:

 Hi,
 I have installed spark-1.1.0 and apache flume 1.4 to run the streaming
 example FlumeEventCount. Previously the code was working fine; now I am
 facing the issues mentioned below. My Flume agent is running properly and is
 able to write the file.
 The command I use is
 bin/run-example org.apache.spark.examples.streaming.FlumeEventCount 
 172.29.17.178  65001
 14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Stopping receiver 
 with message: Error starting receiver 0: 
 org.jboss.netty.channel.ChannelException: Failed to bind to: 
 /172.29.17.178:65001
 14/11/07 23:19:23 INFO flume.FlumeReceiver: Flume receiver stopped
 14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Called receiver onStop
 14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Deregistering 
 receiver 0
 14/11/07 23:19:23 ERROR scheduler.ReceiverTracker: Deregistered receiver for 
 stream 0: Error starting receiver 0 - 
 org.jboss.netty.channel.ChannelException: Failed to bind to: 
 /172.29.17.178:65001
 at 
 org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
 at org.apache.avro.ipc.NettyServer.init(NettyServer.java:106)
 at org.apache.avro.ipc.NettyServer.init(NettyServer.java:119)
 at org.apache.avro.ipc.NettyServer.init(NettyServer.java:74)
 at org.apache.avro.ipc.NettyServer.init(NettyServer.java:68)
 at 
 org.apache.spark.streaming.flume.FlumeReceiver.initServer(FlumeInputDStream.scala:164)
 at 
 org.apache.spark.streaming.flume.FlumeReceiver.onStart(FlumeInputDStream.scala:171)
 at 
 org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
 at 
 org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
 at 
 org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
 at 
 org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
 at 
 org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
 at 
 org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
 at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
 at org.apache.spark.scheduler.Task.run(Task.scala:54)
 at 
 org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
 at 
 java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at 
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:722)
 Caused by: java.net.BindException: Address already in use
 at sun.nio.ch.Net.bind0(Native Method)
 at sun.nio.ch.Net.bind(Net.java:344)
 at sun.nio.ch.Net.bind(Net.java:336)
 at 
 sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:199)
 at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
 at 
 org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
 at 
 org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:366)
 at 
 org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:290)
 at 
 org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
 ... 3 more
 14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Stopped receiver 0
 14/11/07 23:19:23 INFO receiver.BlockGenerator: Stopping BlockGenerator
 14/11/07 23:19:23 INFO util.RecurringTimer: Stopped timer for BlockGenerator 
 after time 1415382563200
 14/11/07 23:19:23 INFO receiver.BlockGenerator: Waiting for block pushing 
 thread
 14/11/07 23:19:23 INFO receiver.BlockGenerator: Pushing out the last 0 blocks
 14/11/07 23:19:23 INFO receiver.BlockGenerator: Stopped block pushing thread
 14/11/07 23:19:23 INFO receiver.BlockGenerator: Stopped BlockGenerator
 14/11/07 23:19:23 INFO receiver.ReceiverSupervisorImpl: Waiting for executor 
 stop is over
 14/11/07 23:19:23 ERROR receiver.ReceiverSupervisorImpl: Stopped executor 
 with error: org.jboss.netty.channel.ChannelException: Failed to bind to: 
 /172.29.17.178:65001
 14/11/07 23:19:23 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 
 (TID 0)
 org.jboss.netty.channel.ChannelException: Failed to bind to: 
 /172.29.17.178:65001
 at 
 org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
 at org.apache.avro.ipc.NettyServer.init(NettyServer.java:106)
 at org.apache.avro.ipc.NettyServer.init(NettyServer.java:119)
 at 

RE: Bind exception while running FlumeEventCount

2014-11-10 Thread Jeniba Johnson
Hi Hari

Just to give you some background, I installed spark-1.1.0 and apache flume 1.4
with the basic configurations needed. I just wanted to know whether this is
the correct way to run the Spark streaming examples with Flume.

As for the TIME_WAIT parameter you mentioned, I did not understand it exactly.
I am attaching a screenshot so that you can help me with it; it shows the
ports listening after the program is executed.


Regards,
Jeniba Johnson

-Original Message-
From: Hari Shreedharan [mailto:hshreedha...@cloudera.com]
Sent: Tuesday, November 11, 2014 11:04 AM
To: Jeniba Johnson
Cc: dev@spark.apache.org
Subject: RE: Bind exception while running FlumeEventCount

The socket may have been in TIME_WAIT. Can you try after a bit? The error 
message definitely suggests that some other app is listening on that port.


Thanks,
Hari

On Mon, Nov 10, 2014 at 9:30 PM, Jeniba Johnson 
jeniba.john...@lntinfotech.com wrote:

 Hi Hari
 Thanks for your kind reply
 Even after killing the process ID bound to the specific port, I am still
 facing a similar error.
 The commands I use are
 sudo lsof -i -P | grep -i listen
 kill -9 <PID>
 However, even if I use a port that is free, the error remains the same.
 Regards,
 Jeniba Johnson
 From: Hari Shreedharan [mailto:hshreedha...@cloudera.com]
 Sent: Tuesday, November 11, 2014 4:41 AM
 To: Jeniba Johnson
 Cc: dev@spark.apache.org
 Subject: Re: Bind exception while running FlumeEventCount

 Looks like that port is not available because another app is using that port.
 Can you take a look at netstat -a and use a port that is free?
 Thanks,
 Hari
 On Fri, Nov 7, 2014 at 2:05 PM, Jeniba Johnson
 jeniba.john...@lntinfotech.com wrote:
 [...snip: original message and stack trace, quoted in full earlier in the thread...]

RE: Bind exception while running FlumeEventCount

2014-11-10 Thread Hari Shreedharan
First, can you try a different port?




TIME_WAIT is essentially a timeout during which a closed socket is fully
decommissioned before its port becomes available for binding again. If you
still see a startup issue after waiting a few minutes, can you also send the
error logs? From what I can see, the port seems to be in use.
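(For context on the TIME_WAIT point: servers usually sidestep the TIME_WAIT rebind problem with the SO_REUSEADDR socket option, which lets a new listener bind a port whose previous owner's connections are still draining. Whether the Flume/Spark versions in this thread set it is not shown here, so treat this as a generic Python illustration of the option, not a description of their behavior.)

```python
import socket

# SO_REUSEADDR lets a listener re-bind a port whose previous owner's
# connections may still be in TIME_WAIT. Without it, a quick restart of a
# server on the same port can fail with "Address already in use".
def make_listener(host, port):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((host, port))
    s.listen(5)
    return s

srv = make_listener("127.0.0.1", 0)   # port 0 -> the OS assigns a free port
host, port = srv.getsockname()
srv.close()
srv = make_listener(host, port)       # immediate re-bind succeeds
print(srv.getsockname()[1] == port)   # True
srv.close()
```
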


Thanks,
Hari

Re: Bind exception while running FlumeEventCount

2014-11-10 Thread Hari Shreedharan
Did you start a Flume agent to push data to the relevant port?


Thanks,
Hari

On Fri, Nov 7, 2014 at 2:05 PM, Jeniba Johnson
jeniba.john...@lntinfotech.com wrote:

 [...snip: original message and stack trace, quoted in full earlier in the thread...]

RE: Bind exception while running FlumeEventCount

2014-11-10 Thread Jeniba Johnson
Hi Hari

Meanwhile, I am trying a different port. I also need to confirm the Spark and
Flume installation with you.
For installation, I just unzipped spark-1.1.0-bin-hadoop1.tar.gz and
apache-flume-1.4.0-bin.tar.gz to run the Spark streaming examples.
Is this the correct way, or is there another way? Please let me know.

Awaiting your kind reply.

Regards,
Jeniba Johnson
From: Hari Shreedharan [mailto:hshreedha...@cloudera.com]
Sent: Tuesday, November 11, 2014 12:41 PM
To: Jeniba Johnson
Cc: dev@spark.apache.org
Subject: RE: Bind exception while running FlumeEventCount

[...snip: quoted message and earlier thread, shown in full above...]