Hello Anandkumar,

Thank you for your time and also for the reply.

Now Flume is acting differently. I don't see any errors; in fact my log now
looks like this:

2015-08-06 22:51:14,426 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/GVLLCMTK03-67579944.pf
2015-08-06 23:12:15,026 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/KNTNCMTK01.SUDDENLINK.NET-7682.pf
2015-08-06 23:12:22,030 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/KNTNCMTK01.SUDDENLINK.NET-7682.pf
2015-08-06 23:13:47,570 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-34017296.pf.filepart
2015-08-06 23:13:57,076 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-67584072.pf
2015-08-06 23:14:03,581 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-67579944.pf
2015-08-06 23:23:23,348 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/PMLECMTK01.SUDDENLINK.NET-8426.pf.filepart


So the files are being deleted. But when I check my destination, i.e. the
HBase tables, I don't see any data. A couple of days ago I could clearly see
the files being deleted from the spool directory and the data arriving in my
HBase table, but now nothing shows up in the destination tables, and I don't
know why. Any suggestions?
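One thing that may explain this: with `deletePolicy = immediate`, the spooldir source deletes each file as soon as it has been read into the channel, so the deletes in the log only prove ingestion, not delivery. If the downstream Avro agent is unreachable, events sit in the memory channel and are lost when the agent restarts. Before anything else it may be worth confirming the collector's Avro port is reachable from the source host. A minimal sketch (hostname and port taken from the AVRO sink config below; adjust for your environment):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds, False otherwise."""
    try:
        # create_connection resolves the host and attempts a TCP connect;
        # any failure (DNS, refused, timeout) raises an OSError subclass.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (values from the sink config; run from the source host):
# port_open("sdldalplhdw02.suddenlink.cequel3.com", 40002)
```

If this returns False from the source host while the downstream agent is running, the earlier "Connection refused" points at the listener or a firewall rather than the spooldir source.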

Thanks in advance for your time and reply.

Regards,
Nik.

On Thu, Aug 6, 2015 at 11:43 PM, Anandkumar Lakshmanan <[email protected]>
wrote:

> Hi Nik,
>
> Please verify the firewall settings. It seems a firewall is blocking the
> connection.
>
> Thanks
> Anand.
>
>
>
> On 08/07/2015 02:07 AM, Nikhil Gs wrote:
>
> Hello Team,
>
> I am facing the error below intermittently, even after trying different
> port numbers. I have pasted my Flume config file along with the error.
>
> Thanks in advance.
>
>
> Below is my flume configuration
>
> #################################################
>
> # Please paste flume.conf here. Example:
>
> # Sources, channels, and sinks are defined per
> # agent name, in this case 'pnmtest2'.
> pnmtest2.sources  = SPOOL
> pnmtest2.channels = MemChanneltest2
> pnmtest2.sinks    = AVRO
>
> # For each source, channel, and sink, set
> # standard properties.
> pnmtest2.sources.SPOOL.type          = spooldir
> pnmtest2.sources.SPOOL.spoolDir      = /home/a.nikhill/pnm
> pnmtest2.sources.SPOOL.ignorePattern = \.*tmp$
> pnmtest2.sources.SPOOL.channels      = MemChanneltest2
> pnmtest2.sources.SPOOL.fileHeader    = true
> pnmtest2.sources.SPOOL.deletePolicy  = immediate
> pnmtest2.sources.SPOOL.consumeOrder  = oldest
> pnmtest2.sources.SPOOL.batchSize     = 100
>
> pnmtest2.sources.SPOOL.interceptors = time
> pnmtest2.sources.SPOOL.interceptors.time.type =
> org.apache.flume.interceptor.TimestampInterceptor$Builder
> pnmtest2.sources.SPOOL.deserializer  =
> com.sudnline.flume.WholeFileDeserializer$Builder
>
> pnmtest2.sinks.AVRO.type         = avro
> pnmtest2.sinks.AVRO.channel      = MemChanneltest2
> pnmtest2.sinks.AVRO.hostname = sdldalplhdw02.sudnline.cequel3.com
> pnmtest2.sinks.AVRO.port     = 40002
> pnmtest2.sinks.AVRO.batch-size = 100
> pnmtest2.sinks.AVRO.connect-timeout = 40000
>
>
> # pnmtest2.sinks.HDFS.type         = hdfs
> # pnmtest2.sinks.HDFS.channel      = MemChannel2
> # pnmtest2.sinks.HDFS.hdfs.path = /user/flume/poll/%Y/%m/%d/%H/
> # pnmtest2.sinks.HDFS.hdfs.fileType = DataStream
> # pnmtest2.sinks.HDFS.hdfs.writeFormat = Text
> # pnmtest2.sinks.HDFS.hdfs.batchSize = 100
> # pnmtest2.sinks.HDFS.hdfs.rollSize = 0
> # pnmtest2.sinks.HDFS.hdfs.rollCount = 1000
> # pnmtest2.sinks.HDFS.hdfs.rollInterval = 600
>
> # Other properties are specific to each type of
> # source, channel, or sink. In this case, we
> # specify the capacity of the memory channel.
>
> #pnmtest2.channels.MemChanneltest1.capacity = 10000
> #pnmtest2.channels.MemChanneltest1.type   = memory
>
> pnmtest2.channels.MemChanneltest2.capacity = 1000000
> pnmtest2.channels.MemChanneltest2.type   = memory
>
>
> Below is my error.
>
>> ERROR org.apache.flume.SinkRunner
>>>
>>> Unable to deliver event. Exception follows.
>>> org.apache.flume.EventDeliveryException: Failed to send events
>>>     at 
>>> org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:392)
>>>     at 
>>> org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
>>>     at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
>>>     at java.lang.Thread.run(Thread.java:745)
>>> Caused by: org.apache.flume.FlumeException: NettyAvroRpcClient { host: 
>>> sdldalplhdw02.suddenlink.cequel3.com, port: 40002 }: RPC connection error
>>>     at 
>>> org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:182)
>>>     at 
>>> org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:121)
>>>     at 
>>> org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:638)
>>>     at 
>>> org.apache.flume.api.RpcClientFactory.getInstance(RpcClientFactory.java:89)
>>>     at org.apache.flume.sink.AvroSink.initializeRpcClient(AvroSink.java:127)
>>>     at 
>>> org.apache.flume.sink.AbstractRpcSink.createConnection(AbstractRpcSink.java:211)
>>>     at 
>>> org.apache.flume.sink.AbstractRpcSink.verifyConnection(AbstractRpcSink.java:272)
>>>     at 
>>> org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:349)
>>>     ... 3 more
>>> Caused by: java.io.IOException: Error connecting to 
>>> sdldalplhdw02.suddenlink.cequel3.com/10.48.210.244:40002
>>>     at 
>>> org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:292)
>>>     at 
>>> org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:206)
>>>     at 
>>> org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:155)
>>>     at 
>>> org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:168)
>>>     ... 10 more
>>> Caused by: java.net.ConnectException: Connection refused
>>>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>>     at 
>>> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
>>>     at 
>>> org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:148)
>>>     at 
>>> org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:104)
>>>     at 
>>> org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:78)
>>>     at 
>>> org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
>>>     at 
>>> org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:41)
>>>     at 
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>     at 
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>
>>>
>>>
>>>
> Regards,
> Nik.
>
>
>
