I think I found the cause: one of the lines is larger than the configured 
limit. I tried to set flume.event.max.size.bytes on both the agent node and 
the collector node, but the system doesn't seem to pick up the value:
====
  <property>
    <name>flume.event.max.size.bytes</name>
    <value>2076150</value>
    <description>The maximum length of an event, in bytes.</description>
  </property>
====
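
In case it helps anyone spot my mistake, here is a minimal sketch of how I 
understand the override should look in conf/flume-site.xml (the file name and 
the surrounding <configuration> element are my assumption for 0.9.x; as far 
as I know the daemons only read this file at startup, so both the agent and 
the collector would need a restart for the new limit to take effect):
====
<?xml version="1.0"?>
<configuration>
  <!-- Assumed to live in conf/flume-site.xml on BOTH the agent node and
       the collector node; Flume 0.9.x loads site overrides at daemon
       startup, so restart the nodes after editing. -->
  <property>
    <name>flume.event.max.size.bytes</name>
    <value>2076150</value>
    <description>The maximum length of an event, in bytes.</description>
  </property>
</configuration>
====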

Am I doing something wrong?

Thanks

Vic
________________________________
From: Huang, Zijian(Victor) [mailto:zijian.hu...@etrade.com]
Sent: Monday, September 26, 2011 12:15 PM
To: flume-user@incubator.apache.org
Subject: Flume agent repeatedly streaming the same content forever (Version: 
0.9.3, r)

Hi,
   I have encountered this problem with Flume twice. The Flume agent just 
keeps sending the same log file to the collector again and again, eventually 
filling up all the disk space on the collector host. Do you know what exactly 
causes Flume to lose track of its position in the file and keep re-streaming? 
I saw it happen when I tried to stream some binary logs, and I saw it happen 
again today with normal logs (which may contain some binary data). I can 
replicate the problem easily. I am using "tail" to stream the content over.

Please let me know what the potential causes are.

Thanks

Vic

