Typically, Flume components avoid modifying the event data. In this case you
could write a custom serializer for the HDFS sink that reads the timestamp
from the event header and writes it out along with the event body; a rough
sketch follows. Do consider the two alternatives after it first, though.
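
A minimal sketch of such a serializer, assuming the standard Flume 1.x
EventSerializer/Builder interfaces; the package and class names
(com.example.TimestampedBodySerializer) are just placeholders:

  package com.example;

  import java.io.IOException;
  import java.io.OutputStream;

  import org.apache.flume.Context;
  import org.apache.flume.Event;
  import org.apache.flume.serialization.EventSerializer;

  public class TimestampedBodySerializer implements EventSerializer {

    private final OutputStream out;

    private TimestampedBodySerializer(OutputStream out) {
      this.out = out;
    }

    @Override public void afterCreate() throws IOException { }
    @Override public void afterReopen() throws IOException { }

    @Override
    public void write(Event event) throws IOException {
      // The "timestamp" header is normally set by the timestamp
      // interceptor; fall back to the wall clock if it is missing.
      String ts = event.getHeaders().get("timestamp");
      if (ts == null) {
        ts = Long.toString(System.currentTimeMillis());
      }
      out.write(ts.getBytes("UTF-8"));
      out.write('\t');
      out.write(event.getBody());
      out.write('\n');
    }

    @Override public void flush() throws IOException { out.flush(); }
    @Override public void beforeClose() throws IOException { }
    @Override public boolean supportsReopen() { return true; }

    public static class Builder implements EventSerializer.Builder {
      @Override
      public EventSerializer build(Context context, OutputStream out) {
        return new TimestampedBodySerializer(out);
      }
    }
  }

You would then point the sink at the Builder, e.g.
agent.sinks.k1.serializer = com.example.TimestampedBodySerializer$Builder
(with hdfs.fileType = DataStream), and make sure the "timestamp" header is
present on the events, e.g. via the timestamp interceptor on the source.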

1) Often the time at which the event occurred is more interesting than the
time at which Flume processed it, so it's better to put the timestamp into
the log event when the application generates it.

2) If you are interested in the time at which Flume processed the event, you
may not care much about minute- or second-level granularity. In that case you
can simply set up the HDFS sink to roll the file on HDFS every few minutes,
as in the configuration sketch below.
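
For example, a configuration along these lines rolls a new file every
5 minutes and puts the time into the directory path instead of into each
row (the agent/sink names and the HDFS path are placeholders; the
hdfs.* properties are the standard HDFS sink ones):

  a1.sinks.k1.type = hdfs
  a1.sinks.k1.hdfs.path = hdfs://namenode/flume/events/%Y-%m-%d/%H%M
  a1.sinks.k1.hdfs.fileType = DataStream
  a1.sinks.k1.hdfs.rollInterval = 300
  a1.sinks.k1.hdfs.rollSize = 0
  a1.sinks.k1.hdfs.rollCount = 0
  a1.sinks.k1.hdfs.round = true
  a1.sinks.k1.hdfs.roundValue = 5
  a1.sinks.k1.hdfs.roundUnit = minute

The %Y/%m/%H escapes in the path require a timestamp header on the event
(e.g. from the timestamp interceptor) or hdfs.useLocalTimeStamp = true.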

-roshan


On Mon, Jun 30, 2014 at 11:41 PM, Guillermo Ortiz <[email protected]>
wrote:

> Hello,
>
> I'm working with Flume 1.4 and DataStream, and I would like to add the
> timestamp to each row. I have gotten this to work with SequenceFiles, but
> I'm not sure whether it's possible with DataStream. I have been looking
> through the user guide and checking different options, but I have found
> nothing. Is there an option for this?
>

