Thanks Mike,

Unfortunately, the following URLs are not working:

https://svn.apache.org/viewvc/incubator/flume/trunk/flume-ng-core/src/test/java/org/apache/flume/serialization/SyslogAvroEventSerializer.java?view=markup

https://svn.apache.org/viewvc/incubator/flume/trunk/flume-ng-core/src/test/java/org/apache/flume/serialization/

Thanks
JP


On Tue, Jul 31, 2012 at 4:03 AM, Mike Percy <[email protected]> wrote:

> Check out the source code for the appender to see which headers you need
> to write.
>
>
> https://github.com/apache/flume/blob/trunk/flume-ng-clients/flume-ng-log4jappender/src/main/java/org/apache/flume/clients/log4jappender/Log4jAppender.java#L96
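For reference, the headers visible in the Avro output pasted further down this thread suggest the appender sets at least the keys below. This is a sketch using plain Java stand-ins; the key names are taken from the pasted output, and the linked appender source is the authoritative list:

```java
import java.util.HashMap;
import java.util.Map;

public class Log4jAppenderHeaders {
    // Header keys as they appear in the Avro output pasted later in this
    // thread. Values here are illustrative samples from that same output.
    public static Map<String, String> exampleHeaders() {
        Map<String, String> headers = new HashMap<>();
        headers.put("flume.client.log4j.log.level", "40000");         // Log4j 1.x ERROR
        headers.put("flume.client.log4j.message.encoding", "UTF8");
        headers.put("flume.client.log4j.timestamp", "1343665387977");
        headers.put("flume.client.log4j.logger.name", "com.cisco.flume.FlumeTest");
        return headers;
    }

    public static void main(String[] args) {
        System.out.println(exampleHeaders());
    }
}
```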
>
> If you want to verify that the headers are being passed, try using a
> logger sink in your Flume agent for debugging purposes.
>
> If you want an example of writing an EventSerializer, I wrote up a little
> bit of info here:
> https://cwiki.apache.org/confluence/display/FLUME/Flume+1.x+Event+Serializers
>
> Regards,
> Mike
>
>
>
> On Mon, Jul 30, 2012 at 11:22 AM, Hari Shreedharan <
> [email protected]> wrote:
>
>> You are using the AvroEventSerializer. It formats the event into the Avro
>> format specified by
>> org.apache.flume.serialization.FlumeEventAvroEventSerializer,
>> which is why it looks like garbage even though it is not. Your app should
>> be written to read and understand the Avro format. If you need it to be
>> human-readable, you will need to write your own serializer, perhaps by
>> extending BodyTextEventSerializer.
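As a sketch of what such a serializer could emit, here is the formatting logic in plain Java, using a Map and a byte[] as stand-ins for Flume's Event. A real serializer would implement org.apache.flume.serialization.EventSerializer (or extend BodyTextEventSerializer) and write this line to its OutputStream; class and method names below are illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class HeaderAwareFormatter {
    // Renders one event as "timestamp [level] logger - body", pulling the
    // values from the headers that the Log4j appender sets. A real Flume
    // serializer would emit this string in its write(Event) method.
    public static String format(Map<String, String> headers, byte[] body) {
        String ts = headers.getOrDefault("flume.client.log4j.timestamp", "-");
        String level = headers.getOrDefault("flume.client.log4j.log.level", "-");
        String logger = headers.getOrDefault("flume.client.log4j.logger.name", "-");
        return ts + " [" + level + "] " + logger + " - "
                + new String(body, StandardCharsets.UTF_8);
    }
}
```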
>>
>> Thanks
>> Hari
>>
>> --
>> Hari Shreedharan
>>
>> On Monday, July 30, 2012 at 9:34 AM, JP wrote:
>>
>> Thanks Hari,
>>
>> I made a little progress, but I'm getting garbage values.
>>
>> This is my configuration:
>>
>> *flume-conf.properties*
>> ---------------------------------------
>> agent2.sources = seqGenSrc
>> agent2.channels = memoryChannel
>> agent2.sinks = loggerSink
>>
>> agent2.sources.seqGenSrc.type = avro
>> agent2.sources.seqGenSrc.bind=localhost
>> agent2.sources.seqGenSrc.port=41414
>>
>> agent2.channels.memoryChannel.type = memory
>> agent2.channels.memoryChannel.capacity = 1000000
>> agent2.channels.memoryChannel.transactionCapacity = 1000000
>> agent2.channels.memoryChannel.keep-alive = 30
>>
>> agent2.sources.seqGenSrc.channels = memoryChannel
>>
>> agent2.sinks.loggerSink.type = hdfs
>> agent2.sinks.loggerSink.hdfs.path = hdfs://ip:portno/data/CspcLogs
>> agent2.sinks.loggerSink.hdfs.fileType = DataStream
>> agent2.sinks.loggerSink.channel = memoryChannel
>> agent2.sinks.loggerSink.serializer = avro_event
>> agent2.sinks.loggerSink.serializer.compressionCodec = snappy
>> agent2.sinks.loggerSink.serializer.syncIntervalBytes = 2048000
>>
>>
>> log4j.properties
>>
>> ------------------------------------------------------------------------------
>> log4j.rootLogger=INFO, CA, flume
>>
>> log4j.appender.CA=org.apache.log4j.ConsoleAppender
>>
>> log4j.appender.CA.layout=org.apache.log4j.PatternLayout
>> log4j.appender.CA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
>>
>> log4j.appender.flume =
>> org.apache.flume.clients.log4jappender.Log4jAppender
>> log4j.appender.flume.Hostname = localhost
>> log4j.appender.flume.Port = 41414
>>
>>
>> and my output:
>> ------------------------
>> Obj avro.codec null avro.schema�
>> {"type":"record","name":"Event","fields":[{"name":"headers","type":{"type":"map","values":"string"}},{"name":"body","type":"bytes"}]}�|
>> ��(r5��q ��nl � 8flume.client.log4j.log.level
>> 40000Fflume.client.log4j.message.encoding
>> UTF88flume.client.log4j.timestamp
>> 1343665387977<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
>> error message| ��(r5��q ��nl � 8flume.client.log4j.log.level
>> 50000Fflume.client.log4j.message.encoding
>> UTF88flume.client.log4j.timestamp
>> 1343665387993<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
>> fatal message| ��(r5��q ��nl � 8flume.client.log4j.log.level
>> 20000Fflume.client.log4j.message.encoding
>> UTF88flume.client.log4j.timestamp
>>
>>
>> Please let me know if I'm on the wrong path.
>>
>> Please suggest how I can get a custom logging pattern (for example, like
>> in log4j).
>>
>>
>> Thanks
>> JP
>>
>> On Sun, Jul 29, 2012 at 10:04 AM, Hari Shreedharan <
>> [email protected]> wrote:
>>
>>  + user@
>>
>> Thamatam,
>>
>> The Log4j appender adds the date, log level, and logger name to the Flume
>> event headers, and the text of the log event to the Flume event body. The
>> log level and time are missing from your output because they live in the
>> headers, and the text serializer does not serialize headers.
>>
>> To write to a file or HDFS, please use a Serializer together with the
>> RollingFileSink or HDFSEventSink. Please take a look at the plain text
>> serializer or Avro serializer to understand this better.
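As a minimal example of pairing a sink with a serializer, a rolling-file sink writing plain text might be configured along these lines (agent, sink, and channel names plus the directory are illustrative; `text` is the built-in body-text serializer):

```properties
agent.sinks = fileSink
agent.sinks.fileSink.type = file_roll
agent.sinks.fileSink.channel = memoryChannel
agent.sinks.fileSink.sink.directory = /var/log/flume
agent.sinks.fileSink.sink.serializer = text
```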
>>
>> Thanks,
>> Hari
>>
>> --
>> Hari Shreedharan
>>
>> On Saturday, July 28, 2012 at 5:47 PM, thamatam Jayaprakash wrote:
>>
>> H
>>
>>
>>
>>
>>
>> --
>> JP
>>
>>
>>
>


-- 
JP
