I'm able to see the human-readable format.

The following is the output:

20000    UTF8    1344001730307    com.cisco.flume.FlumeTest    Sample info message
30000    UTF8    1344001730432    com.cisco.flume.FlumeTest    Sample warn message
40000    UTF8    1344001730463    com.cisco.flume.FlumeTest    Sample error message
50000    UTF8    1344001730479    com.cisco.flume.FlumeTest    Sample fatal message


I want the logging pattern below:

%d ==== thread: %t ==== %-8p>%n%c.%M() => %x (line: %L)%n\t%m%n%n

What changes are required to get this pattern to work?

Thanks
JP


On Mon, Jul 30, 2012 at 11:52 PM, Hari Shreedharan <
[email protected]> wrote:

You are using the AvroEventSerializer. This formats the event into the Avro
format specified by
org.apache.flume.serialization.FlumeEventAvroEventSerializer,
which is why it looks like garbage, even though it is not. Your app should be
written to read and understand the Avro format. If you need it to be human
readable, you will need to write your own serializer, perhaps by extending
the BodyTextEventSerializer.
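
To make the suggestion above concrete, here is a minimal, self-contained sketch of the formatting logic such a custom serializer could apply per event. It is not Flume API code (the real class would implement org.apache.flume.serialization.EventSerializer and do this inside its write method); the header names are taken from the Avro output shown later in this thread, and the output layout is just one hypothetical choice:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of per-event formatting a custom serializer could apply.
// Header names match those visible in the Avro output in this thread;
// the real implementation would live inside an EventSerializer's write(Event).
public class ReadableEventFormatter {

    // log4j level integers: 10000=DEBUG, 20000=INFO, 30000=WARN,
    // 40000=ERROR, 50000=FATAL
    private static final Map<String, String> LEVELS = new HashMap<>();
    static {
        LEVELS.put("10000", "DEBUG");
        LEVELS.put("20000", "INFO");
        LEVELS.put("30000", "WARN");
        LEVELS.put("40000", "ERROR");
        LEVELS.put("50000", "FATAL");
    }

    public static String format(Map<String, String> headers, String body) {
        String level = LEVELS.getOrDefault(
                headers.get("flume.client.log4j.log.level"), "UNKNOWN");
        String timestamp = headers.get("flume.client.log4j.timestamp");
        String logger = headers.get("flume.client.log4j.logger.name");
        return timestamp + " " + level + " " + logger + " - " + body;
    }

    public static void main(String[] args) {
        Map<String, String> headers = new HashMap<>();
        headers.put("flume.client.log4j.log.level", "20000");
        headers.put("flume.client.log4j.timestamp", "1344001730307");
        headers.put("flume.client.log4j.logger.name", "com.cisco.flume.FlumeTest");
        // prints: 1344001730307 INFO com.cisco.flume.FlumeTest - Sample info message
        System.out.println(format(headers, "Sample info message"));
    }
}
```

The same idea extends to any conversion pattern: pull the metadata out of the event headers, map the numeric level to its name, and lay the pieces out however you like before writing the line to the output stream.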

Thanks
Hari

-- 
Hari Shreedharan

 On Monday, July 30, 2012 at 9:34 AM, JP wrote:

Thanks Hari,

I made a little progress.

But I'm getting garbage values.

These are my configurations:

*flume-conf.properties*
---------------------------------------
agent2.sources = seqGenSrc
agent2.channels = memoryChannel
agent2.sinks = loggerSink

agent2.sources.seqGenSrc.type = avro
agent2.sources.seqGenSrc.bind=localhost
agent2.sources.seqGenSrc.port=41414

agent2.channels.memoryChannel.type = memory
agent2.channels.memoryChannel.capacity = 1000000
agent2.channels.memoryChannel.transactionCapacity = 1000000
agent2.channels.memoryChannel.keep-alive = 30

agent2.sources.seqGenSrc.channels = memoryChannel

agent2.sinks.loggerSink.type = hdfs
agent2.sinks.loggerSink.hdfs.path = hdfs://ip:portno/data/CspcLogs
agent2.sinks.loggerSink.hdfs.fileType = DataStream
agent2.sinks.loggerSink.channel = memoryChannel
agent2.sinks.loggerSink.serializer = avro_event
agent2.sinks.loggerSink.serializer.compressionCodec = snappy
agent2.sinks.loggerSink.serializer.syncIntervalBytes = 2048000
agent2.channels.memoryChannel.type = memory


log4j.properties
------------------------------------------------------------------------------
log4j.rootLogger=INFO, CA, flume

log4j.appender.CA=org.apache.log4j.ConsoleAppender

log4j.appender.CA.layout=org.apache.log4j.PatternLayout
log4j.appender.CA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n

log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414


and my output:
------------------------
Obj avro.codec null avro.schema�
{"type":"record","name":"Event","fields":[{"name":"headers","type":{"type":"map","values":"string"}},{"name":"body","type":"bytes"}]}�|
��(r5��q ��nl � 8flume.client.log4j.log.level
40000Fflume.client.log4j.message.encoding UTF88flume.client.log4j.timestamp
1343665387977<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
error message| ��(r5��q ��nl � 8flume.client.log4j.log.level
50000Fflume.client.log4j.message.encoding UTF88flume.client.log4j.timestamp
1343665387993<flume.client.log4j.logger.name2com.cisco.flume.FlumeTest�(Sample
fatal message| ��(r5��q ��nl � 8flume.client.log4j.log.level
20000Fflume.client.log4j.message.encoding UTF88flume.client.log4j.timestamp


Please let me know if I'm on the wrong path.

Please suggest how to get a custom logging pattern (for example, like in log4j).


Thanks
JP

On Sun, Jul 29, 2012 at 10:04 AM, Hari Shreedharan <
[email protected]> wrote:

 + user@

Thamatam,

The Log4J appender adds the date, log level and logger name to the flume
event headers and the text of the log event to the flume event body. The
reason the log level and time are missing is that these are in the headers
and the text serializer does not serialize the headers.
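
As a quick illustration of that split (this is not Flume API code, just a sketch): the Log4jAppender ships the level, timestamp, and logger name as event headers, while the body carries only the message text, so a body-only serializer necessarily drops the metadata:

```java
import java.util.HashMap;
import java.util.Map;

// Illustration only: a body-only serializer ignores the headers,
// which is exactly why level and time disappear from the output.
public class BodyOnlyDemo {
    public static String serializeBodyOnly(Map<String, String> headers, String body) {
        // level, timestamp, and logger name sit in 'headers'
        // but are never consulted here
        return body;
    }

    public static void main(String[] args) {
        Map<String, String> headers = new HashMap<>();
        headers.put("flume.client.log4j.log.level", "20000");      // INFO
        headers.put("flume.client.log4j.timestamp", "1343665387977");
        headers.put("flume.client.log4j.logger.name", "com.cisco.flume.FlumeTest");
        // prints only: HelloServlet of doGet started...
        System.out.println(serializeBodyOnly(headers, "HelloServlet of doGet started..."));
    }
}
```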

To write to a file or HDFS, please use a Serializer together with the
RollingFileSink or HDFSEventSink. Please take a look at the plain text
serializer or Avro serializer to understand this better.

Thanks,
Hari

-- 
Hari Shreedharan

 On Saturday, July 28, 2012 at 5:47 PM, thamatam Jayaprakash wrote:

  Hi Hari,


Actually, I'm unable to send this mail to the user and dev groups, so I'm
mailing you directly.

Could you please point out where I'm going wrong?
Please suggest which log appender I need to use to get a custom logging
pattern.

I'm working with Flume 1.1.0 and 1.2.0. We are not able to set a log pattern.
We are using the Log4jAppender:
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender

but we are getting plain text.

For example, if I log the following messages:

17:42:55,928  INFO SimpleJdbcServlet:69 - doGet of SimpleJdbcServlet
ended...
17:43:03,489  INFO HelloServlet:29 - HelloServlet of doGet started...
17:43:03,489  INFO HelloServlet:33 -
 Hello from Simple Servlet
17:43:03,489  INFO HelloServlet:35 - HelloServlet of doGet end...
17:47:46,000  INFO HelloServlet:29 - HelloServlet of doGet started...
17:47:46,001  INFO HelloServlet:33 -
 Hello from Simple Servlet
17:47:46,001  INFO HelloServlet:35 - HelloServlet of doGet end...

Using Flume in Hadoop, I'm getting only the following logs:

doGet of SimpleJdbcServlet ended...
HelloServlet of doGet started...

HelloServlet of doGet end...
HelloServlet of doGet started...

Thanks in advance.
-- 
JP



-- 
Jayaprakash





-- 
JP