Hi Flummers,

I copied this configuration from an online example to do some experiments with Flume:

a1.sources = r1
a1.sinks = k1 k2
a1.channels = c1

# Describe/configure the source

a1.sources.r1.type = exec
a1.sources.r1.shell = /bin/bash -c
a1.sources.r1.command = cat /Users/david/flume/testdata.txt
a1.sources.r1.interceptors = a
a1.sources.r1.interceptors.a.type = org.apache.flume.interceptor.TimestampInterceptor$Builder

# Describe the sinks
a1.sinks.k1.type = logger

a1.sinks.k2.type = hdfs
a1.sinks.k2.channel = c1
a1.sinks.k2.hdfs.fileType = DataStream
a1.sinks.k2.hdfs.batchSize = 2000
a1.sinks.k2.hdfs.rollCount = 5000
a1.sinks.k2.hdfs.rollSize = 1000000
a1.sinks.k2.hdfs.rollInterval = 10
a1.sinks.k2.hdfs.writeFormat = Text
# a1.sinks.k2.hdfs.path = /flume
a1.sinks.k2.hdfs.path = /flume/trades/%y-%m-%d/%H%M/%S

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
a1.sinks.k2.channel = c1
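
For reference, I start the agent with the standard flume-ng command, roughly like this (the config file name example.conf and the --conf directory are just what I use locally):

flume-ng agent --conf conf --conf-file example.conf --name a1 -Dflume.root.logger=INFO,console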



The logger sink is working fine, but the hdfs sink fails with the following error:

process failed
This is supposed to be overridden by subclasses

What could be the problem? Please help me.

-- 
Thanks,
Kishore.
