How are you sending data to the Avro Source?
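That AvroRuntimeException usually means that whatever is writing to port 44444 is not speaking Avro RPC (for example netcat, telnet, or a raw socket write), so the source ends up decoding arbitrary bytes as an Avro array length. An Avro source expects an Avro RPC client, such as the one in the Flume client SDK. Here is a minimal sketch, assuming the flume-ng-sdk jar is on the classpath; the host name and the sample body are only placeholders:

import java.nio.charset.StandardCharsets;

import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

public class AvroSourceSender {
    public static void main(String[] args) throws EventDeliveryException {
        // Connect to the Avro source from the configuration below;
        // replace the host with wherever the tier1 agent is actually running.
        RpcClient client = RpcClientFactory.getDefaultInstance("hadoop-home.com", 44444);
        try {
            // Build one event with an arbitrary text body and hand it to the source.
            Event event = EventBuilder.withBody("hello flume", StandardCharsets.UTF_8);
            client.append(event);   // blocks until the source acknowledges the event
        } finally {
            client.close();         // shut down the underlying connection
        }
    }
}

The same client can also send several events per call with appendBatch() if you need more throughput.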
Thanks,
Hari

On Wed, May 13, 2015 at 7:38 PM, 鹰 <[email protected]> wrote:
> hi all,
> I want to set up Flume to send data to HDFS. My configuration file looks like this:
>
> tier1.sources=source1
> tier1.channels=channel1
> tier1.sinks=sink1
>
> tier1.sources.source1.type=avro
> tier1.sources.source1.bind=0.0.0.0
> tier1.sources.source1.port=44444
> tier1.sources.source1.channels=channel1
>
> tier1.channels.channel1.type=memory
> tier1.channels.channel1.capacity=10000
> tier1.channels.channel1.transactionCapacity=1000
> tier1.channels.channel1.keep-alive=30
>
> tier1.sinks.sink1.type=hdfs
> tier1.sinks.sink1.channel=channel1
> tier1.sinks.sink1.hdfs.path=hdfs://hadoop-home.com:9000/user/hadoop/
> tier1.sinks.sink1.hdfs.fileType=DataStream
> tier1.sinks.sink1.hdfs.writeFormat=Text
> tier1.sinks.sink1.hdfs.rollInterval=0
> tier1.sinks.sink1.hdfs.rollSize=10240
> tier1.sinks.sink1.hdfs.rollCount=0
> tier1.sinks.sink1.hdfs.idleTimeout=60
>
> When I start Flume with this configuration file and send data to port 44444, I get this error:
>
> org.apache.avro.AvroRuntimeException: Excessively large list allocation request detected: 154218761 items! Connection closed;
>
> Can anybody help me? Thanks.
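If you just want to confirm that the source, channel, and HDFS sink above are wired up correctly, the avro-client tool that ships with Flume can push a test file into the same port; the file path here is only an example:

flume-ng avro-client --host hadoop-home.com --port 44444 --filename /tmp/test.log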
