Can you please post your log (flume.log)?

Thanks & Regards,
Ashutosh Sharma

From: prabhu k [mailto:prabhu.fl...@gmail.com]
Sent: Thursday, August 23, 2012 4:58 PM
To: user@flume.apache.org
Subject: Re: Unable to sink HDFS using tail source.

Hi Ashutosh,

Thanks for the response.

I have verified the specified HDFS location using the bin/hadoop fs -ls /user 
command, but Flume is not writing the data into it.

Can you please clarify the below line in my flume.conf file?

agent1.sinks.HDFS.hdfs.path = hdfs://10.10.12.100:54310/user/%{host}

Please let me know if I have missed anything.

Thanks,
Prabhu.

On Thu, Aug 23, 2012 at 1:17 PM, ashutosh (Open Platform Development Team) 
<sharma.ashut...@kt.com> wrote:
Hi Prabhu,

Please check whether Flume is storing data in the specified HDFS location. If it 
is, then it is doing its job and you need not worry about Flume's behavior.
Thanks & Regards,
Ashutosh Sharma

From: prabhu k [mailto:prabhu.fl...@gmail.com]
Sent: Thursday, August 23, 2012 4:12 PM
To: user@flume.apache.org
Subject: Re: Unable to sink HDFS using tail source.

Can anyone help me with the below issue? I would appreciate an early response.
On Wed, Aug 22, 2012 at 3:13 PM, prabhu k 
<prabhu.fl...@gmail.com> wrote:
Hi Users,

I have followed the below link 
http://cloudfront.blogspot.in/2012/06/how-to-use-host-escape-sequence-in.html 
to sink a sample text file to HDFS using a tail source.

I have executed flume-ng using the below command, but it seems to have got 
stuck. I have attached the flume.conf file and the running script log.

bin/flume-ng agent -n agent1 -c /conf -f conf/flume.conf

Please suggest and help me on this issue.

flume.conf
===========

agent1.sources = tail
agent1.channels = MemoryChannel-2
agent1.sinks = HDFS
agent1.sources.tail.type = exec
agent1.sources.tail.command = tail -F 
/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/flume_test.txt
agent1.sources.tail.channels = MemoryChannel-2
agent1.sources.tail.interceptors = hostint
agent1.sources.tail.interceptors.hostint.type = 
org.apache.flume.interceptor.HostInterceptor$Builder
agent1.sources.tail.interceptors.hostint.preserveExisting = true
agent1.sources.tail.interceptors.hostint.useIP = false
agent1.sinks.HDFS.channel = MemoryChannel-2
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.path = hdfs://10.10.12.100:54310/user/%{host}
agent1.sinks.HDFS.hdfs.fileType = DataStream
agent1.sinks.HDFS.hdfs.writeFormat = text
agent1.channels.MemoryChannel-2.type = memory
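
As an aside, the %{host} escape in hdfs.path is filled in from the "host" 
header that the host interceptor stamps on each event. A minimal sketch of 
that substitution (a simplified illustration only, not Flume's actual 
BucketPath code; resolve_path is a made-up helper):

```python
import re
import socket

def resolve_path(pattern, headers):
    # Replace each %{key} escape with the matching event-header value,
    # mimicking (in simplified form) Flume's header-based path substitution.
    return re.sub(r"%\{(\w+)\}", lambda m: headers[m.group(1)], pattern)

# The host interceptor adds a "host" header (the hostname, since useIP = false).
headers = {"host": socket.gethostname()}
print(resolve_path("hdfs://10.10.12.100:54310/user/%{host}", headers))
```

So with this config, each agent host should write under its own 
/user/<hostname> directory rather than /user itself.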



script running output:

/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT# bin/flume-ng agent 
-n agent1 -c /conf -f conf/flume.conf
Info: Including Hadoop libraries found via 
(/usr/local/hadoop_dir/hadoop/bin/hadoop) for HDFS access
Info: Excluding 
/usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-api-1.4.3.jar from classpath
Info: Excluding 
/usr/local/hadoop_dir/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from 
classpath
+ exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp 
'/conf:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/conf:/usr/local/hadoop_dir/hadoop/libexec/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/local/hadoop_dir/hadoop/libexec/..:/usr/local/hadoop_dir/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop_dir/hadoop/libexec/.../lib/commons-cli-1.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/.../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop_dir/hadoop/libexec/.../lib/commons-math-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0..3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-fairscheduler-1.0..3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop_dir/hadoop/libexe
c/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop_dir/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'
 
-Djava.library.path=/usr/local/flume_dir/flume/flume-1.2.0-incubating-SNAPSHOT/lib:/usr/local/hadoop_dir/hadoop/libexec/../lib/native/Linux-i386-32
 org.apache.flume.node.Application -n agent1 -f conf/flume.conf

12/08/22 15:01:08 INFO lifecycle.LifecycleSupervisor: Starting lifecycle 
supervisor 1
12/08/22 15:01:08 INFO node.FlumeNode: Flume node starting - agent1
12/08/22 15:01:08 INFO nodemanager.DefaultLogicalNodeManager: Node manager 
starting
12/08/22 15:01:08 INFO properties.PropertiesFileConfigurationProvider: 
Configuration provider starting
12/08/22 15:01:08 INFO lifecycle.LifecycleSupervisor: Starting lifecycle 
supervisor 10
12/08/22 15:01:08 INFO properties.PropertiesFileConfigurationProvider: 
Reloading configuration file:conf/flume.conf
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Processing:HDFS
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Processing:HDFS
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Processing:HDFS
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Processing:HDFS
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Processing:HDFS
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Added sinks: HDFS Agent: agent1
12/08/22 15:01:08 INFO conf.FlumeConfiguration: Post-validation flume 
configuration contains configuration  for agents: [agent1]
12/08/22 15:01:08 INFO properties.PropertiesFileConfigurationProvider: Creating 
channels
12/08/22 15:01:08 INFO properties.PropertiesFileConfigurationProvider: created 
channel MemoryChannel-2
12/08/22 15:01:08 INFO sink.DefaultSinkFactory: Creating instance of sink HDFS 
typehdfs

Thanks,
Prabhu.



This E-mail may contain confidential information and/or copyright material. 
This email is intended for the use of the addressee only. If you receive this 
email by mistake, please either delete it without reproducing, distributing or 
retaining copies thereof or notify the sender immediately.



