Hi,
I am getting the following error while running a Flume agent (I want to write data from a source file to an HDFS sink):
hduser@vm-ps7274:~/F1/apache-flume-1.2.0$ bin/flume-ng agent -c conf -f
conf/flumeHdfs.conf -Dflume.root.logger=DEBUGG,console -n agent1
Info: Sourcing environment configuration script
/home/hduser/F1/apache-flume-1.2.0/conf/flume-env.sh
+ exec /usr/local/java/jdk1.6.0_26/jre/bin/java -Xmx20m
-Dflume.root.logger=DEBUGG,console -cp
'/home/hduser/F1/apache-flume-1.2.0/conf:/home/hduser/F1/apache-flume-1.2.0/bin/lib/*'
-Djava.library.path= org.apache.flume.node.Application -f conf/flumeHdfs.conf
-n agent1
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/flume/node/Application
Caused by: java.lang.ClassNotFoundException: org.apache.flume.node.Application
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.flume.node.Application. Program will
exit.
1. Flume version: apache-flume-1.2.0
2. Java version: 1.6.0_26
3. Hadoop version: Apache Hadoop 1.0.3
Are there any missing files that need to be downloaded explicitly?
Please find the flume-env.sh and flumeHdfs.conf files below:
1. flume-env.sh:
# Environment variables can be set here.
# Note that the Flume conf directory is always included in the classpath.
#FLUME_CLASSPATH=""
export FLUME_HOME=/home/hduser/F1/apache-flume-1.2.0/bin
export JAVA_HOME=/usr/local/java/jdk1.6.0_26/jre/bin/java
export FLUME_CONF_DIR=/home/hduser/F1/apache-flume-1.2.0/conf/
export PATH=$JAVA_HOME:$FLUME_HOME/bin:$PATH
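One thing I noticed while re-reading the exec line in the transcript: the jar glob points under bin/lib. If the flume-ng launcher builds its classpath as $FLUME_HOME/lib/* (this is my assumption about how the script works), then my FLUME_HOME value, which ends in /bin, would produce a directory that does not exist. A small sketch of how I think the glob is formed, using the value from my flume-env.sh:

```shell
# How I believe flume-ng forms the jar glob (assumption: it appends /lib/*
# to FLUME_HOME). The value below is copied from my flume-env.sh.
FLUME_HOME=/home/hduser/F1/apache-flume-1.2.0/bin
echo "jar glob: $FLUME_HOME/lib/*"
# With this value the glob points at .../bin/lib/*, which matches the
# classpath shown in the exec line of the transcript above.
```

Could this be why org.apache.flume.node.Application is not found on the classpath?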
2. flumeHdfs.conf:
# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory
# Define an EXEC source called src on agent1 and connect it to channel ch1.
agent1.sources.src.channels = ch1
agent1.sources.src.type = exec
agent1.sources.src.command = tail -F /home/hduser/F1/h.txt
# Define an HDFS sink and connect it to the other end of the same channel.
agent1.sinks.HDFS.channel = ch1
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.path = hdfs://localhost:8020/user/hduser
agent1.sinks.HDFS.hdfs.fileType = DataStream
agent1.sinks.HDFS.hdfs.writeFormat = Text
agent1.sinks.HDFS.hdfs.filePrefix = FlumeTest
# Finally, now that we've defined all of our components, tell
# agent1 which ones we want to activate.
agent1.channels = ch1
agent1.sources = src
agent1.sinks = HDFS
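In case it is relevant once the agent starts: the sink's hdfs.path assumes the namenode listens on port 8020. That port is my assumption and must match fs.default.name in Hadoop's core-site.xml. A sketch of how the sink path is composed:

```shell
# Sink path composition (port 8020 is my assumption; it must match
# fs.default.name in Hadoop's core-site.xml for the sink to connect).
NAMENODE_URI=hdfs://localhost:8020
SINK_PATH=$NAMENODE_URI/user/hduser
echo "$SINK_PATH"
```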
Thanks and Regards,
Swati