Thanks for the tip. This detail seems to have been omitted from http://www.cs.berkeley.edu/~asrabkin/chukwa/admin.html and http://wiki.apache.org/hadoop/Chukwa_Console_Integration_Guide. So I copied chukwa-0.4.0/chukwa-hadoop-0.4.0-client.jar (the closest name match I could find for chukwa-client-<xx>.jar) to the hadoop/lib directory on every node in my cluster, and copied log4j.properties and hadoop-metrics.properties to every hadoop/conf directory. But now when I reformat the namenode I get a different error:
n...@hadoop1:~/hadoop-0.20.2$ bin/hadoop namenode -format
10/05/27 12:45:17 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = hadoop1/10.64.147.2
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 0.20.2
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
Re-format filesystem in /tmp/hadoop-ngc/dfs/name ? (Y or N) Y
10/05/27 12:45:20 INFO namenode.FSNamesystem: fsOwner=ngc,ngc,adm,dialout,cdrom,plugdev,admin,lpadmin,sambashare
10/05/27 12:45:20 INFO namenode.FSNamesystem: supergroup=supergroup
10/05/27 12:45:20 INFO namenode.FSNamesystem: isPermissionEnabled=true
10/05/27 12:45:20 ERROR namenode.NameNode: java.lang.NoClassDefFoundError: org/json/JSONException
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:169)
        at org.apache.hadoop.metrics.ContextFactory.getContext(ContextFactory.java:132)
        at org.apache.hadoop.metrics.MetricsUtil.getContext(MetricsUtil.java:56)
        at org.apache.hadoop.metrics.MetricsUtil.getContext(MetricsUtil.java:45)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.initialize(FSDirectory.java:72)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.<init>(FSDirectory.java:68)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:379)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:854)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:948)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
Caused by: java.lang.ClassNotFoundException: org.json.JSONException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        ... 11 more
10/05/27 12:45:20 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at hadoop1/10.64.147.2
************************************************************/

-----Original Message-----
From: Jerome Boulon [mailto:jbou...@netflix.com]
Sent: Thursday, May 27, 2010 11:59 AM
To: chukwa-user@hadoop.apache.org
Subject: Re: Chukwa Installation Problems

Hi,
Running ../bin/hadoop namenode -format starts a brand-new Java program, which is why the Chukwa classes get loaded. The java.lang.ClassNotFoundException means you haven't put chukwa-client-<xx>.jar into the hadoop/lib directory. chukwa-client-<xx>.jar should be available on all Hadoop nodes.
Regards,
/Jerome.

On 5/27/10 5:14 AM, "Ratner, Alan S (IS)" <alan.rat...@ngc.com> wrote:

> The first steps in the Chukwa Administration Guide are to replace
> Hadoop's log4j.properties and hadoop-metrics.properties with the ones
> found in Chukwa. Presumably this change does not take effect until I
> restart Hadoop. But with these files in hadoop/conf, Hadoop no longer
> works - I cannot even format Hadoop's namenode without errors. Is
> there some sequencing information I am overlooking? For example, do I
> start Chukwa before I start Hadoop?
>
> n...@hadoop1:~/hadoop-0.20.2/conf$ ../bin/hadoop namenode -format
> log4j:ERROR Could not instantiate class
> [org.apache.hadoop.chukwa.inputtools.log4j.ChukwaDailyRollingFileAppender].
> java.lang.ClassNotFoundException: org.apache.hadoop.chukwa.inputtools.log4j.ChukwaDailyRollingFileAppender
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:169)
>         at org.apache.log4j.helpers.Loader.loadClass(Loader.java:179)
>         at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:320)
>         at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
>         at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
>         at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>         at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:568)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:442)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>         at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
>         at org.apache.log4j.Logger.getLogger(Logger.java:105)
>         at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>         at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:139)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<clinit>(NameNode.java:101)
> log4j:ERROR Could not instantiate appender named "MR_CLIENTTRACE".
> log4j:ERROR Could not instantiate class
> [org.apache.hadoop.chukwa.inputtools.log4j.ChukwaDailyRollingFileAppender].
> java.lang.ClassNotFoundException: org.apache.hadoop.chukwa.inputtools.log4j.ChukwaDailyRollingFileAppender
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:169)
>         at org.apache.log4j.helpers.Loader.loadClass(Loader.java:179)
>         at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:320)
>         at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
>         at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
>         at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>         at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:568)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:442)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>         at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
>         at org.apache.log4j.Logger.getLogger(Logger.java:105)
>         at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>         at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:139)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<clinit>(NameNode.java:101)
> log4j:ERROR Could not instantiate appender named "DRFAAUDIT".
> log4j:ERROR Could not instantiate class
> [org.apache.hadoop.chukwa.inputtools.log4j.ChukwaDailyRollingFileAppender].
> java.lang.ClassNotFoundException: org.apache.hadoop.chukwa.inputtools.log4j.ChukwaDailyRollingFileAppender
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:169)
>         at org.apache.log4j.helpers.Loader.loadClass(Loader.java:179)
>         at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:320)
>         at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
>         at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
>         at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
>         at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:568)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:442)
>         at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
>         at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
>         at org.apache.log4j.Logger.getLogger(Logger.java:105)
>         at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>         at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>         at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:139)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<clinit>(NameNode.java:101)
> log4j:ERROR Could not instantiate appender named "HDFS_CLIENTTRACE".
> 10/05/27 08:00:08 INFO namenode.NameNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG: host = hadoop1/10.64.147.2
> STARTUP_MSG: args = [-format]
> STARTUP_MSG: version = 0.20.2
> STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
> ************************************************************/
> Re-format filesystem in /tmp/hadoop-ngc/dfs/name ? (Y or N) Y
> 10/05/27 08:01:05 INFO namenode.FSNamesystem: fsOwner=ngc,ngc,adm,dialout,cdrom,plugdev,admin,lpadmin,sambashare
> 10/05/27 08:01:05 INFO namenode.FSNamesystem: supergroup=supergroup
> 10/05/27 08:01:05 INFO namenode.FSNamesystem: isPermissionEnabled=true
> 10/05/27 08:01:05 ERROR metrics.MetricsUtil: Unable to create metrics context dfs
> java.lang.ClassNotFoundException: org.apache.hadoop.chukwa.inputtools.log4j.Log4JMetricsContext
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:169)
>         at org.apache.hadoop.metrics.ContextFactory.getContext(ContextFactory.java:132)
>         at org.apache.hadoop.metrics.MetricsUtil.getContext(MetricsUtil.java:56)
>         at org.apache.hadoop.metrics.MetricsUtil.getContext(MetricsUtil.java:45)
>         at org.apache.hadoop.hdfs.server.namenode.FSDirectory.initialize(FSDirectory.java:72)
>         at org.apache.hadoop.hdfs.server.namenode.FSDirectory.<init>(FSDirectory.java:68)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:379)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:854)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:948)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
> 10/05/27 08:01:05 INFO common.Storage: Image file of size 93 saved in 0 seconds.
> 10/05/27 08:01:05 INFO common.Storage: Storage directory /tmp/hadoop-ngc/dfs/name has been successfully formatted.
> 10/05/27 08:01:05 INFO namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at hadoop1/10.64.147.2
> ************************************************************/
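
For anyone hitting the same sequence of errors: every failure in this thread has the same shape - a file in hadoop/conf names a class (ChukwaDailyRollingFileAppender, Log4JMetricsContext, and transitively org/json/JSONException) that no jar in hadoop/lib provides, so the JVM throws ClassNotFoundException. A minimal sketch of how one might locate the jar that bundles a missing class, and review the copy plan before pushing anything to the cluster. The paths, the node names, and the find_jar_with_class helper are illustrative assumptions, not values from this thread, and the distribution step is a dry run (it prints the scp commands rather than executing them):

```shell
#!/bin/sh
# Assumed locations - adjust to your installation.
CHUKWA_HOME=${CHUKWA_HOME:-/opt/chukwa-0.4.0}
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop-0.20.2}
NODES="hadoop1 hadoop2 hadoop3"   # hypothetical cluster membership

# Report which jar under $CHUKWA_HOME bundles a given class.
# ClassNotFoundException means no jar on the JVM's classpath contains
# the corresponding .class entry, so this tells you what to copy.
find_jar_with_class() {
    class_file="$(echo "$1" | tr '.' '/').class"
    for jar in "$CHUKWA_HOME"/*.jar "$CHUKWA_HOME"/lib/*.jar; do
        [ -f "$jar" ] || continue
        if unzip -l "$jar" 2>/dev/null | grep -q "$class_file"; then
            echo "$class_file -> $jar"
            return 0
        fi
    done
    echo "$class_file -> NOT FOUND under $CHUKWA_HOME"
    return 1
}

# Dry run of the distribution step: print the copies for review
# instead of executing them against the cluster.
for node in $NODES; do
    echo scp "$CHUKWA_HOME"/chukwa-hadoop-*-client.jar "$node:$HADOOP_HOME/lib/"
    echo scp "$CHUKWA_HOME"/conf/log4j.properties "$node:$HADOOP_HOME/conf/"
done

# The classes from the traces above; each should resolve to some jar
# before the namenode is restarted.
find_jar_with_class org.apache.hadoop.chukwa.inputtools.log4j.Log4JMetricsContext || true
find_jar_with_class org.json.JSONException || true
```

If the second lookup reports NOT FOUND for the jars already copied, that would explain the NoClassDefFoundError for org/json/JSONException persisting after the client jar was distributed: the class lives in a separate jar that also needs to reach hadoop/lib on every node.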