Hi guys,

I have a custom Demux processor that I need to run to process my input, but I'm getting a ClassNotFoundException when the job runs in Hadoop. This is with the released 0.4.0 build.

I've done the following:

  1. I put my Demux class in the correct package
     (org.apache.hadoop.chukwa.extraction.demux.processor.mapper)
  2. I've added the JAR containing the Demux implementation to HDFS at
     /chuka/demux
  3. I've added an alias for it in chukwa-demux-conf.xml (sketched just below this list)
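
For reference, the alias entry in chukwa-demux-conf.xml is just a standard Hadoop configuration property, with the data type as the name and the fully-qualified processor class as the value. Mine looks roughly like this (MyDataType and MyDemux are placeholders, not my real names):

    <!-- chukwa-demux-conf.xml: maps a data type to its demux processor class -->
    <property>
      <name>MyDataType</name>
      <value>org.apache.hadoop.chukwa.extraction.demux.processor.mapper.MyDemux</value>
      <description>Processor alias for the MyDataType data type</description>
    </property>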


The map/reduce job is picking up on the fact that I have a custom Demux and is trying to load it, but I get a ClassNotFoundException. The HDFS-based URL to the JAR shows up in the job configuration in Hadoop, which is further evidence that Chukwa and Hadoop know where the JAR lives and that it's part of the Chukwa-initiated job.

My Demux is very simple. I've stripped it down to a System.out.println, with no dependencies on any classes/JARs other than Chukwa, Hadoop, and the core JDK. I've double-checked that my JAR is being built correctly. I'm completely flummoxed as to what I'm doing wrong.
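
To be concrete, the stripped-down class is essentially the sketch below (MyDemux is a placeholder name; I'm assuming the 0.4 AbstractProcessor API here, where the hook to override is parse(String, OutputCollector<ChukwaRecordKey, ChukwaRecord>, Reporter)):

    package org.apache.hadoop.chukwa.extraction.demux.processor.mapper;

    import org.apache.hadoop.chukwa.extraction.engine.ChukwaRecord;
    import org.apache.hadoop.chukwa.extraction.engine.ChukwaRecordKey;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Stripped-down processor: just logs each record it receives, emits nothing.
    public class MyDemux extends AbstractProcessor {

      @Override
      protected void parse(String recordEntry,
          OutputCollector<ChukwaRecordKey, ChukwaRecord> output,
          Reporter reporter) throws Throwable {
        System.out.println("MyDemux got a record: " + recordEntry);
      }
    }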

Any ideas what I'm missing? What other information can I provide?

Thanks!
Kirk
