You might have to export HADOOP_HOME and HADOOP_CONF_DIR as well.

E.g.:
  export HADOOP_HOME=/opt/hadoop/0.22-SNAPSHOT/
  export HADOOP_CONF_DIR=/opt/hadoop/0.22-SNAPSHOT/conf
if that is where your Hadoop home and conf directory are.
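Putting the two exports together with a quick sanity check, something like the following should confirm whether the install tree is complete (the paths are assumptions based on the layout described below; adjust them to your install). The grep is the key step: a NoClassDefFoundError for DataNode usually means the HDFS jar never made it into HADOOP_HOME.

```shell
# Sketch of the environment setup, assuming the /opt/hadoop layout
# from the build steps quoted below:
export HADOOP_HOME=/opt/hadoop/0.22-SNAPSHOT
export HADOOP_CONF_DIR=$HADOOP_HOME/conf

# Sanity check: confirm the DataNode class is actually inside the HDFS jar.
# No output here means the hdfs build artifacts were not copied over.
jar tf "$HADOOP_HOME"/hadoop-hdfs-*.jar | grep server/datanode/DataNode.class
```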


On 4/8/10 4:03 AM, "Alan Miller" <alan.mil...@synopsys.com> wrote:

> Ok, great. I'd like to try the trunk. I got and compiled "common" and "hdfs"
> as follows, but I can't get it to start up ("NoClassDefFoundError", see below).
> Am I missing something?
> 
>   svn co http://svn.apache.org/repos/asf/hadoop/hdfs/trunk
>   svn co http://svn.apache.org/repos/asf/hadoop/common/trunk
>   cd …
>   export ANT_HOME=/usr/share/ant
>   export JAVA_HOME=/opt/java/jdk1.6.0_10-i586
>   export PATH=$JAVA_HOME/bin:$PATH
>   ant -l build-compile.log \
>       -Dforrest.home=/usr/local/src/apache-forrest-0.8 \
>       -Djava5.home=/opt/java/jdk1.5.0_22-i586 \
>       package
> 
> Then I copied everything
>   from: /usr/local/src/hadoop/common/trunk/build/hadoop-hdfs-0.22.0-SNAPSHOT/*
>   and:   /usr/local/src/hadoop/hdfs/trunk/build/hadoop-core-0.22.0-SNAPSHOT/*
>   to:  /opt/hadoop/0.22-SNAPSHOT/
> 
> When I run start-dfs.sh I get "java.lang.NoClassDefFoundError" errors:
> 
>   [r...@amiller-e6400 ~]# start-dfs.sh
>   starting namenode, logging to
> /opt/hadoop/0.22.0-SNAPSHOT/logs/hadoop-root-namenode-localhost.out
>   localhost: starting datanode, logging to
> /opt/hadoop/0.22.0-SNAPSHOT/logs/hadoop-root-datanode-localhost.out
>   localhost: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/hdfs/server/datanode/DataNode
>   localhost: Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.hdfs.server.datanode.DataNode
> 
> Regards,
> Alan
