Maybe there were some old CDH3 .jar files left around from the previous run. Setting :default_distro to "cdh4" should have solved the problem. I think when you change distros (e.g. change :default_distro from "cdh3" to "cdh4") you need to run the following command:
cap set_distro

- Doug

On Mon, Dec 23, 2013 at 1:08 AM, Xy Zheng <[email protected]> wrote:

> [cloudil@dlxa101 bin]$ ./set-hadoop-distro.sh cdh4
>
> When I ran the above command, the error went away, but I don't know why.
>
> When I ran `cap install_package`, the following config items were set in the
> Capfile:
>
>   set :default_dfs, "hadoop"
>   set :default_distro, "cdh4"
>
> When and by whom is ./set-hadoop-distro.sh supposed to be called? Is it run
> automatically? Why did I have to run it manually?
>
> The "set" part of my Capfile is as follows:
>
> [cloudil@dlxa101 zxy_hytcluster_test]$ head -n20 Capfile
> set :source_machine, "dlxa101"
> set :install_dir, "/home/cloudil/zxy_hytcluster_test/hypertable_cluster"
> set :hypertable_version, "0.9.7.8"
> set :default_pkg, "/home/cloudil/zxy_hytcluster_test/hypertable-0.9.7.8-linux-x86_64-debug.tar.bz2"
> set :default_dfs, "hadoop"
> set :default_distro, "cdh4"
> set :default_config, "/home/cloudil/zxy_hytcluster_test/hypertable.cfg"
>
> role :source, "dlxa101"
> role :master, "dlxa101"
> role :hyperspace, "dlxa102"
> role :slave, "dlxa103", "dlxa105", "dlxa106", "dlxa107"
> role :localhost, "dlxa101"
>
> ######################### END OF USER CONFIGURATION ############################
> set :prompt_stop, 0
> set :prompt_clean, 1
>
> Any advice would be appreciated.
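One way to check the stale-jar theory above is to scan the broker's Java lib directory for jars left over from the old distro. The sketch below is self-contained (it creates a throwaway directory with invented jar names); on a real install, LIB would point at $install_dir/current/lib/java:

```shell
# Simulate a lib directory that still holds a CDH3 jar next to CDH4 ones.
# (Jar names here are invented; on a cluster, point LIB at
# $install_dir/current/lib/java instead.)
LIB=$(mktemp -d)
touch "$LIB/hadoop-core-0.20.2-cdh3u5.jar" \
      "$LIB/hadoop-common-2.0.0-cdh4.2.0.jar"

# Any jar whose name mentions the old distro is a leftover that can shadow
# the CDH4 classes; remove it (or rerun set-hadoop-distro.sh) before cap start.
find "$LIB" -name '*cdh3*.jar'
```

If this prints anything after switching :default_distro to "cdh4", old jars are still on the broker's classpath.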
>
> On Monday, December 23, 2013 at 3:22:51 PM UTC+8, Xy Zheng wrote:
>
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/hadoop/hdfs/server/namenode/NotReplicatedYetException
>>         at org.hypertable.DfsBroker.hadoop.main.main(main.java:171)
>> Caused by: java.lang.ClassNotFoundException:
>> org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>
>> What is the cause of this error? I have put the *.jar files under the
>> correct path. Any advice would be appreciated.
>>
>> On Friday, December 13, 2013 at 2:43:30 PM UTC+8, Xy Zheng wrote:
>>>
>>> Fixed error: /current/lib/*Java*
>>>
>>> On Friday, December 13, 2013 at 11:13:54 AM UTC+8, Xy Zheng wrote:
>>>>
>>>> I think this error arose because there was no jar file
>>>> (hypertable-0.9.7.X(-examples).jar) in /current/lib or
>>>> /current/lib/java/cdhX.
>>>>
>>>> You must compile it and copy it to the paths mentioned above.
>>>>
>>>> For how to compile it, see the method at
>>>> https://groups.google.com/forum/#!topic/hypertable-dev/EqQwB7ls9JU
>>>>
>>>> wow!!!
>>>>
>>>> On Thursday, December 12, 2013 at 2:08:51 PM UTC+8, Xy Zheng wrote:
>>>>>
>>>>> Hi Doug,
>>>>>
>>>>> `cap start` fails with the following error:
>>>>>
>>>>> servers: ["dlxa101"]
>>>>> [dlxa101] executing command
>>>>> ** [out :: dlxa101] DFS broker: available file descriptors: 65536
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] Waiting for DFS Broker (hadoop) (localhost:38030) to come up...
>>>>> ** [out :: dlxa101] ERROR: DFS Broker (hadoop) did not come up
>>>>>
>>>>> [cloudil@dlxa101 bin]$ ./set-hadoop-distro.sh cdh4
>>>>> Hypertable successfully configured for Hadoop cdh4
>>>>>
>>>>> DfsBroker.hadoop.log shows the following:
>>>>>
>>>>> [cloudil@dlxa101 log]$ tail -f -n20 DfsBroker.hadoop.log
>>>>> No Hadoop distro is configured. Run the following script to configure:
>>>>>
>>>>> /home/cloudil/zxy_hytcluster_test/hypertable_cluster/current/bin/set-hadoop-distro.sh
>>>>> Hypertable successfully configured for Hadoop cdh4
>>>>> Exception in thread "main" java.lang.NoClassDefFoundError: org/hypertable/DfsBroker/hadoop/main
>>>>> Caused by: java.lang.ClassNotFoundException: org.hypertable.DfsBroker.hadoop.main
>>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Could not find the main class: org.hypertable.DfsBroker.hadoop.main. Program will exit.
>>>>>
>>>>> Can you help me? Thank you.
>>>>>
>
> --
> You received this message because you are subscribed to the Google Groups "Hypertable Development" group.
> To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
> To post to this group, send email to [email protected].
> Visit this group at http://groups.google.com/group/hypertable-dev.
> For more options, visit https://groups.google.com/groups/opt_out.

--
Doug Judd
CEO, Hypertable Inc.
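A general way to debug either ClassNotFoundException in this thread is to ask which jar, if any, actually contains the missing class. Zip archives (and therefore jars) store their entry names as plain bytes, so a grep across the jar files usually answers the question. The sketch below is self-contained and uses a fabricated stand-in file so it runs anywhere; on a real install you would grep the jars under current/lib and current/lib/java instead:

```shell
# Stand-in for a jar directory: real jars keep entry names as plain bytes,
# so the same grep works on them. The files created here are fake
# placeholders containing only a class-path string, not real jars.
DIR=$(mktemp -d)
printf 'org/apache/hadoop/hdfs/server/namenode/NotReplicatedYetException.class' \
    > "$DIR/hadoop-hdfs.jar"
printf 'org/example/Unrelated.class' > "$DIR/other.jar"

# Print each jar that contains the class the broker failed to load.
# No output means no jar on that path provides the class.
grep -l 'NotReplicatedYetException' "$DIR"/*.jar
```

If the grep prints nothing for the directory the broker actually uses, the class is genuinely missing from the classpath, which matches the "compile it and copy it" diagnosis in the thread.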
