Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.
The following page has been changed by allenday:
http://wiki.apache.org/hadoop/Hbase/MapReduce

------------------------------------------------------------------------------
  !MapReduce jobs deployed to a mapreduce cluster do not by default have access to the hbase configuration under ''$HBASE_CONF_DIR'' nor to hbase classes.
- You could add ''hbase-site.xml'' to $HADOOP_HOME/conf and add hbase.jar to the $HADOOP_HOME/lib and copy these changes across your cluster but the cleanest means of adding hbase configuration and classes to the cluster CLASSPATH is by uncommenting ''HADOOP_CLASSPATH'' in ''$HADOOP_HOME/conf/hadoop-env.sh'' and adding the path to the hbase jar and ''$HBASE_CONF_DIR'' directory. Then copy the amended configuration around the cluster. You'll probably need to restart the mapreduce cluster if you want it to notice the new configuration (You may not have to).
+ You could add ''hbase-site.xml'' to $HADOOP_HOME/conf, add hbase.jar to $HADOOP_HOME/lib, and copy these changes across your cluster, but the cleanest means of adding the hbase configuration and classes to the cluster ''CLASSPATH'' is by uncommenting ''HADOOP_CLASSPATH'' in ''$HADOOP_HOME/conf/hadoop-env.sh'' and adding the path to the hbase jar and the ''$HBASE_CONF_DIR'' directory. Then copy the amended configuration around the cluster. You'll probably need to restart the mapreduce cluster for it to notice the new configuration (you may not have to).
- For example, here is how you would amend ''hadoop-env.sh'' adding the hbase jar, conf, and the !PerformanceEvaluation class from hbase test classes to the hadoop ''CLASSPATH'':
+ Below is an example of how you would amend ''hadoop-env.sh'' to add the hbase jar, conf, and the !PerformanceEvaluation class from the hbase test classes to the hadoop ''CLASSPATH''. This example assumes you are using the hbase-0.2.0 release; commented-out export commands for other releases/builds are included:
  {{{# Extra Java CLASSPATH elements. Optional.
  # export HADOOP_CLASSPATH=
+ # for hbase-0.2.0 release
+ export HADOOP_CLASSPATH=$HBASE_HOME/build/hbase-0.2.0.jar:$HBASE_HOME/conf
+ # for 0.16.0 release
+ #export HADOOP_CLASSPATH=$HBASE_HOME/hadoop-0.16.0-hbase.jar:$HBASE_HOME/conf
+ # for 0.16.0 developer build
- export HADOOP_CLASSPATH=$HBASE_HOME/build/test:$HBASE_HOME/build/hadoop-0.16.0-hbase.jar:$HBASE_HOME/conf}}}
+ #export HADOOP_CLASSPATH=$HBASE_HOME/build/test:$HBASE_HOME/build/hadoop-0.16.0-hbase.jar:$HBASE_HOME/conf}}}
- Expand $HBASE_HOME appropriately in the in accordance with your local environment
+ Expand $HBASE_HOME appropriately in accordance with your local environment. To use the developer-build version of HADOOP_CLASSPATH, you first need to execute "ant" or "ant build" in $HBASE_HOME; this will create the needed .jar files in the $HBASE_HOME/build directory. Then, this is how you would run the !PerformanceEvaluation MR job to put up 4 clients:
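For instance, a run of the ''sequentialWrite'' test with 4 clients might look like the sketch below. This assumes the !PerformanceEvaluation main class lives at ''org.apache.hadoop.hbase.PerformanceEvaluation'' and that ''HADOOP_CLASSPATH'' has been amended as above; check the class's usage message for the exact test names and arguments in your release:
{{{
# a sketch, assuming HADOOP_CLASSPATH includes the hbase jar, conf, and build/test classes as configured above
$ ${HADOOP_HOME}/bin/hadoop org.apache.hadoop.hbase.PerformanceEvaluation sequentialWrite 4
}}}
Other tests (for example ''randomWrite'' or ''scan'') can be substituted for ''sequentialWrite''; running the class with no arguments should print its usage.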
