Author: cutting
Date: Wed Feb 15 16:56:38 2006
New Revision: 378131

URL: http://svn.apache.org/viewcvs?rev=378131&view=rev
Log:
Updated javadoc for recent config changes.
Modified:
    lucene/hadoop/trunk/src/java/overview.html

Modified: lucene/hadoop/trunk/src/java/overview.html
URL: http://svn.apache.org/viewcvs/lucene/hadoop/trunk/src/java/overview.html?rev=378131&r1=378130&r2=378131&view=diff
==============================================================================
--- lucene/hadoop/trunk/src/java/overview.html (original)
+++ lucene/hadoop/trunk/src/java/overview.html Wed Feb 15 16:56:38 2006
@@ -48,6 +48,9 @@
 href="http://lucene.apache.org/hadoop/version_control.html">subversion</a>
 and build it with <a href="http://ant.apache.org/">Ant</a>.</p>
 
+<p>Edit the file <tt>conf/hadoop-env.sh</tt> to define at least
+<tt>JAVA_HOME</tt>.</p>
+
 <p>Try the following command:</p>
 <tt>bin/hadoop</tt>
 <p>This will display the documentation for the Hadoop command script.</p>
@@ -91,7 +94,7 @@
 </li>
 
 <li>A <em>slaves</em> file that lists the names of all the hosts in
-the cluster. The default slaves file is <tt>~/.slaves</tt>.
+the cluster. The default slaves file is <tt>conf/slaves</tt>.
 
 </ol>
 
@@ -129,10 +132,6 @@
 cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
 </tt></p>
 
-<p>Finally, you can create a <tt>.slaves</tt> file with the command:</p>
-
-<p><tt>echo localhost > ~/.slaves</tt></p>
-
 <h3>Bootstrapping</h3>
 
 <p>The Hadoop daemons are started with the following command:</p>
@@ -199,7 +198,7 @@
 number of slave processors for <tt>mapred.reduce.tasks</tt>.</li>
 
 <li>List all slave hostnames or IP addresses in your
-<tt>~/.slaves</tt> file, one per line.</li>
+<tt>conf/slaves</tt> file, one per line.</li>
 
 </ol>
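The configuration steps this change documents (defining <tt>JAVA_HOME</tt> in <tt>conf/hadoop-env.sh</tt> and listing hosts in <tt>conf/slaves</tt> instead of the old <tt>~/.slaves</tt>) can be sketched as a shell session. This is a hedged illustration only: it works in a scratch directory rather than a real Hadoop checkout, and the `JAVA_HOME` path is a placeholder you would replace with your own JDK location.

```shell
# Sketch of the post-change configuration layout; HADOOP_DIR stands in
# for a real checkout of lucene/hadoop/trunk built with Ant.
HADOOP_DIR=$(mktemp -d)
mkdir -p "$HADOOP_DIR/conf"

# Edit conf/hadoop-env.sh to define at least JAVA_HOME
# (path below is a placeholder assumption, not a Hadoop default).
echo 'export JAVA_HOME=/usr/lib/jvm/default-java' \
  >> "$HADOOP_DIR/conf/hadoop-env.sh"

# List slave hostnames or IP addresses, one per line, in conf/slaves --
# the new default location replacing ~/.slaves.
printf 'localhost\n' > "$HADOOP_DIR/conf/slaves"

cat "$HADOOP_DIR/conf/slaves"
```

For a single-machine setup, `localhost` alone in <tt>conf/slaves</tt> matches what the removed `echo localhost > ~/.slaves` step used to produce in the old location.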