I have been through this very recently.  My approach was to:

--manually set up the master (i.e. specify the conf files etc.)
--tar up java and hadoop such that unpacking them puts them in the desired
location
--create the ssh keys on the master (a rough sketch of these steps follows).
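
Roughly, that master-side prep looks like the following (a sketch only; the
install paths, JDK directory name and choice of an RSA key are my
assumptions, adjust to your layout):

  # on the master: pack java and hadoop so that unpacking drops them
  # into place (here I assume both live under /usr/local)
  cd /usr/local
  tar czf /tmp/java.tar.gz jdk1.6.0          # whatever your JDK dir is called
  tar czf /tmp/hadoop.tar.gz hadoop-0.15.3   # your hadoop directory

  # passphrase-less key so the master can ssh to every slave
  ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa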

now, create a shell script which does the following:

--open the necessary ports
--copy across the ssh keys from the master and install them in the correct
location
--copy across and untar java and hadoop
--assign the correct permissions to the distributed file system directory on
the current node
--create user accounts as necessary

copy this script across to each slave in turn and run it;  adding a new
slave node will take a minute or two.
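
For reference, here is a minimal sketch of such a slave script (run as root;
the hadoop user name, DFS directory and port number below are my own
assumptions, not anything Hadoop mandates):

  #!/bin/sh
  # slave-setup sketch: assumes the tarballs and the master's public key
  # have already been copied (e.g. scp'd) into /tmp on this slave

  HADOOP_USER=hadoop            # hypothetical account name
  DFS_DIR=/data/hadoop/dfs      # hypothetical DFS data directory

  # create the hadoop user account if it does not already exist
  id "$HADOOP_USER" >/dev/null 2>&1 || useradd -m "$HADOOP_USER"

  # install the master's public key so it can ssh in without a password
  mkdir -p /home/$HADOOP_USER/.ssh
  cat /tmp/id_rsa.pub >> /home/$HADOOP_USER/.ssh/authorized_keys
  chmod 700 /home/$HADOOP_USER/.ssh
  chmod 600 /home/$HADOOP_USER/.ssh/authorized_keys
  chown -R $HADOOP_USER /home/$HADOOP_USER/.ssh

  # unpack java and hadoop into place
  tar xzf /tmp/java.tar.gz   -C /usr/local
  tar xzf /tmp/hadoop.tar.gz -C /usr/local

  # give the hadoop user its distributed-filesystem directory
  mkdir -p "$DFS_DIR"
  chown -R $HADOOP_USER "$DFS_DIR"

  # open whatever ports your conf uses, e.g. the datanode port:
  # iptables -A INPUT -p tcp --dport 50010 -j ACCEPT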

(this assumes each node already has Linux installed on it and the filesystem
layout is identical)

Miles
On 15/01/2008, Bin YANG <[EMAIL PROTECTED]> wrote:
>
> Dear colleagues,
>
> Right now, I have to deploy ubuntu 7.10 + hadoop 0.15 on 16 PCs.
> One PC will be set as master, the others will be set as slaves.
> The PCs have similar hardware, or even the same hardware.
>
> Is there a quick and easy way to deploy hadoop on these PCs?
>
> Do you think that
>
> 1. ghost a whole successful ubuntu 7.10 + hadoop 0.15 hard disk
> 2. and then copy the image to other PCs
>
> is the best way?
>
> Thank you very much.
>
> Best wishes,
> Bin YANG
>
>
> --
> Bin YANG
> Department of Computer Science and Engineering
> Fudan University
> Shanghai, P. R. China
> EMail: [EMAIL PROTECTED]
>
