Not sure if it will work, but you could try starting the daemon script
locally on each box in your slaves file.
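
For example, something along these lines (an untested sketch; it assumes the
stock bin/hadoop-daemon.sh wrapper that start-all.sh itself calls, plus your
config path):

  # on serverA (the master):
  cd /home/hadoop
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start namenode
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start jobtracker

  # then on each of serverA, serverB and serverC:
  cd /home/hadoop
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start datanode
  bin/hadoop-daemon.sh --config /data-0/hadoop_conf start tasktracker

No ssh is involved this way, since each daemon is started on its own box.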

On 11/4/06, howard chen <[EMAIL PROTECTED]> wrote:

On 11/4/06, Lee <[EMAIL PROTECTED]> wrote:
> You need passwordless ssh set up for the username you start the script with.
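>
> A rough sketch of the usual setup (note: since /home/hadoop is local on
> each box, the public key has to be appended to ~/.ssh/authorized_keys on
> every server, including serverA itself):
>
>   # on serverA, as the user that runs start-all.sh:
>   ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
>   cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
>   # copy id_rsa.pub over to serverB and serverC and append it to
>   # ~/.ssh/authorized_keys there too, then check permissions:
>   chmod 700 ~/.ssh; chmod 600 ~/.ssh/authorized_keys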
>
> Lee
>
> On 11/4/06, howard chen <[EMAIL PROTECTED]> wrote:
> >
> > Hi
> >
> > Currently I have 3 servers, A, B, C
> >
> > 1.
> >
> > I unpacked Hadoop separately on the three machines, into the same local
> > folder:
> >
> > /home/hadoop/
> >
> > 2.
> >
> > I followed the documentation, set the JAVA_HOME path, created a config
> > folder on an NFS-mounted drive, and moved hadoop-env.sh,
> > hadoop-site.xml & slaves to that folder:
> >
> > /data-0/hadoop_conf/
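> >
> > i.e. the relevant line in hadoop-env.sh (the JDK path here is just an
> > example, adjust to wherever your JDK lives):
> >
> >   export JAVA_HOME=/usr/java/jdk1.5.0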
> >
> > 3.
> >
> > In hadoop_conf/slaves, I removed localhost and added the 3 servers' IPs
> >
> > i.e.
> > serverA
> > serverB
> > serverC
> >
> >
> > 4.
> >
> > When I type (on serverA): ./start-all.sh --config /data-0/hadoop_conf/
> >
> > It prompts me to enter passwords for servers A, B and C, but after I
> > type one password I only get the welcome message from serverA; there is
> > no way to enter the passwords for B & C, and the console just stops
> > there... what can I do?
> >
> > Thanks.
> >
>
>

If my system doesn't allow passwordless ssh, is there any workaround?

thanks
