I'm not sure what your particular problem is, but the scripts/patch I referenced previously remove any dependency on DynDNS.

the recipe would be something like...

make an s3 bucket and update hadoop-ec2-env.sh with the bucket name and your AWS credentials
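
for reference, the minimum you'd set in hadoop-ec2-env.sh is roughly this (variable names follow the stock contrib/ec2 script and may differ slightly in the patched version; all values below are placeholders):

  AWS_ACCOUNT_ID=123456789012        # your aws account number
  AWS_ACCESS_KEY_ID=your-access-key
  AWS_SECRET_ACCESS_KEY=your-secret-key
  KEY_NAME=gsg-keypair               # ec2 keypair used for ssh logins
  S3_BUCKET=your-bucket-name         # bucket the ami will be stored in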

make an image:
> hadoop-ec2 create-image

make a 2-node (3 machine: one master plus two slaves) cluster:
> hadoop-ec2 launch-cluster test-group 2

upload your jar file:
> hadoop-ec2 push test-group /path/to/your.jar

login:
> hadoop-ec2 login test-group

run your jar:
> hadoop jar your.jar
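
if your job's main class isn't set in the jar manifest, name it explicitly, e.g. (class name and paths here are just placeholders):
> hadoop jar your.jar org.yourco.YourJob input output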

kill your cluster when done:
> hadoop-ec2 terminate-cluster test-group

test-group is the name of the ec2 hadoop cluster group. these scripts support many concurrent groups.
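
e.g. a second cluster can run alongside the first, each managed by its own group name:
> hadoop-ec2 launch-cluster other-group 4
> hadoop-ec2 terminate-cluster other-group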

that all said, these scripts are untested with cygwin on windows, but they work great on a mac.

On Apr 15, 2008, at 8:47 PM, Prerna Manaktala wrote:
Hey,
Can you solve my problem also? I am badly stuck.

I tried to set up hadoop with cygwin according to this paper:
http://developer.amazonwebservices.com/connect/entry.jspa?externalID=873
But I had problems working with DynDNS. I created a new host there,
prerna.dyndns.org, and gave its IP address in hadoop-ec2-env.sh as the
value of MASTER_HOST. But when I do bin/hadoop-ec2 start-hadoop, this
error comes:
ssh: connect to host prerna.dyndns.org port 22: Connection refused
ssh failed for [EMAIL PROTECTED]
A warning also comes that id_rsa_gsg-keypair is not accessible (No such
file or directory), though the file exists.

As of now the status is:
I am working with the EC2 environment.
I registered and am being billed for EC2 and S3.
Right now I have two cygwin windows open.
One is as an administrator (the server, on which sshd is running), in
which I have a separate folder for the hadoop files and am able to run
bin/hadoop.
The other is as a normal user (the client). In the client there is no
hadoop folder, and hence I can't run bin/hadoop.
From here I do not know how to proceed.
I basically want to implement
http://developer.amazonwebservices.com/connect/entry.jspa?externalID=873
and hence created a host using DynDNS.
If you can help me, it will be great.

On Tue, Apr 15, 2008 at 11:30 PM, Chris K Wensel <[EMAIL PROTECTED]> wrote:

it's pretty simple. just create an s3 bucket, note the bucket name in the
hadoop-ec2-env.sh file (along with the other necessary values), then call

hadoop-ec2 create-image

i'll try and update the wiki soon.

ckw

On Apr 15, 2008, at 7:28 PM, Stephen J. Barr wrote:

Thank you. I will check that out. I haven't built an AMI before. Hopefully
it isn't too complicated, as it is easy to use the pre-built AMIs.

-stephen

Chris K Wensel wrote:

Stephen

Check out the patch in HADOOP-2410 to the contrib/ec2 scripts
https://issues.apache.org/jira/browse/HADOOP-2410
(just grab the ec2.tgz attachment)

these scripts allow you to dynamically grow your cluster, plus some extra
goodies. you will need to use them to build your own ami; they are not
compatible with the hadoop-images amis.

using them should be self-explanatory.

ckw


On Apr 15, 2008, at 7:03 PM, Stephen J. Barr wrote:

Hello,

Does anyone have any experience adding nodes to a cluster running on
EC2? If so, is there some documentation on how to do this?

Thanks,
-stephen


Chris K Wensel
[EMAIL PROTECTED]
http://chris.wensel.net/
http://www.cascading.org/



