you do need the whole ec2 tree for the scripts to work...
On May 7, 2008, at 12:25 PM, Jim R. Wilson wrote:
Nevermind, looks like I needed these:
./src/contrib/ec2/bin/image/create-hadoop-image-remote
./src/contrib/ec2/bin/create-hadoop-image
-- Jim
On Wed, May 7, 2008 at 2:23 PM, Jim R. Wilson
<[EMAIL PROTECTED]> wrote:
Thanks Chris,
Where do I get this supposed "image/create-hadoop-remote" script? I
couldn't `find` it anywhere within the hadoop svn tree, and the link
in the hadoop wiki is broken :/
-- Jim
On Wed, May 7, 2008 at 2:04 PM, Chris K Wensel <[EMAIL PROTECTED]> wrote:
You don't need 0.17 to use the scripts mentioned in the EC2 wiki page. Just
grab contrib/ec2 from the 0.17.0 branch.
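for example, a checkout along these lines should do it (the branch path below
is my best guess at the current svn layout, so adjust it if things have moved):

  # pull just the ec2 contrib tree from the 0.17 branch
  svn co http://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.17/src/contrib/ec2 hadoop-ec2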
as for images, you will need to update the image/create-hadoop-remote bash
script to download and install hbase.
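as a rough sketch, the kind of block you'd add next to the existing hadoop
install step looks something like this (the version number and download URL
are placeholders, not anything that ships in the script):

  # hypothetical addition: fetch and unpack an hbase release alongside hadoop
  HBASE_VERSION=0.1.2                                        # match this to your hadoop version
  HBASE_URL=http://your.mirror/hbase-$HBASE_VERSION.tar.gz   # substitute a real mirror
  cd /usr/local
  wget -q $HBASE_URL
  tar xzf hbase-$HBASE_VERSION.tar.gz && rm hbase-$HBASE_VERSION.tar.gz
  ln -s hbase-$HBASE_VERSION hbase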
and update hadoop-init to start it with the proper properties.
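that likely means writing hbase.master and hbase.rootdir into conf/hbase-site.xml
the same way the script already templates hadoop-site.xml, then something like
this at the end (the master/slave test here is a stand-in for whatever check
hadoop-init already makes):

  # hypothetical addition: start hbase after the hadoop daemons are up
  if [ "$IS_MASTER" = "true" ]; then
    /usr/local/hbase/bin/hbase-daemon.sh start master
  else
    /usr/local/hbase/bin/hbase-daemon.sh start regionserver
  fi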
once you look at these scripts, it should be fairly obvious what you need to do.
then just run the 'create-image' command to stuff this new image into one of
your buckets.
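i.e., once your AWS credentials and target bucket are filled in in
bin/hadoop-ec2-env.sh (check the exact variable names in your copy), roughly:

  # bundle the image and upload/register it in your S3 bucket
  bin/hadoop-ec2 create-image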
enjoy
ckw
On May 7, 2008, at 11:12 AM, Jim R. Wilson wrote:
Hi all,
I'm about to embark on a mystical journey through hosted web-services
with my trusted friend hbase. Here are some questions for my fellow travelers:
1) Has anyone done this before? If so, what lifesaving tips can you offer?
2) Should I attempt to build an hdfs out of ec2 persistent storage, or just use S3?
3) How many images will I need? Just one, or master/slave?
4) What version of hadoop/hbase should I use? (The hadoop/ec2
instructions[1] seem to favor the unreleased 0.17, but there doesn't
seem to be a public image with 0.17 at the ready)

Thanks in advance for any advice, I'm gearing up for quite a trip :)
[1] http://wiki.apache.org/hadoop/AmazonEC2
-- Jim R. Wilson (jimbojw)
Chris K Wensel
[EMAIL PROTECTED]
http://chris.wensel.net/
http://www.cascading.org/