If you are using S3 as your file store then you don't need to run HDFS
(and indeed HDFS will not start up if you try).
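The NumberFormatException from the datanodes is most likely the datanode
trying to parse the s3:// URI in fs.default.name as an hdfs://host:port
address. With S3 as the default filesystem there is no NameNode or DataNode
to run at all. As a rough sketch (assuming a 0.20-style tarball layout),
you would start only the MapReduce daemons, and you can sanity-check the
S3 credentials directly:

  bin/start-mapred.sh              # no start-dfs.sh / start-all.sh needed
  bin/hadoop fs -ls s3://BUCKET/   # should list the bucket if the keys work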

Cheers,
Tom

2009/12/17 Rekha Joshi <[email protected]>:
> Not sure what the whole error is, but you can always try this alternative -
> <property>
>  <name>fs.default.name</name>
>  <value>s3://BUCKET</value>
> </property>
>
> <property>
>  <name>fs.s3.awsAccessKeyId</name>
>  <value>ID</value>
> </property>
>
> <property>
>  <name>fs.s3.awsSecretAccessKey</name>
>  <value>SECRET</value>
> </property>
>
> And I am not sure what the base Hadoop version on S3 is, but if the S3
> wiki is correct, you could try updating conf/hadoop-site.xml instead
>
> Cheers,
> /R
>
> On 12/18/09 10:23 AM, "松柳" <[email protected]> wrote:
>
> Hi all,
>    I tried to run my hadoop program on S3 by following this wiki page:
> http://wiki.apache.org/hadoop/AmazonS3
>    I configured core-site.xml by adding
>
> <property>
>  <name>fs.default.name</name>
>  <value>s3://ID:sec...@bucket</value>
> </property>
>
>    and I specified the access key and secret key using the URI
> format: s3://ID:sec...@bucket
>
> However, it fails and the datanodes report:
>
> NumberFormatException....
> ...
>
> Is this the right way to configure Hadoop to run on S3? If so, what's the
> problem?
>
> Regards
> Song
>
>
