Thanks, I should have been more clear. I am not attempting to run a MapReduce 
job. I was literally trying to use the FileSystem abstraction (rather than 
using the jets3t library directly) to access S3. I was assuming it handled the 
mocking of directories in S3 (since directories are not a native feature of 
that store).
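
For reference, here is roughly what I am doing — a minimal sketch, assuming the jets3t-backed s3n:// scheme; the bucket name and credentials are placeholders:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3ListExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder credentials -- substitute real keys,
        // or configure them in core-site.xml instead.
        conf.set("fs.s3n.awsAccessKeyId", "YOUR_ACCESS_KEY");
        conf.set("fs.s3n.awsSecretAccessKey", "YOUR_SECRET_KEY");

        // "my-bucket" is a placeholder bucket name.
        FileSystem fs = FileSystem.get(URI.create("s3n://my-bucket/"), conf);

        // Listing the bucket root returns FileStatus entries, but listing
        // a "directory" created by another tool is where I see the 404.
        for (FileStatus status : fs.listStatus(new Path("s3n://my-bucket/"))) {
            System.out.println(status.getPath() + " isDir=" + status.isDir());
        }
    }
}
```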

C
On Aug 28, 2012, at 10:16 AM, Manoj Babu <[email protected]> wrote:

> Hi,
> 
> Here is an example, might help you.
> 
> http://muhammadkhojaye.blogspot.in/2012/04/how-to-run-amazon-elastic-mapreduce-job.html
>  
> 
> Cheers!
> Manoj.
> 
> 
> 
> On Tue, Aug 28, 2012 at 12:55 PM, Chris Collins <[email protected]> 
> wrote:
> 
> 
> 
> Hi, I am trying to use the Hadoop FileSystem abstraction with S3, but in my 
> tinkering I am not having a great deal of success.  I am particularly 
> interested in the ability to mimic a directory structure (since S3 native 
> doesn't do it).
> 
> Can anyone point me to some good example usage of Hadoop FileSystem with s3?
> 
> I created a few directories using transit and the AWS S3 console for testing.  
> Doing a listStatus of the bucket returns a FileStatus object for the directory 
> I created, but if I try to do a listStatus of that path I get a 404:
> 
> org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: 
> Request Error. HEAD '/aaaa' on Host ....
> 
> This is probably not the best list to look for help, but any clues are appreciated.
> 
> C
> 
> 
