[ https://issues.apache.org/jira/browse/HADOOP-997?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tom White updated HADOOP-997:
-----------------------------

    Attachment: HADOOP-997-v2.patch

Version 2 patch that addresses all the issues listed above, except for clearing 
out the temp directory on startup.

The exception while using "hadoop fs" was caused by FileSystemStore being 
package-private, which broke proxy creation when the proxies are used from 
other packages. (I hadn't caught this in testing, since I hadn't enabled 
retries for S3 when I ran my tests!)
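
For context, here is a minimal sketch of the visibility constraint (the 
interface and names are hypothetical stand-ins, not the patch's actual code). 
When Proxy.newProxyInstance is given a non-public interface, the generated 
proxy class is placed in that interface's package, so the proxy generally 
can't be used through that interface type from other packages; declaring the 
interface public avoids this:

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    public class ProxyVisibilitySketch {

      // Hypothetical stand-in for FileSystemStore. Declared public: with a
      // package-private interface, the generated proxy class lands in the
      // interface's package and is unusable from other packages.
      public interface Store {
        void storeBlock(long blockId) throws java.io.IOException;
      }

      public static Store wrap(final Store delegate) {
        return (Store) Proxy.newProxyInstance(
            Store.class.getClassLoader(),
            new Class[] { Store.class },
            new InvocationHandler() {
              public Object invoke(Object proxy, Method method, Object[] args)
                  throws Throwable {
                return method.invoke(delegate, args);  // plain pass-through
              }
            });
      }
    }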

I'm now encountering a different problem when trying to copy local files to 
S3; the command fails with the error

put: No such file or directory

I'm still investigating the cause to see whether it is related. 

> Implement S3 retry mechanism for failed block transfers
> -------------------------------------------------------
>
>                 Key: HADOOP-997
>                 URL: https://issues.apache.org/jira/browse/HADOOP-997
>             Project: Hadoop
>          Issue Type: Improvement
>          Components: fs
>    Affects Versions: 0.11.0
>            Reporter: Tom White
>         Assigned To: Tom White
>         Attachments: HADOOP-997-v2.patch, HADOOP-997.patch
>
>
> HADOOP-882 improves S3FileSystem so that when certain communications problems 
> with S3 occur the operation is retried. However, the retry mechanism cannot 
> handle a block transfer failure, since blocks may be very large and we don't 
> want to buffer them in memory. This improvement is to write a wrapper (using 
> java.lang.reflect.Proxy if possible - see discussion in HADOOP-882) that can 
> retry block transfers.
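
A rough sketch of the kind of proxy-based retry wrapper described above (the 
interface and names here are hypothetical, not the actual patch). Because 
each store call takes the block's local file, a failed transfer can be 
retried by re-reading from disk, with no need to buffer the block in memory:

    import java.io.File;
    import java.io.IOException;
    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.InvocationTargetException;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    public class RetryProxySketch {

      // Hypothetical stand-in for the store interface. The block's data
      // lives in a local file, so a retry simply re-reads it from disk.
      public interface BlockStore {
        void storeBlock(long blockId, File blockFile) throws IOException;
      }

      public static BlockStore withRetries(final BlockStore delegate,
                                           final int maxRetries) {
        InvocationHandler handler = new InvocationHandler() {
          public Object invoke(Object proxy, Method method, Object[] args)
              throws Throwable {
            for (int attempt = 0; ; attempt++) {
              try {
                return method.invoke(delegate, args);
              } catch (InvocationTargetException e) {
                Throwable cause = e.getTargetException();
                if (!(cause instanceof IOException) || attempt >= maxRetries) {
                  throw cause;  // not retryable, or retries exhausted
                }
                Thread.sleep(1000L << attempt);  // simple exponential backoff
              }
            }
          }
        };
        return (BlockStore) Proxy.newProxyInstance(
            BlockStore.class.getClassLoader(),
            new Class[] { BlockStore.class },
            handler);
      }
    }

The invocation handler intercepts every interface method generically, so the 
retry policy lives in one place and the underlying store implementation is 
left untouched.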

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
