[ https://issues.apache.org/jira/browse/HADOOP-15573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16527496#comment-16527496 ]

Steve Loughran commented on HADOOP-15573:
-----------------------------------------

Root cause is probably that {{S3AUtils.translateDynamoDBException}} doesn't 
translate an {{AmazonDynamoDBException}} with error code == 400 into an 
{{AccessDeniedException}}; do that and the retry logic will automatically 
turn the repeated retries into a fast failure.
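A minimal sketch of the proposed translation. Since the AWS SDK classes aren't to hand here, a stand-in exception class is used; the accessor names ({{getStatusCode()}}, {{getErrorCode()}}) and the exact wiring into {{S3AUtils.translateDynamoDBException}} are assumptions, not the actual Hadoop code:

```java
import java.io.IOException;
import java.nio.file.AccessDeniedException;

public class TranslateSketch {

    // Stand-in for the AWS SDK's AmazonDynamoDBException (assumption:
    // the real SDK exception exposes an HTTP status and an error code).
    static class FakeDynamoDBException extends RuntimeException {
        private final int statusCode;
        private final String errorCode;

        FakeDynamoDBException(int statusCode, String errorCode, String msg) {
            super(msg);
            this.statusCode = statusCode;
            this.errorCode = errorCode;
        }

        int getStatusCode() { return statusCode; }
        String getErrorCode() { return errorCode; }
    }

    // Sketch of the suggested fix: map a 400/AccessDenied response to
    // java.nio.file.AccessDeniedException. Because the S3A retry policy
    // treats AccessDeniedException as non-retriable, the set-capacity
    // command would then fail fast instead of retrying.
    static IOException translate(String path, FakeDynamoDBException e) {
        if (e.getStatusCode() == 400
                && "AccessDeniedException".equals(e.getErrorCode())) {
            AccessDeniedException ade =
                new AccessDeniedException(path, null, e.getMessage());
            ade.initCause(e);
            return ade;
        }
        // Everything else keeps the existing generic translation.
        return new IOException(path + ": " + e.getMessage(), e);
    }
}
```

With this in place, the retry handler sees an `AccessDeniedException` (an `IOException` subclass it already classifies as fail-fast) rather than the raw SDK exception it keeps retrying.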

> s3guard set-capacity to not retry on an access denied exception
> ---------------------------------------------------------------
>
>                 Key: HADOOP-15573
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15573
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>            Reporter: Steve Loughran
>            Priority: Minor
>
> When you call {{hadoop s3guard set-capacity}} with restricted access, you are 
> (correctly) blocked by AWS, but the client keeps retrying. It should fail 
> fast on a 400/AccessDenied response.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
