[ https://issues.apache.org/jira/browse/HADOOP-19032?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17805169#comment-17805169 ]

Steve Loughran commented on HADOOP-19032:
-----------------------------------------


{code}
[ERROR] Tests run: 17, Failures: 0, Errors: 1, Skipped: 1, Time elapsed: 48.129 s <<< FAILURE! - in org.apache.hadoop.fs.s3a.fileContext.ITestS3AFileContextURI
[ERROR] testCreateDirectory(org.apache.hadoop.fs.s3a.fileContext.ITestS3AFileContextURI)  Time elapsed: 13.4 s  <<< ERROR!
org.apache.hadoop.fs.s3a.AWSS3IOException: Remove S3 Dir Markers on s3a://stevel-london/Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest: org.apache.hadoop.fs.s3a.impl.MultiObjectDeleteException: [S3Error(Key=Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest/()&^%$#@!~_+}{><?/, Code=InternalError, Message=We encountered an internal error. Please try again.)] (Service: Amazon S3, Status Code: 200, Request ID: null):MultiObjectDeleteException: InternalError: Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest/()&^%$#@!~_+}{><?/: We encountered an internal error. Please try again.
: [S3Error(Key=Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest/()&^%$#@!~_+}{><?/, Code=InternalError, Message=We encountered an internal error. Please try again.)] (Service: Amazon S3, Status Code: 200, Request ID: null)
        at org.apache.hadoop.fs.s3a.impl.MultiObjectDeleteException.translateException(MultiObjectDeleteException.java:136)
        at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:347)
        at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124)
        at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:163)
        at org.apache.hadoop.fs.s3a.impl.DeleteOperation.asyncDeleteAction(DeleteOperation.java:445)
        at org.apache.hadoop.fs.s3a.impl.DeleteOperation.lambda$submitDelete$2(DeleteOperation.java:403)
        at org.apache.hadoop.fs.store.audit.AuditingFunctions.lambda$callableWithinAuditSpan$3(AuditingFunctions.java:119)
        at org.apache.hadoop.fs.s3a.impl.CallableSupplier.get(CallableSupplier.java:88)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
        at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:225)
        at org.apache.hadoop.util.SemaphoredDelegatingExecutor$RunnableWithPermitRelease.run(SemaphoredDelegatingExecutor.java:225)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.hadoop.fs.s3a.impl.MultiObjectDeleteException: [S3Error(Key=Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest/()&^%$#@!~_+}{><?/, Code=InternalError, Message=We encountered an internal error. Please try again.)] (Service: Amazon S3, Status Code: 200, Request ID: null)
        at org.apache.hadoop.fs.s3a.S3AFileSystem.deleteObjects(S3AFileSystem.java:3174)
        at org.apache.hadoop.fs.s3a.S3AFileSystem.removeKeysS3(S3AFileSystem.java:3405)
        at org.apache.hadoop.fs.s3a.S3AFileSystem.removeKeys(S3AFileSystem.java:3475)
        at org.apache.hadoop.fs.s3a.S3AFileSystem$OperationCallbacksImpl.removeKeys(S3AFileSystem.java:2491)
        at org.apache.hadoop.fs.s3a.impl.DeleteOperation.lambda$asyncDeleteAction$8(DeleteOperation.java:447)
        at org.apache.hadoop.fs.s3a.Invoker.lambda$once$0(Invoker.java:165)
        at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:122)
        ... 11 more


{code}


> MultiObjectDeleteException bulk delete of odd filenames
> -------------------------------------------------------
>
>                 Key: HADOOP-19032
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19032
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Priority: Major
>
> Possibly transient. Note that the bucket is versioned.
> {code}
> org.apache.hadoop.fs.s3a.AWSS3IOException: Remove S3 Dir Markers on s3a://stevel-london/Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest: org.apache.hadoop.fs.s3a.impl.MultiObjectDeleteException: [S3Error(Key=Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/7/testContextURI/createTest/()&^%$#@!~_+}{><?/, Code=InternalError, Message=We encountered an internal error. Please try again.)] (Service: Amazon S3, Status Code: 200, Request ID: null):MultiObjectDeleteException: 
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
