[
https://issues.apache.org/jira/browse/HADOOP-18112?focusedWorklogId=738186&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-738186
]
ASF GitHub Bot logged work on HADOOP-18112:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 08/Mar/22 15:28
Start Date: 08/Mar/22 15:28
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #4045:
URL: https://github.com/apache/hadoop/pull/4045#issuecomment-1061900516
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 12m 22s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 7 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 3m 19s | | Maven dependency ordering for branch |
| -1 :x: | mvninstall | 12m 47s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4045/3/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. |
| +1 :green_heart: | compile | 26m 56s | | trunk passed with JDK Ubuntu-11.0.14+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 21m 55s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 4m 28s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 49s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 48s | | trunk passed with JDK Ubuntu-11.0.14+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 2m 30s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 4m 31s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 17s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 38s | | the patch passed |
| +1 :green_heart: | compile | 23m 11s | | the patch passed with JDK Ubuntu-11.0.14+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 23m 11s | | the patch passed |
| +1 :green_heart: | compile | 21m 51s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 21m 51s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 3m 34s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4045/3/artifact/out/results-checkstyle-root.txt) | root: The patch generated 6 new + 6 unchanged - 0 fixed = 12 total (was 6) |
| +1 :green_heart: | mvnsite | 2m 39s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 50s | | the patch passed with JDK Ubuntu-11.0.14+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 2m 20s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 4m 39s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 5s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 17m 50s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 2m 23s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 56s | | The patch does not generate ASF License warnings. |
| | | 224m 7s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4045/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4045 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 2d19360a5841 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 4fc41b3a8601e52204b040c06c868079b1b0080c |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.14+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4045/3/testReport/ |
| Max. process+thread count | 3138 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4045/3/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 738186)
Time Spent: 2h 50m (was: 2h 40m)
> Rename operation fails during multi object delete of size more than 1000.
> -------------------------------------------------------------------------
>
> Key: HADOOP-18112
> URL: https://issues.apache.org/jira/browse/HADOOP-18112
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.3.1
> Reporter: Mukund Thakur
> Assignee: Mukund Thakur
> Priority: Critical
> Labels: pull-request-available
> Time Spent: 2h 50m
> Remaining Estimate: 0h
>
> We see the exception below when a rename operation triggers a multi-object
> delete of more than 1000 keys in a single request.
>
> {noformat}
> org.apache.hadoop.fs.s3a.AWSBadRequestException: rename
> s3a://ms-targeting-prod-cdp-aws-dr-bkt/data/ms-targeting-prod-hbase/hbase/.tmp/data/default/dr-productionL.Address
> to
> s3a://ms-targeting-prod-cdp-aws-dr-bkt/user/root/.Trash/Current/data/ms-targetin
> g-prod-hbase/hbase/.tmp/data/default/dr-productionL.Address16438377847941643837797901
> on
> s3a://ms-targeting-prod-cdp-aws-dr-bkt/data/ms-targeting-prod-hbase/hbase/.tmp/data/default/dr-productionL.Address:
> com.amazonaws.services.s3.model.AmazonS3Exception
> : The XML you provided was not well-formed or did not validate against our
> published schema (Service: Amazon S3; Status Code: 400; Error Code:
> MalformedXML; Request ID: XZ8PGAQHP0FGHPYS; S3 Extended Request ID:
> vTG8c+koukzQ8yMRGd9BvWfmRwkCZ3fAs/EOiAV5S9E
> JjLqFTNCgDOKokuus5W600Z5iOa/iQBI=; Proxy: null), S3 Extended Request ID:
> vTG8c+koukzQ8yMRGd9BvWfmRwkCZ3fAs/EOiAV5S9EJjLqFTNCgDOKokuus5W600Z5iOa/iQBI=:MalformedXML:
> The XML you provided was not well-formed or did not validate against our
> published schema
> (Service: Amazon S3; Status Code: 400; Error Code: MalformedXML; Request ID:
> XZ8PGAQHP0FGHPYS; S3 Extended Request ID:
> vTG8c+koukzQ8yMRGd9BvWfmRwkCZ3fAs/EOiAV5S9EJjLqFTNCgDOKokuus5W600Z5iOa/iQBI=;
> Proxy: null)
> at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:247)
> at org.apache.hadoop.fs.s3a.s3guard.RenameTracker.convertToIOException(RenameTracker.java:267)
> at org.apache.hadoop.fs.s3a.s3guard.RenameTracker.deleteFailed(RenameTracker.java:198)
> at org.apache.hadoop.fs.s3a.impl.RenameOperation.removeSourceObjects(RenameOperation.java:706)
> at org.apache.hadoop.fs.s3a.impl.RenameOperation.completeActiveCopiesAndDeleteSources(RenameOperation.java:274)
> at org.apache.hadoop.fs.s3a.impl.RenameOperation.recursiveDirectoryRename(RenameOperation.java:484)
> at org.apache.hadoop.fs.s3a.impl.RenameOperation.execute(RenameOperation.java:312)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.innerRename(S3AFileSystem.java:1912)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$rename$7(S3AFileSystem.java:1759)
> at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499)
> at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:444)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2250)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.rename(S3AFileSystem.java:1757)
> at org.apache.hadoop.fs.FileSystem.rename(FileSystem.java:1605)
> at org.apache.hadoop.fs.TrashPolicyDefault.moveToTrash(TrashPolicyDefault.java:186)
> at org.apache.hadoop.fs.Trash.moveToTrash(Trash.java:110){noformat}
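For context: S3's multi-object delete (DeleteObjects) API accepts at most 1000 keys per request, and a larger request is rejected with the MalformedXML 400 error seen in the trace above. Below is a minimal, hypothetical sketch of the paging approach such a fix takes: split the key list into pages of at most 1000 before each delete call. The class `DeletePaging`, the helper `deleteInPages`, and the callback are illustrative names for this sketch, not the actual S3A API or the code in the HADOOP-18112 patch.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class DeletePaging {
    // S3 DeleteObjects accepts at most 1000 keys per request.
    static final int MAX_KEYS_PER_DELETE = 1000;

    /**
     * Split keys into pages of at most MAX_KEYS_PER_DELETE and hand
     * each page to the delete callback, preserving key order.
     */
    static void deleteInPages(List<String> keys, Consumer<List<String>> deleteBatch) {
        for (int start = 0; start < keys.size(); start += MAX_KEYS_PER_DELETE) {
            int end = Math.min(start + MAX_KEYS_PER_DELETE, keys.size());
            deleteBatch.accept(keys.subList(start, end));
        }
    }

    public static void main(String[] args) {
        List<String> keys = new ArrayList<>();
        for (int i = 0; i < 2500; i++) {
            keys.add("key-" + i);
        }
        List<Integer> pageSizes = new ArrayList<>();
        // In real code the callback would issue one DeleteObjects request per page.
        deleteInPages(keys, page -> pageSizes.add(page.size()));
        System.out.println(pageSizes); // prints [1000, 1000, 500]
    }
}
```

With this shape, a rename that must remove 2500 source objects issues three delete requests instead of one oversized request that the service rejects.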
--
This message was sent by Atlassian Jira
(v8.20.1#820001)