[ https://issues.apache.org/jira/browse/HADOOP-18938?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17841054#comment-17841054 ]
ASF GitHub Bot commented on HADOOP-18938:
-----------------------------------------
hadoop-yetus commented on PR #6466:
URL: https://github.com/apache/hadoop/pull/6466#issuecomment-2078647702
:confetti_ball: **+1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 01s | | No case conflicting files found. |
| +0 :ok: | spotbugs | 0m 00s | | spotbugs executables are not available. |
| +0 :ok: | codespell | 0m 00s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 00s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 01s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 00s | | The patch appears to include 2 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 108m 16s | | trunk passed |
| +1 :green_heart: | compile | 5m 41s | | trunk passed |
| +1 :green_heart: | checkstyle | 5m 21s | | trunk passed |
| +1 :green_heart: | mvnsite | 6m 00s | | trunk passed |
| +1 :green_heart: | javadoc | 5m 40s | | trunk passed |
| +1 :green_heart: | shadedclient | 173m 14s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 4m 11s | | the patch passed |
| +1 :green_heart: | compile | 2m 42s | | the patch passed |
| +1 :green_heart: | javac | 2m 42s | | the patch passed |
| +1 :green_heart: | blanks | 0m 00s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 2m 25s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 52s | | the patch passed |
| +1 :green_heart: | javadoc | 2m 34s | | the patch passed |
| +1 :green_heart: | shadedclient | 191m 32s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | asflicense | 6m 49s | | The patch does not generate ASF License warnings. |
| | | 501m 31s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| GITHUB PR | https://github.com/apache/hadoop/pull/6466 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | MINGW64_NT-10.0-17763 c92fbd9e5cae 3.4.10-87d57229.x86_64 2024-02-14 20:17 UTC x86_64 Msys |
| Build tool | maven |
| Personality | /c/hadoop/dev-support/bin/hadoop.sh |
| git revision | trunk / 97360ba71f24df4cfc2d44f2f05c1bee0129a968 |
| Default Java | Azul Systems, Inc.-1.8.0_332-b09 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6466/1/testReport/ |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch-windows-10/job/PR-6466/1/console |
| versions | git=2.44.0.windows.1 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
This message was automatically generated.
> S3A region logic to handle vpce and non standard endpoints
> -----------------------------------------------------------
>
> Key: HADOOP-18938
> URL: https://issues.apache.org/jira/browse/HADOOP-18938
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: fs/s3
> Affects Versions: 3.4.0
> Reporter: Ahmar Suhail
> Priority: Major
> Labels: pull-request-available
>
> For non-standard endpoints such as VPCE, the region parsing added in
> HADOOP-18908 doesn't work. This is expected, as that logic is only meant to be
> used for standard endpoints.
> If a non-standard endpoint is in use, check whether a region is also provided;
> if not, fail fast.
> Also update the documentation to explain the region and endpoint behaviour with
> SDK V2.
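For anyone hitting this with a VPC endpoint: the behaviour described above implies that when a non-standard endpoint is configured, the region must be supplied explicitly rather than parsed from the endpoint hostname. A minimal illustrative configuration sketch follows; `fs.s3a.endpoint` and `fs.s3a.endpoint.region` are the standard S3A settings, while the VPCE hostname below is a made-up placeholder, not a real endpoint.

```xml
<!-- Illustrative sketch only: the VPCE hostname is a placeholder. -->
<!-- With a non-standard endpoint the region cannot be parsed from -->
<!-- the hostname, so it must be set explicitly to avoid a fail-fast. -->
<property>
  <name>fs.s3a.endpoint</name>
  <value>https://vpce-0123456789abcdef0.s3.us-east-1.vpce.amazonaws.com</value>
</property>
<property>
  <name>fs.s3a.endpoint.region</name>
  <value>us-east-1</value>
</property>
```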
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]