[GitHub] [hadoop] saintstack merged pull request #4246: HDFS-16540. Data locality is lost when DataNode pod restarts in kubernetes (#4170)
saintstack merged PR #4246: URL: https://github.com/apache/hadoop/pull/4246 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] saintstack commented on pull request #4246: HDFS-16540. Data locality is lost when DataNode pod restarts in kubernetes (#4170)
saintstack commented on PR #4246: URL: https://github.com/apache/hadoop/pull/4246#issuecomment-1127210267 I ran the two tests below in loops locally: TestBPOfferService.testMissBlocksWhenReregister TestUnderReplicatedBlocks.testSetRepIncWithUnderReplicatedBlocks The first failed once out of ten cycles both with the patch in place and without it (which jibes with what we see here in test runs, where it sometimes fails but not always). TestUnderReplicatedBlocks.testSetRepIncWithUnderReplicatedBlocks shows up consistently, but when I run it locally over multiple cycles it passes whether the patch is applied or not. I see that the last full branch-3.3 run, back on May 5th (https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.3-java8-linux-x86_64/54/), failed for the same reason (the May 12th run was incomplete). That test is about block replication, whereas this PR is a minor adjustment in NN node accounting. Unrelated, I'd say. Will push the backport in the morning.
[jira] [Created] (HADOOP-18236) Remove duplicate locks in NetworkTopology
ZanderXu created HADOOP-18236: - Summary: Remove duplicate locks in NetworkTopology Key: HADOOP-18236 URL: https://issues.apache.org/jira/browse/HADOOP-18236 Project: Hadoop Common Issue Type: Improvement Reporter: ZanderXu Assignee: ZanderXu While reading NetworkTopology.java, I noticed what appears to be a duplicate lock in chooseRandom (line 532). The code is:
{code:java}
final int availableNodes;
if (excludedScope == null) {
  availableNodes = countNumOfAvailableNodes(scope, excludedNodes);
} else {
  netlock.readLock().lock();
  try {
    availableNodes = countNumOfAvailableNodes(scope, excludedNodes)
        - countNumOfAvailableNodes(excludedScope, excludedNodes);
  } finally {
    netlock.readLock().unlock();
  }
}
{code}
All callers of `chooseRandom` already hold the global read lock, so the internal read lock is redundant. -- This message was sent by Atlassian Jira (v8.20.7#820007) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
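The safety of the nested acquisition can be demonstrated with plain JDK locks (this is a standalone sketch, not Hadoop code): read locks of a `ReentrantReadWriteLock` are reentrant, so the inner lock inside `chooseRandom` cannot deadlock when the caller already holds the global read lock; it is merely redundant work.

```java
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class NestedReadLock {
  // Returns the read-hold count while this thread holds the read lock twice:
  // the outer acquisition stands in for the lock callers of chooseRandom hold,
  // the inner one for the lock taken inside chooseRandom itself.
  static int nestedHoldCount() {
    ReentrantReadWriteLock netlock = new ReentrantReadWriteLock();
    netlock.readLock().lock();        // outer lock (held by the caller)
    try {
      netlock.readLock().lock();      // inner lock (the redundant one)
      try {
        return netlock.getReadHoldCount();
      } finally {
        netlock.readLock().unlock();
      }
    } finally {
      netlock.readLock().unlock();
    }
  }

  public static void main(String[] args) {
    System.out.println("read hold count: " + nestedHoldCount()); // prints 2
  }
}
```

Removing the inner lock is therefore a pure cleanup as long as every call path into `chooseRandom` really does hold the lock, which is what the issue asserts.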
[GitHub] [hadoop] zhangxiping1 commented on a diff in pull request #4269: HDFS-16570 RBF: The router using MultipleDestinationMountTableResolve…
zhangxiping1 commented on code in PR #4269: URL: https://github.com/apache/hadoop/pull/4269#discussion_r873288398 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/router/TestRouterTrashMultipleDestinationMountTableResolver.java: ## @@ -0,0 +1,196 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.hdfs.server.federation.router; + +import java.io.IOException; +import java.net.URISyntaxException; +import java.util.HashMap; +import java.util.Map; + +import org.junit.AfterClass; +import org.junit.BeforeClass; +import org.junit.Test; + +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FileStatus; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.fs.Trash; +import org.apache.hadoop.fs.permission.FsPermission; +import org.apache.hadoop.hdfs.DFSClient; +import org.apache.hadoop.hdfs.DFSConfigKeys; +import org.apache.hadoop.hdfs.DFSTestUtil; +import org.apache.hadoop.hdfs.server.federation.MiniRouterDFSCluster; +import org.apache.hadoop.hdfs.server.federation.RouterConfigBuilder; +import org.apache.hadoop.hdfs.server.federation.StateStoreDFSCluster; +import org.apache.hadoop.hdfs.server.federation.resolver.MountTableManager; +import org.apache.hadoop.hdfs.server.federation.resolver.MountTableResolver; +import org.apache.hadoop.hdfs.server.federation.resolver.MultipleDestinationMountTableResolver; +import org.apache.hadoop.hdfs.server.federation.resolver.order.DestinationOrder; +import org.apache.hadoop.hdfs.server.federation.store.protocol.AddMountTableEntryRequest; +import org.apache.hadoop.hdfs.server.federation.store.protocol.AddMountTableEntryResponse; +import org.apache.hadoop.hdfs.server.federation.store.records.MountTable; +import org.apache.hadoop.security.UserGroupInformation; + +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertTrue; + +/** + * This is a test through the Router move data to the Trash with + * MultipleDestinationMountTableResolver. 
+ */ +public class TestRouterTrashMultipleDestinationMountTableResolver { + + private static StateStoreDFSCluster cluster; + private static MiniRouterDFSCluster.RouterContext routerContext; + private static MountTableResolver resolver; + private static MiniRouterDFSCluster.NamenodeContext nnContextNs0; + private static MiniRouterDFSCluster.NamenodeContext nnContextNs1; + private static FileSystem nnFsNs0; + private static FileSystem nnFsNs1; + + private static String ns0; + private static String ns1; + private static final String TEST_USER = "test-trash"; + private static final String MOUNT_POINT = "/home/data"; + private static final String MOUNT_POINT_CHILD_DIR = MOUNT_POINT + "/test"; + private static final String FILE_NS0 = MOUNT_POINT_CHILD_DIR + "/fileNs0"; + private static final String FILE_NS1 = MOUNT_POINT_CHILD_DIR + "/fileNs1"; + private static final String TRASH_ROOT = "/user/" + TEST_USER + "/.Trash"; + private static final String CURRENT = "/Current"; + + @BeforeClass + public static void globalSetUp() throws Exception { +// Build and start a federated cluster +cluster = new StateStoreDFSCluster(false, 2, +MultipleDestinationMountTableResolver.class); +Configuration routerConf = +new RouterConfigBuilder().stateStore().admin().quota().rpc().build(); + +Configuration hdfsConf = new Configuration(false); +hdfsConf.setBoolean(DFSConfigKeys.DFS_NAMENODE_ACLS_ENABLED_KEY, true); +hdfsConf.set("fs.trash.interval", "1440"); +hdfsConf.set("fs.trash.checkpoint.interval", "1440"); +cluster.addRouterOverrides(routerConf); +cluster.addNamenodeOverrides(hdfsConf); +cluster.startCluster(); +cluster.startRouters(); +cluster.waitClusterUp(); + +ns0 = cluster.getNameservices().get(0); +ns1 = cluster.getNameservices().get(1); + +nnContextNs0 = cluster.getNamenode(ns0, null); +nnFsNs0 = nnContextNs0.getFileSystem(); +nnContextNs1 = cluster.getNamenode(ns1, null); +nnFsNs1 = nnContextNs1.getFileSystem(); + +routerContext = cluster.getRandomRouter(); +resolver = +
[GitHub] [hadoop] zhangxiping1 commented on a diff in pull request #4269: HDFS-16570 RBF: The router using MultipleDestinationMountTableResolve…
zhangxiping1 commented on code in PR #4269: URL: https://github.com/apache/hadoop/pull/4269#discussion_r873288073 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/router/TestRouterTrashMultipleDestinationMountTableResolver.java: ## @@ -0,0 +1,196 @@ +hdfsConf.set("fs.trash.interval", "1440"); Review Comment: OK
[GitHub] [hadoop] zhangxiping1 commented on a diff in pull request #4269: HDFS-16570 RBF: The router using MultipleDestinationMountTableResolve…
zhangxiping1 commented on code in PR #4269: URL: https://github.com/apache/hadoop/pull/4269#discussion_r873287952 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RouterRpcServer.java: ## @@ -1896,6 +1897,9 @@ public FederationRPCMetrics getRPCMetrics() { boolean isPathAll(final String path) { if (subclusterResolver instanceof MountTableResolver) { try { +if(isTrashPath(path)){ + return true; Review Comment: I can make two changes in the isPathAll function: 1. Process the trash path: remove the prefix, then check. 2. Check whether it is a trash path. If we delete, mkdir, or ls on recycle-bin data and resolve returns multiple RemoteLocations, then we should operate on all of them, so I'm going to choose the second option. But the first is certainly fine too. If you think there's something wrong, you can talk me out of it. Thank you.
[GitHub] [hadoop] zhangxiping1 commented on a diff in pull request #4269: HDFS-16570 RBF: The router using MultipleDestinationMountTableResolve…
zhangxiping1 commented on code in PR #4269: URL: https://github.com/apache/hadoop/pull/4269#discussion_r873283844 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RouterRpcServer.java: ## @@ -1896,6 +1897,9 @@ public FederationRPCMetrics getRPCMetrics() { boolean isPathAll(final String path) { if (subclusterResolver instanceof MountTableResolver) { try { +if(isTrashPath(path)){ + return true; Review Comment: OK. In [HDFS-16024](https://issues.apache.org/jira/browse/HDFS-16024), the trash path is resolved by removing its trash prefix. I think the isPathAll() function also needs to strip the trash prefix: isPathAll decides whether an operation should run on every resolved RemoteLocation, so it should see the same src path (without the trash prefix) that the RemoteLocations were resolved from.
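The prefix-stripping idea being discussed can be illustrated with a small standalone sketch (the method name and regex here are assumptions for illustration, not the actual RBF code from HDFS-16024): remove the `/user/<name>/.Trash/Current` prefix so a trash path resolves against the same mount entries as the original source path.

```java
// Hypothetical helper: strip a Hadoop-style trash prefix from a path so it
// can be matched against mount-table entries exactly like the src path was.
public class TrashPathExample {
  // Matches "/user/<name>/.Trash/Current" or a timestamped checkpoint dir.
  private static final String TRASH_PREFIX = "/user/[^/]+/\\.Trash/(Current|\\d+)";

  static String subtractTrashPrefix(String path) {
    return path.replaceFirst("^" + TRASH_PREFIX, "");
  }

  public static void main(String[] args) {
    // Using the paths from the test above: TRASH_ROOT + CURRENT + FILE_NS0.
    String trashed = "/user/test-trash/.Trash/Current/home/data/test/fileNs0";
    System.out.println(subtractTrashPrefix(trashed)); // -> /home/data/test/fileNs0
  }
}
```

With the prefix removed, the remainder (`/home/data/test/fileNs0`) resolves under the `/home/data` mount point again, which is why both review options revolve around where that stripping happens.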
[GitHub] [hadoop] hadoop-yetus commented on pull request #4246: HDFS-16540. Data locality is lost when DataNode pod restarts in kubernetes (#4170)
hadoop-yetus commented on PR #4246: URL: https://github.com/apache/hadoop/pull/4246#issuecomment-1127151370

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 40s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ branch-3.3 Compile Tests _ |
| +1 :green_heart: | mvninstall | 35m 58s | | branch-3.3 passed |
| +1 :green_heart: | compile | 1m 34s | | branch-3.3 passed |
| +1 :green_heart: | checkstyle | 1m 14s | | branch-3.3 passed |
| +1 :green_heart: | mvnsite | 1m 39s | | branch-3.3 passed |
| +1 :green_heart: | javadoc | 1m 52s | | branch-3.3 passed |
| +1 :green_heart: | spotbugs | 3m 35s | | branch-3.3 passed |
| +1 :green_heart: | shadedclient | 27m 7s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 20s | | the patch passed |
| +1 :green_heart: | compile | 1m 14s | | the patch passed |
| +1 :green_heart: | javac | 1m 14s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4246/14/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | checkstyle | 0m 48s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 20s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 33s | | the patch passed |
| +1 :green_heart: | spotbugs | 3m 17s | | the patch passed |
| +1 :green_heart: | shadedclient | 26m 19s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 190m 25s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4246/14/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 1m 14s | | The patch does not generate ASF License warnings. |
| | | 298m 53s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4246/14/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4246 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 8f80b78e9fdd 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3 / a50952249eecd4f1d1f0479f1ca1643c3dbec925 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4246/14/testReport/ |
| Max. process+thread count | 3224 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4246/14/console |
| versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] simbadzina opened a new pull request, #4311: HDFS-13522: IPC changes to support observer reads through routers.
simbadzina opened a new pull request, #4311: URL: https://github.com/apache/hadoop/pull/4311 ### Description of PR IPC changes so that clients send their transaction number to routers, and routers in turn forward it to the namenodes. The code that directs routers to the observer namenodes will come in a follow-up PR. ### How was this patch tested? Ran the unit tests in the module in IntelliJ. ### For code changes: - [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
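The core idea behind forwarding a client transaction number can be sketched with plain JDK types (these class and method names are illustrative assumptions, not Hadoop's actual IPC classes): the client remembers the newest namespace state id it has seen and attaches it to each call, and an intermediary such as a router only ever forwards it monotonically, so an observer can wait until it has caught up before serving the read.

```java
import java.util.concurrent.atomic.AtomicLong;

public class ObserverReadSketch {
  /** Tracks the newest namespace state (transaction) id a client has seen. */
  public static class ClientState {
    private final AtomicLong lastSeenStateId = new AtomicLong(0);

    public long get() { return lastSeenStateId.get(); }

    /** Keep the maximum: a stale or reordered response must not move the id backwards. */
    public void update(long serverStateId) {
      lastSeenStateId.accumulateAndGet(serverStateId, Math::max);
    }
  }

  public static void main(String[] args) {
    ClientState state = new ClientState();
    state.update(41);  // e.g. id returned by the active NN after a write
    state.update(40);  // a stale response arrives out of order
    System.out.println(state.get()); // prints 41: the id is monotone
  }
}
```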
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770617=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770617 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 22:20 Start Date: 15/May/22 22:20 Worklog Time Spent: 10m Work Description: slfan1989 commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r873235638 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ReconfigurationException.java: ## @@ -59,6 +59,10 @@ public ReconfigurationException() { /** * Create a new instance of {@link ReconfigurationException}. + * @param property property name Review Comment: I will fix it, thanks. Issue Time Tracking --- Worklog Id: (was: 770617) Time Spent: 10h 40m (was: 10.5h) > Fix Hadoop Common Java Doc Error > > > Key: HADOOP-18229 > URL: https://issues.apache.org/jira/browse/HADOOP-18229 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: fanshilun >Assignee: fanshilun >Priority: Major > Labels: pull-request-available > Time Spent: 10h 40m > Remaining Estimate: 0h > > I found that when hadoop-multibranch compiled PR-4266, some errors would pop > up, I tried to solve it > The wrong compilation information is as follows, I try to fix the Error > information > {code:java} > [ERROR] > /home/jenkins/jenkins-agent/workspace/hadoop-multibranch_PR-4266/ubuntu-focal/src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/InodeTree.java:432: > error: exception not thrown: java.io.IOException > [ERROR]* @throws IOException > [ERROR] ^ > [ERROR] > /home/jenkins/jenkins-agent/workspace/hadoop-multibranch_PR-4266/ubuntu-focal/src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/InodeTree.java:885: > error: unknown tag: username > [ERROR]* E.g. 
link: ^/user/(?\\w+) => > s3://$user.apache.com/_${user} > [ERROR] ^ > [ERROR] > /home/jenkins/jenkins-agent/workspace/hadoop-multibranch_PR-4266/ubuntu-focal/src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/InodeTree.java:885: > error: bad use of '>' > [ERROR]* E.g. link: ^/user/(?\\w+) => > s3://$user.apache.com/_${user} > [ERROR]^ > [ERROR] > /home/jenkins/jenkins-agent/workspace/hadoop-multibranch_PR-4266/ubuntu-focal/src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/viewfs/InodeTree.java:910: > error: unknown tag: username > [ERROR]* > .linkRegex.replaceresolveddstpath:_:-#.^/user/(?\w+) > {code}
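The "unknown tag" and "bad use of '>'" errors above come from the doclet trying to parse angle brackets in the regex as HTML. A minimal sketch of the kind of fix this calls for (the class, method, and regex here are hypothetical, not the actual InodeTree code): wrap the regex in `{@literal ...}` so `<username>` is not treated as an HTML tag, and only declare `@throws` for exceptions the method can actually throw.

```java
// Hypothetical example of a javadoc-clean link-regex resolution.
public class InodeTreeDocExample {
  /**
   * Resolves a link-regex target.
   * E.g. link: {@literal ^/user/(?<username>\w+)} maps the captured
   * username into the destination path.
   *
   * @param src source path to resolve
   * @return the resolved target path
   */
  public static String resolve(String src) {
    // ${username} in the replacement refers to the named capture group.
    return src.replaceAll("^/user/(?<username>\\w+)", "s3://${username}.example.com");
  }

  public static void main(String[] args) {
    System.out.println(resolve("/user/alice")); // -> s3://alice.example.com
  }
}
```

With `{@literal}`, javadoc renders the angle brackets verbatim instead of reporting `unknown tag: username`.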
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
slfan1989 commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r873235638 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ReconfigurationException.java: ## @@ -59,6 +59,10 @@ public ReconfigurationException() { /** * Create a new instance of {@link ReconfigurationException}. + * @param property property name Review Comment: I will fix it, thanks.
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770615=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770615 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 22:19 Start Date: 15/May/22 22:19 Worklog Time Spent: 10m Work Description: slfan1989 commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r873235500 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -2785,15 +2791,16 @@ public void setClass(String name, Class theClass, Class xface) { set(name, theClass.getName()); } - /** + /** * Get a local file under a directory named by dirsProp with * the given path. If dirsProp contains multiple directories, * then one is chosen based on path's hash code. If the selected * directory does not exist, an attempt is made to create it. - * + * * @param dirsProp directory in which to locate the file. - * @param path file-path. + * @param path file-path. Review Comment: this place shouldn't have extra spaces,i will fix it. 
Issue Time Tracking --- Worklog Id: (was: 770615) Time Spent: 10h 20m (was: 10h 10m)
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
slfan1989 commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r873235500 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -2785,15 +2791,16 @@ public void setClass(String name, Class theClass, Class xface) { set(name, theClass.getName()); } - /** + /** * Get a local file under a directory named by dirsProp with * the given path. If dirsProp contains multiple directories, * then one is chosen based on path's hash code. If the selected * directory does not exist, an attempt is made to create it. - * + * * @param dirsProp directory in which to locate the file. - * @param path file-path. + * @param path file-path. Review Comment: This place shouldn't have extra spaces; I will fix it. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770614&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770614 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 22:17 Start Date: 15/May/22 22:17 Worklog Time Spent: 10m Work Description: slfan1989 commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r873235371 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -3582,7 +3591,9 @@ public void writeXml(Writer out) throws IOException { * the configuration, this method throws an {@link IllegalArgumentException}. * * + * @param propertyName xml property name Review Comment: Thanks for your help reviewing the code, I will fix this. Issue Time Tracking --- Worklog Id: (was: 770614) Time Spent: 10h 10m (was: 10h)
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
slfan1989 commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r873235371 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -3582,7 +3591,9 @@ public void writeXml(Writer out) throws IOException { * the configuration, this method throws an {@link IllegalArgumentException}. * * + * @param propertyName xml property name Review Comment: Thanks for your help reviewing the code, I will fix this.
[GitHub] [hadoop] virajjasani closed pull request #4268: [Test] Release 3.3.3 RC testing
virajjasani closed pull request #4268: [Test] Release 3.3.3 RC testing URL: https://github.com/apache/hadoop/pull/4268
[jira] [Work logged] (HADOOP-18228) Update hadoop-vote to use HADOOP_RC_VERSION dir
[ https://issues.apache.org/jira/browse/HADOOP-18228?focusedWorklogId=770611&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770611 ] ASF GitHub Bot logged work on HADOOP-18228: --- Author: ASF GitHub Bot Created on: 15/May/22 20:20 Start Date: 15/May/22 20:20 Worklog Time Spent: 10m Work Description: virajjasani commented on PR #4272: URL: https://github.com/apache/hadoop/pull/4272#issuecomment-1127019376 @steveloughran @iwasakims @saintstack Could you please review this PR? Hadoop 3.3.3 RC0 and RC1 have both been verified against this change. Issue Time Tracking --- Worklog Id: (was: 770611) Time Spent: 0.5h (was: 20m) > Update hadoop-vote to use HADOOP_RC_VERSION dir > --- > > Key: HADOOP-18228 > URL: https://issues.apache.org/jira/browse/HADOOP-18228 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Minor > Labels: pull-request-available > Time Spent: 0.5h > Remaining Estimate: 0h > > The recent changes in the release script require a minor change in hadoop-vote > to use the Hadoop RC version dir before verifying signature and checksum of > .tar.gz files.
[GitHub] [hadoop] virajjasani commented on pull request #4272: HADOOP-18228. Update hadoop-vote to use HADOOP_RC_VERSION dir
virajjasani commented on PR #4272: URL: https://github.com/apache/hadoop/pull/4272#issuecomment-1127019376 @steveloughran @iwasakims @saintstack Could you please review this PR? Hadoop 3.3.3 RC0 and RC1 have both been verified against this change.
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770609&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770609 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 19:32 Start Date: 15/May/22 19:32 Worklog Time Spent: 10m Work Description: goiri commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r872617446 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -3582,7 +3591,9 @@ public void writeXml(Writer out) throws IOException { * the configuration, this method throws an {@link IllegalArgumentException}. * * + * @param propertyName xml property name Review Comment: Be consistent with the . at the end ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -2785,15 +2791,16 @@ public void setClass(String name, Class theClass, Class xface) { set(name, theClass.getName()); } - /** + /** * Get a local file under a directory named by dirsProp with * the given path. If dirsProp contains multiple directories, * then one is chosen based on path's hash code. If the selected * directory does not exist, an attempt is made to create it. - * + * * @param dirsProp directory in which to locate the file. - * @param path file-path. + * @param path file-path. Review Comment: Why the extra spaces? ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ReconfigurationException.java: ## @@ -59,6 +59,10 @@ public ReconfigurationException() { /** * Create a new instance of {@link ReconfigurationException}. + * @param property property name Review Comment: . consistency Issue Time Tracking --- Worklog Id: (was: 770609) Time Spent: 10h (was: 9h 50m)
[GitHub] [hadoop] goiri commented on a diff in pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
goiri commented on code in PR #4292: URL: https://github.com/apache/hadoop/pull/4292#discussion_r872617446 ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -3582,7 +3591,9 @@ public void writeXml(Writer out) throws IOException { * the configuration, this method throws an {@link IllegalArgumentException}. * * + * @param propertyName xml property name Review Comment: Be consistent with the . at the end ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java: ## @@ -2785,15 +2791,16 @@ public void setClass(String name, Class theClass, Class xface) { set(name, theClass.getName()); } - /** + /** * Get a local file under a directory named by dirsProp with * the given path. If dirsProp contains multiple directories, * then one is chosen based on path's hash code. If the selected * directory does not exist, an attempt is made to create it. - * + * * @param dirsProp directory in which to locate the file. - * @param path file-path. + * @param path file-path. Review Comment: Why the extra spaces? ## hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/ReconfigurationException.java: ## @@ -59,6 +59,10 @@ public ReconfigurationException() { /** * Create a new instance of {@link ReconfigurationException}. + * @param property property name Review Comment: . consistency
[GitHub] [hadoop] goiri commented on a diff in pull request #4269: HDFS-16570 RBF: The router using MultipleDestinationMountTableResolve…
goiri commented on code in PR #4269: URL: https://github.com/apache/hadoop/pull/4269#discussion_r873212041 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/test/java/org/apache/hadoop/hdfs/server/federation/router/TestRouterTrashMultipleDestinationMountTableResolver.java: ## @@ -0,0 +1,196 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.hdfs.server.federation.router; + +import java.io.IOException; +import java.net.URISyntaxException; +import java.util.HashMap; +import java.util.Map; + +import org.junit.AfterClass; +import org.junit.BeforeClass; +import org.junit.Test; + +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FileStatus; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.fs.Trash; +import org.apache.hadoop.fs.permission.FsPermission; +import org.apache.hadoop.hdfs.DFSClient; +import org.apache.hadoop.hdfs.DFSConfigKeys; +import org.apache.hadoop.hdfs.DFSTestUtil; +import org.apache.hadoop.hdfs.server.federation.MiniRouterDFSCluster; +import org.apache.hadoop.hdfs.server.federation.RouterConfigBuilder; +import org.apache.hadoop.hdfs.server.federation.StateStoreDFSCluster; +import org.apache.hadoop.hdfs.server.federation.resolver.MountTableManager; +import org.apache.hadoop.hdfs.server.federation.resolver.MountTableResolver; +import org.apache.hadoop.hdfs.server.federation.resolver.MultipleDestinationMountTableResolver; +import org.apache.hadoop.hdfs.server.federation.resolver.order.DestinationOrder; +import org.apache.hadoop.hdfs.server.federation.store.protocol.AddMountTableEntryRequest; +import org.apache.hadoop.hdfs.server.federation.store.protocol.AddMountTableEntryResponse; +import org.apache.hadoop.hdfs.server.federation.store.records.MountTable; +import org.apache.hadoop.security.UserGroupInformation; + +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertTrue; + +/** + * This is a test through the Router move data to the Trash with + * MultipleDestinationMountTableResolver. 
+ */ +public class TestRouterTrashMultipleDestinationMountTableResolver { + + private static StateStoreDFSCluster cluster; + private static MiniRouterDFSCluster.RouterContext routerContext; + private static MountTableResolver resolver; + private static MiniRouterDFSCluster.NamenodeContext nnContextNs0; + private static MiniRouterDFSCluster.NamenodeContext nnContextNs1; + private static FileSystem nnFsNs0; + private static FileSystem nnFsNs1; + + private static String ns0; + private static String ns1; + private static final String TEST_USER = "test-trash"; + private static final String MOUNT_POINT = "/home/data"; + private static final String MOUNT_POINT_CHILD_DIR = MOUNT_POINT + "/test"; + private static final String FILE_NS0 = MOUNT_POINT_CHILD_DIR + "/fileNs0"; + private static final String FILE_NS1 = MOUNT_POINT_CHILD_DIR + "/fileNs1"; + private static final String TRASH_ROOT = "/user/" + TEST_USER + "/.Trash"; + private static final String CURRENT = "/Current"; + + @BeforeClass + public static void globalSetUp() throws Exception { +// Build and start a federated cluster +cluster = new StateStoreDFSCluster(false, 2, +MultipleDestinationMountTableResolver.class); +Configuration routerConf = +new RouterConfigBuilder().stateStore().admin().quota().rpc().build(); + +Configuration hdfsConf = new Configuration(false); +hdfsConf.setBoolean(DFSConfigKeys.DFS_NAMENODE_ACLS_ENABLED_KEY, true); +hdfsConf.set("fs.trash.interval", "1440"); Review Comment: setInt? and 24 * 60 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RouterRpcServer.java: ## @@ -1896,6 +1897,9 @@ public FederationRPCMetrics getRPCMetrics() { boolean isPathAll(final String path) { if (subclusterResolver instanceof MountTableResolver) { try { +if(isTrashPath(path)){ + return true; Review Comment: Is that the case? Add a comment justifying. ##
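The reviewer's "setInt? and 24 * 60" suggestion on the trash-interval line above would look roughly like the fragment below. This is a sketch of the review suggestion, not the merged code; it assumes the `FS_TRASH_INTERVAL_KEY` constant ("fs.trash.interval") from `CommonConfigurationKeysPublic` and the `hdfsConf` object built in the test's `globalSetUp`:

```java
// Typed setter with a computed value instead of a raw string,
// per the review comment: 24 * 60 minutes = one day of trash retention.
hdfsConf.setInt(CommonConfigurationKeysPublic.FS_TRASH_INTERVAL_KEY, 24 * 60);
```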
[jira] [Work logged] (HADOOP-18224) Upgrade maven compiler plugin to 3.10.1 and maven javadoc plugin to 3.4.0
[ https://issues.apache.org/jira/browse/HADOOP-18224?focusedWorklogId=770604&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770604 ] ASF GitHub Bot logged work on HADOOP-18224: --- Author: ASF GitHub Bot Created on: 15/May/22 18:42 Start Date: 15/May/22 18:42 Worklog Time Spent: 10m Work Description: virajjasani commented on PR #4267: URL: https://github.com/apache/hadoop/pull/4267#issuecomment-1126994284 > Hi, @[aajisaka](https://github.com/aajisaka) I am fixing the java doc problem of hadoop-common, it is expected to be fixed in a few days (2-3days) 4292-pr #4292 @slfan1989 Both PRs (#4292 and the current PR #4267) serve different purposes. For this PR, we are bumping the maven compiler and javadoc plugins to avoid pulling in vulnerable log4j dependencies (also, the plugin versions we are using are almost a decade old). As part of the javadoc plugin upgrade, we are seeing new javadoc errors, whereas on PR #4292, several existing Javadoc errors are being resolved, which is great. But once this PR gets in, we would see a few more errors for both the Java 8 and 11 builds. > Maybe the tag check become more strict. We can fix them in separate issues. I agree with @aajisaka that these new errors should be fixed in separate Jiras, and that the tag checks have become stricter and are surfacing new Javadoc errors. If this PR gets merged first and we then retrigger the Jenkins build on PR #4292, we will see new errors.
Issue Time Tracking --- Worklog Id: (was: 770604) Time Spent: 4h (was: 3h 50m) > Upgrade maven compiler plugin to 3.10.1 and maven javadoc plugin to 3.4.0 > - > > Key: HADOOP-18224 > URL: https://issues.apache.org/jira/browse/HADOOP-18224 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 4h > Remaining Estimate: 0h > > Currently we are using maven-compiler-plugin 3.1 version, which is quite old > (2013) and it's also pulling in vulnerable log4j dependency: > {code:java} > [INFO] > org.apache.maven.plugins:maven-compiler-plugin:maven-plugin:3.1:runtime > [INFO] org.apache.maven.plugins:maven-compiler-plugin:jar:3.1 > [INFO] org.apache.maven:maven-plugin-api:jar:2.0.9 > [INFO] org.apache.maven:maven-artifact:jar:2.0.9 > [INFO] org.codehaus.plexus:plexus-utils:jar:1.5.1 > [INFO] org.apache.maven:maven-core:jar:2.0.9 > [INFO] org.apache.maven:maven-settings:jar:2.0.9 > [INFO] org.apache.maven:maven-plugin-parameter-documenter:jar:2.0.9 > ... > ... > ... > [INFO] log4j:log4j:jar:1.2.12 > [INFO] commons-logging:commons-logging-api:jar:1.1 > [INFO] com.google.collections:google-collections:jar:1.0 > [INFO] junit:junit:jar:3.8.2 > {code} > > We should upgrade to 3.10.1 (latest Mar, 2022) version of > maven-compiler-plugin.
[GitHub] [hadoop] virajjasani commented on pull request #4267: HADOOP-18224. Upgrade maven compiler plugin to 3.10.1 and maven javadoc plugin to 3.4.0
virajjasani commented on PR #4267: URL: https://github.com/apache/hadoop/pull/4267#issuecomment-1126994284 > Hi, @[aajisaka](https://github.com/aajisaka) I am fixing the java doc problem of hadoop-common, it is expected to be fixed in a few days (2-3days) 4292-pr #4292 @slfan1989 Both PRs (#4292 and the current PR #4267) serve different purposes. For this PR, we are bumping the maven compiler and javadoc plugins to avoid pulling in vulnerable log4j dependencies (also, the plugin versions we are using are almost a decade old). As part of the javadoc plugin upgrade, we are seeing new javadoc errors, whereas on PR #4292, several existing Javadoc errors are being resolved, which is great. But once this PR gets in, we would see a few more errors for both the Java 8 and 11 builds. > Maybe the tag check become more strict. We can fix them in separate issues. I agree with @aajisaka that these new errors should be fixed in separate Jiras, and that the tag checks have become stricter and are surfacing new Javadoc errors. If this PR gets merged first and we then retrigger the Jenkins build on PR #4292, we will see new errors.
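The plugin bump discussed above would land in the build as something like the following pom fragment. This is a sketch of the kind of change involved, not the exact diff from PR #4267; Hadoop's real parent pom manages these versions through properties, which may differ:

```xml
<!-- Hypothetical pom.xml fragment pinning the plugin versions discussed above. -->
<pluginManagement>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.10.1</version>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <version>3.4.0</version>
    </plugin>
  </plugins>
</pluginManagement>
```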
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770600=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770600 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 17:38 Start Date: 15/May/22 17:38 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4292: URL: https://github.com/apache/hadoop/pull/4292#issuecomment-1126984474 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 17s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 6s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 38m 47s | | trunk passed | | +1 :green_heart: | compile | 26m 36s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 23m 0s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 2m 14s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 8s | | trunk passed | | -1 :x: | javadoc | 1m 49s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. 
| | +1 :green_heart: | javadoc | 2m 24s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 3m 20s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 21s | | branch has no errors when building and testing our client artifacts. | | -0 :warning: | patch | 23m 45s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 1m 10s | | the patch passed | | +1 :green_heart: | compile | 24m 34s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 24m 34s | | the patch passed | | +1 :green_heart: | compile | 20m 53s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 20m 53s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 2m 12s | | hadoop-common-project/hadoop-common: The patch generated 0 new + 3340 unchanged - 88 fixed = 3340 total (was 3428) | | +1 :green_heart: | mvnsite | 2m 11s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. 
| | -1 :x: | javadoc | 1m 41s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 595 new + 44 unchanged - 62 fixed = 639 total (was 106) | | -1 :x: | javadoc | 1m 19s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. | | +1 :green_heart: | spotbugs | 3m 9s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 43s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 18m 48s | | hadoop-common in the patch
[GitHub] [hadoop] hadoop-yetus commented on pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
hadoop-yetus commented on PR #4292: URL: https://github.com/apache/hadoop/pull/4292#issuecomment-1126984474

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 17s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 6s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 38m 47s | | trunk passed |
| +1 :green_heart: | compile | 26m 36s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 23m 0s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 2m 14s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 8s | | trunk passed |
| -1 :x: | javadoc | 1m 49s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| +1 :green_heart: | javadoc | 2m 24s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 20s | | trunk passed |
| +1 :green_heart: | shadedclient | 23m 21s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 23m 45s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 1m 10s | | the patch passed |
| +1 :green_heart: | compile | 24m 34s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 24m 34s | | the patch passed |
| +1 :green_heart: | compile | 20m 53s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 20m 53s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 2m 12s | | hadoop-common-project/hadoop-common: The patch generated 0 new + 3340 unchanged - 88 fixed = 3340 total (was 3428) |
| +1 :green_heart: | mvnsite | 2m 11s | | the patch passed |
| +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. |
| -1 :x: | javadoc | 1m 41s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 595 new + 44 unchanged - 62 fixed = 639 total (was 106) |
| -1 :x: | javadoc | 1m 19s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| +1 :green_heart: | spotbugs | 3m 9s | | the patch passed |
| +1 :green_heart: | shadedclient | 23m 43s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 18m 48s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 34s | | The patch does not generate ASF License warnings. |
| | | 229m 0s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/45/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4292 |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4280: YARN-11133. YarnClient gets the wrong EffectiveMinCapacity value
hadoop-yetus commented on PR #4280: URL: https://github.com/apache/hadoop/pull/4280#issuecomment-1126978424

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 9s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 39m 59s | | trunk passed |
| +1 :green_heart: | compile | 1m 1s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 54s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 50s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 59s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 7s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 56s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 5s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 41s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 0m 42s | | the patch passed |
| +1 :green_heart: | compile | 0m 46s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 0m 46s | | the patch passed |
| +1 :green_heart: | compile | 0m 41s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 41s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 30s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 43s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 44s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 43s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 48s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 9s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 4m 45s | | hadoop-yarn-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. |
| | | 109m 45s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4280 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 77fe62aa5776 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 9dc0ed876863a3f7c3c3a3c4a25b16cbf10efdea |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/2/testReport/ |
| Max. process+thread count | 525 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/2/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4280: YARN-11133. YarnClient gets the wrong EffectiveMinCapacity value
hadoop-yetus commented on PR #4280: URL: https://github.com/apache/hadoop/pull/4280#issuecomment-1126976903

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 52s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 40m 15s | | trunk passed |
| +1 :green_heart: | compile | 0m 58s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 52s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 48s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 57s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 4s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 53s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 4s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 14s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| -1 :x: | mvninstall | 0m 39s | [/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. |
| -1 :x: | compile | 0m 45s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | javac | 0m 45s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | compile | 0m 40s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | javac | 0m 40s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | checkstyle | 0m 29s | | the patch passed |
| -1 :x: | mvnsite | 0m 42s | [/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/4/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. |
| +1 :green_heart: | javadoc | 0m 43s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 42s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| -1 :x: | spotbugs | 0m 40s |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4280: YARN-11133. YarnClient gets the wrong EffectiveMinCapacity value
hadoop-yetus commented on PR #4280: URL: https://github.com/apache/hadoop/pull/4280#issuecomment-1126975090

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 44s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 38m 40s | | trunk passed |
| +1 :green_heart: | compile | 0m 47s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 44s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 39s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 47s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 56s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 45s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 54s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 37s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| -1 :x: | mvninstall | 0m 37s | [/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/patch-mvninstall-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. |
| -1 :x: | compile | 0m 43s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | javac | 0m 43s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| -1 :x: | compile | 0m 37s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | javac | 0m 37s | [/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/patch-compile-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | checkstyle | 0m 26s | | the patch passed |
| -1 :x: | mvnsite | 0m 38s | [/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4280/3/artifact/out/patch-mvnsite-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt) | hadoop-yarn-common in the patch failed. |
| +1 :green_heart: | javadoc | 0m 38s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 39s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| -1 :x: | spotbugs | 0m 34s |
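The `blanks` failures in the two reports above tell the contributor to re-apply the patch with `git apply --whitespace=fix` so that trailing blanks are stripped. A minimal self-contained sketch of that workflow (the repository, file, and patch names are made up for illustration):

```shell
# Demonstrate git apply --whitespace=fix: produce a patch whose added
# line ends in trailing blanks, then re-apply it with the fix enabled.
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m init
printf 'clean\n' > file.txt
git add file.txt
git -c user.email=ci@example.com -c user.name=ci commit -qm base
printf 'clean\nedited   \n' > file.txt   # the new line ends in blanks
git diff > fix.patch                     # the patch now carries the blanks
git checkout -- file.txt                 # discard the dirty working copy
git apply --whitespace=fix fix.patch     # re-apply; trailing blanks stripped
grep -n ' $' file.txt || echo 'no trailing blanks left'
```

After this, `file.txt` contains the `edited` line without the trailing spaces, so a fresh `git diff` would no longer trip the blanks-eol check.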
[GitHub] [hadoop] goiri merged pull request #4274: YARN-11122. Support getClusterNodes API in FederationClientInterceptor
goiri merged PR #4274: URL: https://github.com/apache/hadoop/pull/4274
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770590&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770590 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 16:14 Start Date: 15/May/22 16:14 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4292: URL: https://github.com/apache/hadoop/pull/4292#issuecomment-1126972145
[GitHub] [hadoop] hadoop-yetus commented on pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
hadoop-yetus commented on PR #4292: URL: https://github.com/apache/hadoop/pull/4292#issuecomment-1126972145

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 6s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 8s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 37m 31s | | trunk passed |
| +1 :green_heart: | compile | 23m 43s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 21m 15s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 2m 15s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 10s | | trunk passed |
| -1 :x: | javadoc | 1m 43s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/44/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| +1 :green_heart: | javadoc | 2m 18s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 15s | | trunk passed |
| +1 :green_heart: | shadedclient | 23m 48s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 24m 15s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 1m 5s | | the patch passed |
| +1 :green_heart: | compile | 25m 54s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 25m 54s | | the patch passed |
| +1 :green_heart: | compile | 23m 17s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 23m 17s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 57s | | hadoop-common-project/hadoop-common: The patch generated 0 new + 3253 unchanged - 86 fixed = 3253 total (was 3339) |
| +1 :green_heart: | mvnsite | 2m 8s | | the patch passed |
| +1 :green_heart: | xml | 0m 2s | | The patch has no ill-formed XML file. |
| -1 :x: | javadoc | 1m 34s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/44/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 688 new + 44 unchanged - 62 fixed = 732 total (was 106) |
| -1 :x: | javadoc | 1m 11s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/44/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-common in the patch failed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07. |
| +1 :green_heart: | spotbugs | 3m 6s | | the patch passed |
| +1 :green_heart: | shadedclient | 23m 16s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 19m 35s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 23s | | The patch does not generate ASF License warnings. |
| | | 225m 44s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/44/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4292 |
[GitHub] [hadoop] zhuzilong2013 commented on pull request #4280: YARN-11133. YarnClient gets the wrong EffectiveMinCapacity value
zhuzilong2013 commented on PR #4280: URL: https://github.com/apache/hadoop/pull/4280#issuecomment-1126963902 @hemanthboyina Thanks for your patient guidance. I have extended a unit test for my change. Please review it. Thanks.
[GitHub] [hadoop] tomscut commented on pull request #4310: HDFS-16579. Fix build failure for TestBlockManager on branch-3.2
tomscut commented on PR #4310: URL: https://github.com/apache/hadoop/pull/4310#issuecomment-1126927642 Thanks @tasanuma for merging this.
[jira] [Work logged] (HADOOP-18229) Fix Hadoop Common Java Doc Error
[ https://issues.apache.org/jira/browse/HADOOP-18229?focusedWorklogId=770580&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-770580 ] ASF GitHub Bot logged work on HADOOP-18229: --- Author: ASF GitHub Bot Created on: 15/May/22 12:12 Start Date: 15/May/22 12:12 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4292: URL: https://github.com/apache/hadoop/pull/4292#issuecomment-1126925259
[GitHub] [hadoop] hadoop-yetus commented on pull request #4292: HADOOP-18229. Fix Hadoop-Common JavaDoc Error
hadoop-yetus commented on PR #4292: URL: https://github.com/apache/hadoop/pull/4292#issuecomment-1126925259

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 2s | | Docker mode activated. |
| | | | | _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 8s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| | | | | _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 38m 13s | | trunk passed |
| +1 :green_heart: | compile | 23m 25s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 20m 52s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 2m 17s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 13s | | trunk passed |
| -1 :x: | javadoc | 1m 49s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/43/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common in trunk failed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1. |
| +1 :green_heart: | javadoc | 2m 17s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 17s | | trunk passed |
| +1 :green_heart: | shadedclient | 23m 37s | | branch has no errors when building and testing our client artifacts. |
| -0 :warning: | patch | 24m 8s | | Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary. |
| | | | | _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 6s | | the patch passed |
| +1 :green_heart: | compile | 22m 36s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 22m 36s | | the patch passed |
| +1 :green_heart: | compile | 20m 52s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 20m 52s | | the patch passed |
| +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 2m 10s | | hadoop-common-project/hadoop-common: The patch generated 0 new + 3246 unchanged - 82 fixed = 3246 total (was 3328) |
| +1 :green_heart: | mvnsite | 2m 11s | | the patch passed |
| -1 :x: | javadoc | 1m 42s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/43/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 94 new + 5 unchanged - 101 fixed = 99 total (was 106) |
| +1 :green_heart: | javadoc | 2m 17s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 12s | | the patch passed |
| +1 :green_heart: | shadedclient | 23m 59s | | patch has no errors when building and testing our client artifacts. |
| | | | | _ Other Tests _ |
| +1 :green_heart: | unit | 18m 38s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 36s | | The patch does not generate ASF License warnings. |
| | | | 222m 11s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4292/43/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4292 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 72c9429cc5b1 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / a21f0119815047c81c5c6e9044e3b79a876349bb |
[GitHub] [hadoop] tasanuma commented on pull request #4310: HDFS-16579. Fix build failure for TestBlockManager on branch-3.2
tasanuma commented on PR #4310: URL: https://github.com/apache/hadoop/pull/4310#issuecomment-1126890311 Thanks for fixing it quickly, @tomscut. Thanks for reporting the issue and your review, @aajisaka.
[GitHub] [hadoop] tasanuma merged pull request #4310: HDFS-16579. Fix build failure for TestBlockManager on branch-3.2
tasanuma merged PR #4310: URL: https://github.com/apache/hadoop/pull/4310
[GitHub] [hadoop] rishabh1704 commented on a diff in pull request #4287: HDFS-16561 : Fixes the bug where strtol() returns error in chmod in hdfs_native tools
rishabh1704 commented on code in PR #4287: URL: https://github.com/apache/hadoop/pull/4287#discussion_r873132445

## hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tools/hdfs-chmod/hdfs-chmod.cc:

@@ -140,8 +140,17 @@ bool Chmod::HandlePath(const std::string , const bool recursive,
   /*
    * strtol is reading the value with base 8, NULL because we are reading in
    * just one value.
+   *
+   * The strtol function may result in errors, so check for that before typecasting.
    */
-  auto perm = static_cast(strtol(permissions.c_str(), nullptr, 8));
+  errno = 0;
+  long result = strtol(permissions.c_str(), nullptr, 8);
+  /*
+   * errno is set to ERANGE in case the string doesn't fit in a long.
+   * Also, the result is set to 0 in case conversion is not possible.
+   */
+  if ((errno == ERANGE) || result == 0) return false;

Review Comment: Sure, I will handle that case as well.
[GitHub] [hadoop] GauthamBanasandra commented on a diff in pull request #4287: HDFS-16561 : Fixes the bug where strtol() returns error in chmod in hdfs_native tools
GauthamBanasandra commented on code in PR #4287: URL: https://github.com/apache/hadoop/pull/4287#discussion_r873128972

## hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tools/hdfs-chmod/hdfs-chmod.cc:

@@ -140,8 +140,17 @@ bool Chmod::HandlePath(const std::string , const bool recursive,
   /*
    * strtol is reading the value with base 8, NULL because we are reading in
    * just one value.
+   *
+   * The strtol function may result in errors, so check for that before typecasting.
    */
-  auto perm = static_cast(strtol(permissions.c_str(), nullptr, 8));
+  errno = 0;
+  long result = strtol(permissions.c_str(), nullptr, 8);
+  /*
+   * errno is set to ERANGE in case the string doesn't fit in a long.
+   * Also, the result is set to 0 in case conversion is not possible.
+   */
+  if ((errno == ERANGE) || result == 0) return false;

Review Comment: @rishabh1704, this would yield `false` even for those cases where the `permissions` passed was 0. Please explore how this can be handled.
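[Editor's note] The review comment points out that treating `result == 0` as a conversion failure also rejects a legitimate permission string of `"0"`. A minimal standalone sketch of one way to disambiguate, using `strtol`'s end-pointer argument instead of the return value alone — the function name `ParseOctalPermissions` and the `07777` mode bound are illustrative assumptions, not code from the PR:

```cpp
#include <cerrno>
#include <cstdlib>
#include <string>

// Illustrative helper (not from the PR): parse an octal permission
// string with strtol, distinguishing "conversion failed" from a
// genuine value of 0 by inspecting the end pointer.
bool ParseOctalPermissions(const std::string &permissions, long *out) {
  if (permissions.empty()) {
    return false;
  }
  errno = 0;
  char *end = nullptr;
  const long result = std::strtol(permissions.c_str(), &end, 8);

  // ERANGE signals overflow; end == c_str() means no octal digits were
  // consumed (e.g. "zzz"); *end != '\0' means trailing junk (e.g. "75x").
  if (errno == ERANGE || end == permissions.c_str() || *end != '\0') {
    return false;
  }
  // Assumed bound: POSIX mode bits fit in 07777.
  if (result < 0 || result > 07777) {
    return false;
  }
  *out = result;
  return true;
}
```

With this shape, `"0"` parses successfully to the value 0, while strings with no valid octal digits (such as `"999"`) or trailing characters (such as `"75x"`) are rejected.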