[GitHub] [hadoop] Likkey closed pull request #4641: HDFS-16697.Randomly setting “dfs.namenode.resource.checked.volumes.minimum” will always prevent safe mode from being turned off
Likkey closed pull request #4641: HDFS-16697.Randomly setting “dfs.namenode.resource.checked.volumes.minimum” will always prevent safe mode from being turned off URL: https://github.com/apache/hadoop/pull/4641 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] Likkey opened a new pull request, #4649: HDFS-16697.Randomly setting “dfs.namenode.resource.checked.volumes.minimum” will always prevent safe mode from being turned off
Likkey opened a new pull request, #4649: URL: https://github.com/apache/hadoop/pull/4649

### Description of PR

Add Preconditions.checkArgument() for minimumRedundantVolumes to ensure that the value does not exceed the number of NameNode storage volumes; otherwise safe mode can never be turned off afterwards. JIRA: [HDFS-16697](https://issues.apache.org/jira/browse/HDFS-16697)

### How was this patch tested?

It was found that “dfs.namenode.resource.checked.volumes.minimum” lacks a condition check and an associated exception-handling mechanism, which makes it impossible to find the root cause when a misconfiguration occurs. This patch adds a check of the configuration item: it throws an IllegalArgumentException with a detailed error message when the value of "dfs.namenode.resource.checked.volumes.minimum" is set greater than the number of NameNode storage volumes, preventing the misconfiguration from affecting subsequent operation of the program.
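The proposed fail-fast validation can be sketched as follows. This is a minimal, dependency-free illustration with names of our own choosing, not the actual NameNode resource-checker code from the PR:

```java
// Minimal sketch of the proposed validation (illustrative names, not the
// actual Hadoop code): reject a configured minimum that exceeds the number
// of redundant volumes, since such a value could never be satisfied and
// safe mode would never be left.
public class ResourceCheckSketch {
    static void checkMinimumRedundantVolumes(int minimumRedundantVolumes,
                                             int redundantVolumeCount) {
        if (minimumRedundantVolumes > redundantVolumeCount) {
            throw new IllegalArgumentException(
                "dfs.namenode.resource.checked.volumes.minimum ("
                    + minimumRedundantVolumes + ") must not exceed the number of "
                    + "redundant NameNode storage volumes (" + redundantVolumeCount + ")");
        }
    }

    public static void main(String[] args) {
        checkMinimumRedundantVolumes(1, 2);      // valid configuration: no exception
        try {
            checkMinimumRedundantVolumes(3, 2);  // misconfiguration: fails fast
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Failing fast at configuration-load time with a message naming the key makes the root cause obvious, instead of the NameNode silently staying in safe mode.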
[GitHub] [hadoop] slfan1989 commented on pull request #4559: HDFS-16658. Change logLevel from DEBUG to INFO if logEveryBlock is true
slfan1989 commented on PR #4559: URL: https://github.com/apache/hadoop/pull/4559#issuecomment-1197684967 > The log size is controllable because `logEveryBlock` will control it. And although it's a block-level log, it only printed the log when the replica of the block is changed. > > Maybe we should think about its use. This log is very helpful for us to locate some abnormal case about replica of block, such as complete failure, missing block, etc... Thanks for your explanation, I understand your changes.
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics
slfan1989 commented on code in PR #4606: URL: https://github.com/apache/hadoop/pull/4606#discussion_r931795987

## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/metrics/RBFMetrics.java:

```diff
@@ -537,35 +547,34 @@ public int getNumEnteringMaintenanceDataNodes() {
   @Override // NameNodeMXBean
   public String getNodeUsage() {
-    float median = 0;
-    float max = 0;
-    float min = 0;
-    float dev = 0;
+    double median = 0;
+    double max = 0;
+    double min = 0;
+    double dev = 0;
     final Map> info = new HashMap<>();
     try {
-      RouterRpcServer rpcServer = this.router.getRpcServer();
-      DatanodeInfo[] live = rpcServer.getDatanodeReport(
-          DatanodeReportType.LIVE, false, timeOut);
+      DatanodeInfo[] live = null;
+      if (this.enableGetDNUsage) {
+        RouterRpcServer rpcServer = this.router.getRpcServer();
+        live = rpcServer.getDatanodeReport(DatanodeReportType.LIVE, false, timeOut);
+      } else {
+        LOG.debug("Getting node usage is disabled.");
+      }
-      if (live.length > 0) {
-        float totalDfsUsed = 0;
-        float[] usages = new float[live.length];
+      if (live != null && live.length > 0) {
+        double[] usages = new double[live.length];
         int i = 0;
         for (DatanodeInfo dn : live) {
           usages[i++] = dn.getDfsUsedPercent();
-          totalDfsUsed += dn.getDfsUsedPercent();
         }
-        totalDfsUsed /= live.length;
         Arrays.sort(usages);
         median = usages[usages.length / 2];
         max = usages[usages.length - 1];
         min = usages[0];
-        for (i = 0; i < usages.length; i++) {
-          dev += (usages[i] - totalDfsUsed) * (usages[i] - totalDfsUsed);
-        }
-        dev = (float) Math.sqrt(dev / usages.length);
+        StandardDeviation deviation = new StandardDeviation();
+        dev = deviation.evaluate(usages);
       }
     } catch (IOException e) {
       LOG.error("Cannot get the live nodes: {}", e.getMessage());
```

Review Comment: I feel it would be better this way.
```
LOG.error("Cannot get the live nodes.", e);
```
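The statistics in the snippet above can be reproduced in plain Java. This is our own simplified, dependency-free sketch, not the RBFMetrics code; note that commons-math's `StandardDeviation`, which the patch switches to, computes the bias-corrected *sample* deviation by default, a subtle difference from the population formula of the removed hand-rolled loop:

```java
import java.util.Arrays;

// Simplified sketch (ours, not RBFMetrics) of the node-usage statistics:
// median, min, max, and the population standard deviation that the removed
// hand-rolled loop computed.
public class NodeUsageStats {
    static double[] summarize(double[] usagePercents) {
        double[] usages = usagePercents.clone();
        Arrays.sort(usages);
        double median = usages[usages.length / 2];
        double min = usages[0];
        double max = usages[usages.length - 1];
        double mean = 0;
        for (double u : usages) {
            mean += u;
        }
        mean /= usages.length;
        double dev = 0;
        for (double u : usages) {
            dev += (u - mean) * (u - mean);
        }
        dev = Math.sqrt(dev / usages.length);  // population deviation, as in the old loop
        return new double[] {median, min, max, dev};
    }

    public static void main(String[] args) {
        double[] s = summarize(new double[] {10.0, 20.0, 30.0, 40.0});
        System.out.printf("median=%.1f min=%.1f max=%.1f dev=%.3f%n",
            s[0], s[1], s[2], s[3]);
    }
}
```

With four DataNodes at 10/20/30/40 percent used, the population deviation is sqrt(125) ≈ 11.180, while a bias-corrected sample deviation would be ≈ 12.910, so the patch slightly changes the reported number unless the deviation object is constructed as non-bias-corrected.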
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4597: HDFS-16671. RBF: RouterRpcFairnessPolicyController supports configurable permit acquire timeout
slfan1989 commented on code in PR #4597: URL: https://github.com/apache/hadoop/pull/4597#discussion_r931794170 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/resources/hdfs-rbf-default.xml: ## @@ -723,6 +723,14 @@ + +dfs.federation.router.fairness.acquire.timeout +1s Review Comment: I see, your configuration is accurate.
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4597: HDFS-16671. RBF: RouterRpcFairnessPolicyController supports configurable permit acquire timeout
slfan1989 commented on code in PR #4597: URL: https://github.com/apache/hadoop/pull/4597#discussion_r931792540 ## hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/resources/hdfs-rbf-default.xml: ## @@ -723,6 +723,14 @@ + +dfs.federation.router.fairness.acquire.timeout +1s Review Comment: Is this configuration confirmed to be like this? I feel the following is correct ``` 1 ```
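The suffixed form under discussion ("1s") is the time-duration style of value rather than a bare number. As a toy illustration only (this is not Hadoop's Configuration.getTimeDuration implementation), a parser accepting both styles can look like this:

```java
import java.util.concurrent.TimeUnit;

// Toy illustration (not Hadoop's actual configuration parser) of accepting a
// unit-suffixed duration such as "1s" for
// dfs.federation.router.fairness.acquire.timeout, alongside bare numbers.
public class DurationParseSketch {
    static long toMillis(String value) {
        String v = value.trim();
        if (v.endsWith("ms")) {                 // check "ms" before the bare "s" suffix
            return Long.parseLong(v.substring(0, v.length() - 2));
        }
        if (v.endsWith("s")) {
            return TimeUnit.SECONDS.toMillis(Long.parseLong(v.substring(0, v.length() - 1)));
        }
        if (v.endsWith("m")) {
            return TimeUnit.MINUTES.toMillis(Long.parseLong(v.substring(0, v.length() - 1)));
        }
        return Long.parseLong(v);               // no suffix: treat as milliseconds
    }

    public static void main(String[] args) {
        System.out.println(toMillis("1s"));     // 1000
        System.out.println(toMillis("500ms"));  // 500
        System.out.println(toMillis("250"));    // 250
    }
}
```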
[GitHub] [hadoop] hadoop-yetus commented on pull request #4403: MAPREDUCE-7385. improve JobEndNotifier#httpNotification With recommended methods
hadoop-yetus commented on PR #4403: URL: https://github.com/apache/hadoop/pull/4403#issuecomment-1197668581

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 1m 12s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 41m 52s | | trunk passed |
| +1 :green_heart: | compile | 0m 57s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 52s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 52s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 59s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 46s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 37s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 47s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 17s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 41s | | the patch passed |
| +1 :green_heart: | compile | 0m 42s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 0m 42s | | hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 0 new + 102 unchanged - 7 fixed = 102 total (was 109) |
| +1 :green_heart: | compile | 0m 39s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 39s | | hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 0 new + 96 unchanged - 7 fixed = 96 total (was 103) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 33s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 43s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 24s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 23s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 32s | | the patch passed |
| +1 :green_heart: | shadedclient | 23m 32s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 7m 18s | | hadoop-mapreduce-client-core in the patch passed. |
| +1 :green_heart: | asflicense | 0m 43s | | The patch does not generate ASF License warnings. |
| | | 112m 3s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4403/4/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4403 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 77b71d4856b9 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / fc984de8e6e262f425dcbd4c96e46d496184ac77 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4403/4/testReport/ |
| Max. process+thread count | (vs. ulimit of 5500) |
| modules | C: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core U: |
[jira] [Work logged] (HADOOP-18362) Running org.apache.hadoop.ha.TestZKFailoverController when "hadoop.security.groups.cache.secs" is zero or negative numbers will throw ambiguous exception
[ https://issues.apache.org/jira/browse/HADOOP-18362?focusedWorklogId=795905=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795905 ] ASF GitHub Bot logged work on HADOOP-18362: --- Author: ASF GitHub Bot Created on: 28/Jul/22 04:27 Start Date: 28/Jul/22 04:27 Worklog Time Spent: 10m Work Description: MEILIDEKCL opened a new pull request, #4648: URL: https://github.com/apache/hadoop/pull/4648 ### Description of PR We modified Groups.java to check the cache validity time manually in its initialization function and to throw the exception at an earlier point. What ZKFailoverController.java surfaces is only the cause of the original exception (RuntimeException.getCause()). The cache validity time is also used when creating the CacheBuilder, but the validation inside the CacheBuilder class is incomplete, so the resulting error message is ambiguous. ### How was this patch tested? ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
Issue Time Tracking --- Worklog Id: (was: 795905) Time Spent: 3h 10m (was: 3h) > Running org.apache.hadoop.ha.TestZKFailoverController when > "hadoop.security.groups.cache.secs" is zero or negative numbers will throw > ambiguous exception > - > > Key: HADOOP-18362 > URL: https://issues.apache.org/jira/browse/HADOOP-18362 > Project: Hadoop Common > Issue Type: Improvement >Affects Versions: 2.10.2 > Environment: Linux version 4.15.0-142-generic > (buildd@lgw01-amd64-039) (gcc version 5.4.0 20160609 (Ubuntu > 5.4.0-6ubuntu1~16.04.12)) >Reporter: Jingxuan Fu >Assignee: Jingxuan Fu >Priority: Major > Labels: pull-request-available > Time Spent: 3h 10m > Remaining Estimate: 0h > > {quote} > {code:java} > > hadoop.security.groups.cache.secs > 300 > > This is the config controlling the validity of the entries in the cache > containing the user->group mapping. When this duration has expired, > then the implementation of the group mapping provider is invoked to get > the groups of the user and then cached back. > > {code} > {quote} > As we see in core-default.xml of hadoop.security.groups.cache.secs, the > default value is 300. But when we set it to zero or negative number and then > run > org.apache.hadoop.ha.TestZKFailoverController#testGracefulFailoverMultipleZKfcs, > it will throw NullPointerException as below: > {quote} > {code:java} > [INFO] Running org.apache.hadoop.ha.TestZKFailoverController > [ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.932 > s <<< FAILURE! - in org.apache.hadoop.ha.TestZKFailoverController > [ERROR] > testGracefulFailoverMultipleZKfcs(org.apache.hadoop.ha.TestZKFailoverController) > Time elapsed: 0.799 s <<< ERROR! 
> java.lang.NullPointerException > at > org.apache.hadoop.ha.ZKFailoverController.run(ZKFailoverController.java:188) > at > org.apache.hadoop.ha.MiniZKFCCluster.start(MiniZKFCCluster.java:116) > at > org.apache.hadoop.ha.TestZKFailoverController.testGracefulFailoverMultipleZKfcs(TestZKFailoverController.java:581) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) > at > org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) > at > org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) > at > org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) > at > org.apache.zookeeper.JUnit4ZKTestRunner$LoggedInvokeMethod.evaluate(JUnit4ZKTestRunner.java:55) > at >
[GitHub] [hadoop] MEILIDEKCL opened a new pull request, #4648: HADOOP-18362. Solve ZKFailoverController throw ambiguous exception
MEILIDEKCL opened a new pull request, #4648: URL: https://github.com/apache/hadoop/pull/4648 ### Description of PR We modified Groups.java to check the cache validity time manually in its initialization function and to throw the exception at an earlier point. What ZKFailoverController.java surfaces is only the cause of the original exception (RuntimeException.getCause()). The cache validity time is also used when creating the CacheBuilder, but the validation inside the CacheBuilder class is incomplete, so the resulting error message is ambiguous. ### How was this patch tested? ### For code changes: - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
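The earlier, clearer check the description proposes can be sketched like this (hypothetical names and message text; the real change is inside Groups.java's initialization, per the PR):

```java
// Hypothetical sketch (illustrative names, not the actual Groups.java change)
// of fail-fast validation: reject a non-positive
// hadoop.security.groups.cache.secs up front with a clear
// IllegalArgumentException, instead of letting the cache builder fail later
// and surface only an opaque NullPointerException via getCause().
public class GroupsCacheCheck {
    static long checkCacheTimeout(long cacheTimeoutSecs) {
        if (cacheTimeoutSecs <= 0) {
            throw new IllegalArgumentException(
                "hadoop.security.groups.cache.secs must be a positive number of seconds, got "
                    + cacheTimeoutSecs);
        }
        return cacheTimeoutSecs;
    }

    public static void main(String[] args) {
        System.out.println(checkCacheTimeout(300));  // 300 is the core-default.xml default
        try {
            checkCacheTimeout(0);                    // misconfiguration: clear message
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Validating at the point the configuration is read means the exception names the offending key directly, rather than emerging later from an unrelated stack frame in ZKFailoverController.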
[jira] [Work logged] (HADOOP-18362) Running org.apache.hadoop.ha.TestZKFailoverController when "hadoop.security.groups.cache.secs" is zero or negative numbers will throw ambiguous exception
[ https://issues.apache.org/jira/browse/HADOOP-18362?focusedWorklogId=795903=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795903 ] ASF GitHub Bot logged work on HADOOP-18362: --- Author: ASF GitHub Bot Created on: 28/Jul/22 04:11 Start Date: 28/Jul/22 04:11 Worklog Time Spent: 10m Work Description: MEILIDEKCL closed pull request #4643: HADOOP-18362. Solve ZKFailoverController throw ambiguous exception URL: https://github.com/apache/hadoop/pull/4643 Issue Time Tracking --- Worklog Id: (was: 795903) Time Spent: 3h (was: 2h 50m) > Running org.apache.hadoop.ha.TestZKFailoverController when > "hadoop.security.groups.cache.secs" is zero or negative numbers will throw > ambiguous exception > - > > Key: HADOOP-18362 > URL: https://issues.apache.org/jira/browse/HADOOP-18362 > Project: Hadoop Common > Issue Type: Improvement >Affects Versions: 2.10.2 > Environment: Linux version 4.15.0-142-generic > (buildd@lgw01-amd64-039) (gcc version 5.4.0 20160609 (Ubuntu > 5.4.0-6ubuntu1~16.04.12)) >Reporter: Jingxuan Fu >Assignee: Jingxuan Fu >Priority: Major > Labels: pull-request-available > Time Spent: 3h > Remaining Estimate: 0h > > {quote} > {code:java} > > hadoop.security.groups.cache.secs > 300 > > This is the config controlling the validity of the entries in the cache > containing the user->group mapping. When this duration has expired, > then the implementation of the group mapping provider is invoked to get > the groups of the user and then cached back. > > {code} > {quote} > As we see in core-default.xml of hadoop.security.groups.cache.secs, the > default value is 300. But when we set it to zero or negative number and then > run > org.apache.hadoop.ha.TestZKFailoverController#testGracefulFailoverMultipleZKfcs, > it will throw NullPointerException as below: > {quote} > {code:java} > [INFO] Running org.apache.hadoop.ha.TestZKFailoverController > [ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.932 > s <<< FAILURE! 
- in org.apache.hadoop.ha.TestZKFailoverController > [ERROR] > testGracefulFailoverMultipleZKfcs(org.apache.hadoop.ha.TestZKFailoverController) > Time elapsed: 0.799 s <<< ERROR! > java.lang.NullPointerException > at > org.apache.hadoop.ha.ZKFailoverController.run(ZKFailoverController.java:188) > at > org.apache.hadoop.ha.MiniZKFCCluster.start(MiniZKFCCluster.java:116) > at > org.apache.hadoop.ha.TestZKFailoverController.testGracefulFailoverMultipleZKfcs(TestZKFailoverController.java:581) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) > at > org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) > at > org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) > at > org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) > at > org.apache.zookeeper.JUnit4ZKTestRunner$LoggedInvokeMethod.evaluate(JUnit4ZKTestRunner.java:55) > at > org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26) > at > org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27) > at org.junit.rules.TestWatchman$1.evaluate(TestWatchman.java:53) > at > org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299) > at > org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293) > at java.util.concurrent.FutureTask.run(FutureTask.java:266) > at java.lang.Thread.run(Thread.java:748){code} > {quote} > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: 
common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] MEILIDEKCL closed pull request #4643: HADOOP-18362. Solve ZKFailoverController throw ambiguous exception
MEILIDEKCL closed pull request #4643: HADOOP-18362. Solve ZKFailoverController throw ambiguous exception URL: https://github.com/apache/hadoop/pull/4643
[GitHub] [hadoop] ZanderXu commented on pull request #4518: HDFS-16645. [JN] Bugfix: java.lang.IllegalStateException: Invalid log manifest
ZanderXu commented on PR #4518: URL: https://github.com/apache/hadoop/pull/4518#issuecomment-1197618823 @xkrogen Master, can you help me review this patch?
[GitHub] [hadoop] ZanderXu commented on pull request #4628: HDFS-16689. NameNode may crash when transitioning to Active with in-progress tailer if there are some abnormal JNs.
ZanderXu commented on PR #4628: URL: https://github.com/apache/hadoop/pull/4628#issuecomment-1197618509 @xkrogen Master, can you help me review this patch? Thanks
[GitHub] [hadoop] ZanderXu commented on pull request #4560: HDFS-16659. JournalNode should throw CacheMissException when SinceTxId is bigger than HighestWrittenTxId
ZanderXu commented on PR #4560: URL: https://github.com/apache/hadoop/pull/4560#issuecomment-1197618194 @xkrogen Master, can you help me review this patch?
[jira] [Created] (HADOOP-18375) Fix failure of shelltest for hadoop_add_ldlibpath
Masatake Iwasaki created HADOOP-18375: - Summary: Fix failure of shelltest for hadoop_add_ldlibpath Key: HADOOP-18375 URL: https://issues.apache.org/jira/browse/HADOOP-18375 Project: Hadoop Common Issue Type: Bug Components: test Reporter: Masatake Iwasaki Assignee: Masatake Iwasaki Test cases in hadoop_add_ldlibpath.bats failed.
[jira] [Commented] (HADOOP-18375) Fix failure of shelltest for hadoop_add_ldlibpath
[ https://issues.apache.org/jira/browse/HADOOP-18375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17572214#comment-17572214 ] Masatake Iwasaki commented on HADOOP-18375:
---
{noformat}
[exec] not ok 3 hadoop_add_ldlibpath (simple dupecheck)
[exec] # (from function `hadoop_add_colonpath' in file ../../main/bin/hadoop-functions.sh, line 1251,
[exec] #  from function `hadoop_add_ldlibpath' in file ../../main/bin/hadoop-functions.sh, line 1286,
[exec] #  in test file hadoop_add_ldlibpath.bats, line 32)
[exec] #   `hadoop_add_ldlibpath "${TMP}"' failed
[exec] # bindir: /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/src/test/scripts
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86220.1498
[exec] # DEBUG: Rejected colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86220.1498
[exec] # >/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/opt/rh/gcc-toolset-9/root/usr/lib64/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86220.1498<
[exec] not ok 4 hadoop_add_ldlibpath (default order)
[exec] # (in test file hadoop_add_ldlibpath.bats, line 42)
[exec] #   `[ "${LD_LIBRARY_PATH}" = "${TMP}:/tmp" ]' failed
[exec] # bindir: /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/src/test/scripts
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86234.14082
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /tmp
[exec] # >/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/opt/rh/gcc-toolset-9/root/usr/lib64/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86234.14082:/tmp<
[exec] not ok 5 hadoop_add_ldlibpath (after order)
[exec] # (in test file hadoop_add_ldlibpath.bats, line 49)
[exec] #   `[ "${LD_LIBRARY_PATH}" = "${TMP}:/tmp" ]' failed
[exec] # bindir: /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/src/test/scripts
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86248.22235
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /tmp
[exec] # >/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/opt/rh/gcc-toolset-9/root/usr/lib64/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86248.22235:/tmp<
[exec] not ok 6 hadoop_add_ldlibpath (before order)
[exec] # (in test file hadoop_add_ldlibpath.bats, line 56)
[exec] #   `[ "${LD_LIBRARY_PATH}" = "/tmp:${TMP}" ]' failed
[exec] # bindir: /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/src/test/scripts
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86262.28642
[exec] # DEBUG: Prepend colonpath(LD_LIBRARY_PATH): /tmp
[exec] # >/tmp:/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/opt/rh/gcc-toolset-9/root/usr/lib64/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib/dyninst:/opt/rh/gcc-toolset-9/root/usr/lib64:/opt/rh/gcc-toolset-9/root/usr/lib:/home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86262.28642<
[exec] not ok 7 hadoop_add_ldlibpath (simple dupecheck 2)
[exec] # (from function `hadoop_add_colonpath' in file ../../main/bin/hadoop-functions.sh, line 1251,
[exec] #  from function `hadoop_add_ldlibpath' in file ../../main/bin/hadoop-functions.sh, line 1286,
[exec] #  in test file hadoop_add_ldlibpath.bats, line 63)
[exec] #   `hadoop_add_ldlibpath "${TMP}"' failed
[exec] # bindir: /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/src/test/scripts
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86276.18537
[exec] # DEBUG: Append colonpath(LD_LIBRARY_PATH): /tmp
[exec] # DEBUG: Rejected colonpath(LD_LIBRARY_PATH): /home/rocky/srcs/hadoop/hadoop-common-project/hadoop-common/target/test-dir/bats.86276.18537
[exec] #
[jira] [Work logged] (HADOOP-18327) Fix eval expression in hadoop-functions.sh
[ https://issues.apache.org/jira/browse/HADOOP-18327?focusedWorklogId=795893&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795893 ]

ASF GitHub Bot logged work on HADOOP-18327:
---
Author: ASF GitHub Bot
Created on: 28/Jul/22 03:26
Start Date: 28/Jul/22 03:26
Worklog Time Spent: 10m

Work Description: iwasakims commented on PR #4536:
URL: https://github.com/apache/hadoop/pull/4536#issuecomment-1197611706

You can run the tests by `mvn test -Pshelltest -Dtest=x` on hadoop-common-project/hadoop-common. (You need to install bats for this.)

Issue Time Tracking
---
Worklog Id: (was: 795893)
Time Spent: 1h 40m (was: 1.5h)

> Fix eval expression in hadoop-functions.sh
> ------------------------------------------
>
> Key: HADOOP-18327
> URL: https://issues.apache.org/jira/browse/HADOOP-18327
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 3.3.3
> Reporter: groot
> Assignee: groot
> Priority: Minor
> Labels: pull-request-available
> Time Spent: 1h 40m
> Remaining Estimate: 0h
>
> Need to fix the eval expression.
> 1. Prefix exec by eval in Hadoop bin scripts. Prior to this change, if HADOOP_OPTS contains any arguments that include a space, the command is not parsed correctly. For example, if HADOOP_OPTS="... -XX:OnOutOfMemoryError=\"kill -9 %p\" ...", the bin/hadoop script will fail with the error "Unrecognized option: -9". No amount of clever escaping of the quotes or spaces in the "kill -9 %p" command will fix this. The only alternative appears to be to use 'eval'. Switching to use 'eval' *instead of* 'exec' also works, but it results in an intermediate bash process being left alive throughout the entire lifetime of the Java process being started. Using 'exec' prefixed by 'eval' as has been done in this commit gets the best of both worlds, in that options with spaces are parsed correctly, and you don't end up with an intermediate bash process as the parent of the Java process.
> 2. We can replace single quote with escape-char and a double quote.
> > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
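The 'eval'-prefixed-'exec' behavior described in the issue comes down to how the shell word-splits an unquoted expansion versus how eval re-parses it. A minimal sketch in plain bash, independent of the actual bin/hadoop script:

```shell
# HADOOP_OPTS holds an option whose value contains a quoted space.
HADOOP_OPTS='-XX:OnOutOfMemoryError="kill -9 %p"'

# Plain word splitting ignores the embedded quotes: three words,
# so the JVM would see a bogus standalone "-9" option.
set -- $HADOOP_OPTS
echo "plain splitting: $# words"      # 3 words

# eval re-parses the string, so the quotes group "kill -9 %p"
# back into a single argument, which is what prefixing exec with
# eval achieves for the launched java command.
eval 'set -- '"$HADOOP_OPTS"
echo "after eval: $# words"           # 1 word
echo "argument: $1"                   # -XX:OnOutOfMemoryError=kill -9 %p
```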
[GitHub] [hadoop] iwasakims commented on pull request #4536: HADOOP-18327.Fix eval expression in hadoop-functions.sh
iwasakims commented on PR #4536: URL: https://github.com/apache/hadoop/pull/4536#issuecomment-1197611706 You can run the tests by `mvn test -Pshelltest -Dtest=x` on hadoop-common-project/hadoop-common. (You need to install bats for this.) -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18327) Fix eval expression in hadoop-functions.sh
[ https://issues.apache.org/jira/browse/HADOOP-18327?focusedWorklogId=795892&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795892 ]

ASF GitHub Bot logged work on HADOOP-18327:
---
Author: ASF GitHub Bot
Created on: 28/Jul/22 03:19
Start Date: 28/Jul/22 03:19
Worklog Time Spent: 10m

Work Description: iwasakims commented on PR #4536:
URL: https://github.com/apache/hadoop/pull/4536#issuecomment-1197607973

It would be nice if you could add test cases to the files under hadoop-common-project/hadoop-common/src/test/scripts to check that there is no regression.

Issue Time Tracking
---
Worklog Id: (was: 795892)
Time Spent: 1.5h (was: 1h 20m)
[GitHub] [hadoop] iwasakims commented on pull request #4536: HADOOP-18327.Fix eval expression in hadoop-functions.sh
iwasakims commented on PR #4536: URL: https://github.com/apache/hadoop/pull/4536#issuecomment-1197607973 It would be nice if you could add test cases to the files under hadoop-common-project/hadoop-common/src/test/scripts to check that there is no regression.
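A regression test of the kind suggested here would be a .bats file alongside the existing hadoop_add_ldlibpath.bats. The sketch below only writes such a file to a temporary location to show the shape of a bats test case; the file name, test name, and asserted value are invented for illustration, not taken from the actual patch:

```shell
# Write a hypothetical bats test case. Real Hadoop shell tests live in
# hadoop-common-project/hadoop-common/src/test/scripts and are run with
# `mvn test -Pshelltest` (bats must be installed).
cat > /tmp/hadoop_add_param_quoting.bats <<'EOF'
load hadoop-functions_test_helper

@test "hadoop_add_param (value containing double quotes)" {
  HADOOP_OPTS=''
  hadoop_add_param HADOOP_OPTS foo '-Dfoo="a b"'
  [ "${HADOOP_OPTS}" = '-Dfoo="a b"' ]
}
EOF
echo "wrote $(grep -c '^@test' /tmp/hadoop_add_param_quoting.bats) test case(s)"
```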
[GitHub] [hadoop] slfan1989 commented on pull request #4403: MAPREDUCE-7385. improve JobEndNotifier#httpNotification With recommended methods
slfan1989 commented on PR #4403: URL: https://github.com/apache/hadoop/pull/4403#issuecomment-1197607166 @jojochuang Can you help review this pr? this pr replaces some deprecated methods, the changes are very small, thank you very much!
[GitHub] [hadoop] slfan1989 commented on pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
slfan1989 commented on PR #4406: URL: https://github.com/apache/hadoop/pull/4406#issuecomment-1197603105 @jojochuang Thank you very much!
[GitHub] [hadoop] jojochuang merged pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
jojochuang merged PR #4406: URL: https://github.com/apache/hadoop/pull/4406
[jira] [Work logged] (HADOOP-18327) Fix eval expression in hadoop-functions.sh
[ https://issues.apache.org/jira/browse/HADOOP-18327?focusedWorklogId=795887&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795887 ]

ASF GitHub Bot logged work on HADOOP-18327:
---
Author: ASF GitHub Bot
Created on: 28/Jul/22 02:48
Start Date: 28/Jul/22 02:48
Worklog Time Spent: 10m

Work Description: iwasakims commented on PR #4536:
URL: https://github.com/apache/hadoop/pull/4536#issuecomment-1197590538

Hmm, hadoop_add_param did not work if the value contains `"`. (`HADOOP_OPTS='-XX:OnOutOfMemoryError="kill -9 %p"'` instead of previous `HADOOP_OPTS="-XX:OnOutOfMemoryError='kill -9 %p'"`) @ashutoshcipher

```
$ export HADOOP_OPTS='-XX:OnOutOfMemoryError="kill -9 %p"'
$ echo $HADOOP_OPTS
-XX:OnOutOfMemoryError="kill -9 %p"
$ bash -x -c 'export SHELLOPTS && bin/hadoop version' 2>&1 | less
...
+ hadoop_add_param HADOOP_OPTS yarn.log.dir -Dyarn.log.dir=/home/rocky/dist/hadoop-3.4.0-SNAPSHOT/logs
+ [[ ! -XX:OnOutOfMemoryError="kill -9 %p" =~ yarn.log.dir ]]
+ eval 'HADOOP_OPTS="-XX:OnOutOfMemoryError="kill -9 %p" -Dyarn.log.dir=/home/rocky/dist/hadoop-3.4.0-SNAPSHOT/logs"'
++ HADOOP_OPTS=-XX:OnOutOfMemoryError=kill
++ -9 '%p -Dyarn.log.dir=/home/rocky/dist/hadoop-3.4.0-SNAPSHOT/logs'
/home/rocky/dist/hadoop-3.4.0-SNAPSHOT/bin/../libexec/hadoop-functions.sh: line 1143: -9: command not found
...
```

Issue Time Tracking
---
Worklog Id: (was: 795887)
Time Spent: 1h 20m (was: 1h 10m)
[GitHub] [hadoop] iwasakims commented on pull request #4536: HADOOP-18327.Fix eval expression in hadoop-functions.sh
iwasakims commented on PR #4536: URL: https://github.com/apache/hadoop/pull/4536#issuecomment-1197590538 (the same comment, with its full trace, as relayed in the worklog message above)
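The failure iwasakims traces above is the classic pitfall of building an assignment string for eval: a double quote inside the value closes the quoting early, so the shell ends the assignment after `...=kill` and tries to run `-9` as a command. A minimal reproduction, plus the eval-free concatenation that avoids it (illustrative only, not the actual hadoop_add_param code; `-Dfoo=bar` is a made-up parameter):

```shell
HADOOP_OPTS='-XX:OnOutOfMemoryError="kill -9 %p"'

# Reproduce the bug: the string handed to eval becomes
#   HADOOP_OPTS="-XX:OnOutOfMemoryError="kill -9 %p" -Dfoo=bar"
# where the value's own quote terminates the assignment, and the
# shell then attempts to execute `-9` ("command not found").
if ! eval 'HADOOP_OPTS="'"$HADOOP_OPTS"' -Dfoo=bar"' 2>/dev/null; then
  echo "eval failed as in the trace above"
fi

# Appending without eval keeps the embedded quotes intact.
HADOOP_OPTS='-XX:OnOutOfMemoryError="kill -9 %p"'
HADOOP_OPTS="${HADOOP_OPTS} -Dfoo=bar"
echo "$HADOOP_OPTS"   # -XX:OnOutOfMemoryError="kill -9 %p" -Dfoo=bar
```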
[GitHub] [hadoop] slfan1989 commented on pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
slfan1989 commented on PR #4406: URL: https://github.com/apache/hadoop/pull/4406#issuecomment-1197590384 @jojochuang Thank you very much for helping to review the code, please help to review the code again.
[GitHub] [hadoop] goiri merged pull request #4561: HDFS-16660. Improve Code With Lambda in IPCLoggerChannel class
goiri merged PR #4561: URL: https://github.com/apache/hadoop/pull/4561
[jira] [Work logged] (HADOOP-18345) Enhance client protocol to propagate last seen state IDs for multiple nameservices.
[ https://issues.apache.org/jira/browse/HADOOP-18345?focusedWorklogId=795872&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795872 ]

ASF GitHub Bot logged work on HADOOP-18345:
---
Author: ASF GitHub Bot
Created on: 28/Jul/22 01:48
Start Date: 28/Jul/22 01:48
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4584:
URL: https://github.com/apache/hadoop/pull/4584#issuecomment-1197558262
(the full Yetus -1 report is quoted in the relayed GitHub message below)
[GitHub] [hadoop] hadoop-yetus commented on pull request #4584: HADOOP-18345: Enhance client protocol to propagate last seen state IDs for multiple nameservices.
hadoop-yetus commented on PR #4584:
URL: https://github.com/apache/hadoop/pull/4584#issuecomment-1197558262

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 14s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 39m 24s | | trunk passed |
| +1 :green_heart: | compile | 25m 21s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 23m 4s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 34s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 3s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 39s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 15s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 9s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 19s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 9s | | the patch passed |
| +1 :green_heart: | compile | 23m 51s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | cc | 23m 51s | | the patch passed |
| +1 :green_heart: | javac | 23m 51s | | the patch passed |
| +1 :green_heart: | compile | 22m 29s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | cc | 22m 29s | | the patch passed |
| +1 :green_heart: | javac | 22m 29s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 32s | | hadoop-common-project/hadoop-common: The patch generated 0 new + 164 unchanged - 1 fixed = 164 total (was 165) |
| +1 :green_heart: | mvnsite | 2m 0s | | the patch passed |
| -1 :x: | javadoc | 1m 31s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4584/3/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 18 new + 0 unchanged - 0 fixed = 18 total (was 0) |
| +1 :green_heart: | javadoc | 1m 16s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| -1 :x: | spotbugs | 3m 10s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4584/3/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) |
| +1 :green_heart: | shadedclient | 25m 23s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 18m 58s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 36s | | The patch does not generate ASF License warnings. |
| | | 227m 51s | | |

| Reason | Tests |
|-------:|:------|
| SpotBugs | module:hadoop-common-project/hadoop-common |
| | Class org.apache.hadoop.hdfs.server.federation.router.RouterFederatedState$RouterFederatedStateProto defines non-transient non-serializable instance field namespaceStateIds_ In RouterFederatedState.java:instance field namespaceStateIds_ In RouterFederatedState.java |
| | Useless control flow in org.apache.hadoop.hdfs.server.federation.router.RouterFederatedState$RouterFederatedStateProto$Builder.maybeForceBuilderInitialization() At
[jira] [Work logged] (HADOOP-18345) Enhance client protocol to propagate last seen state IDs for multiple nameservices.
[ https://issues.apache.org/jira/browse/HADOOP-18345?focusedWorklogId=795868&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795868 ]

ASF GitHub Bot logged work on HADOOP-18345:
---
Author: ASF GitHub Bot
Created on: 28/Jul/22 01:39
Start Date: 28/Jul/22 01:39
Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4584:
URL: https://github.com/apache/hadoop/pull/4584#issuecomment-1197553056
(the full Yetus -1 report is quoted in the relayed GitHub message below)
[GitHub] [hadoop] hadoop-yetus commented on pull request #4584: HADOOP-18345: Enhance client protocol to propagate last seen state IDs for multiple nameservices.
hadoop-yetus commented on PR #4584:
URL: https://github.com/apache/hadoop/pull/4584#issuecomment-1197553056

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 57s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 40m 55s | | trunk passed |
| +1 :green_heart: | compile | 25m 9s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 21m 58s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 31s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 0s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 32s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 5s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 7s | | trunk passed |
| +1 :green_heart: | shadedclient | 26m 10s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 5s | | the patch passed |
| +1 :green_heart: | compile | 24m 22s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | cc | 24m 22s | | the patch passed |
| +1 :green_heart: | javac | 24m 22s | | the patch passed |
| +1 :green_heart: | compile | 21m 57s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | cc | 21m 57s | | the patch passed |
| +1 :green_heart: | javac | 21m 57s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 27s | | hadoop-common-project/hadoop-common: The patch generated 0 new + 164 unchanged - 1 fixed = 164 total (was 165) |
| +1 :green_heart: | mvnsite | 1m 57s | | the patch passed |
| -1 :x: | javadoc | 1m 22s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4584/2/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-common-project_hadoop-common-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 18 new + 0 unchanged - 0 fixed = 18 total (was 0) |
| +1 :green_heart: | javadoc | 1m 7s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| -1 :x: | spotbugs | 3m 5s | [/new-spotbugs-hadoop-common-project_hadoop-common.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4584/2/artifact/out/new-spotbugs-hadoop-common-project_hadoop-common.html) | hadoop-common-project/hadoop-common generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) |
| +1 :green_heart: | shadedclient | 26m 23s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 18m 27s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 16s | | The patch does not generate ASF License warnings. |
| | | 227m 9s | | |

| Reason | Tests |
|-------:|:------|
| SpotBugs | module:hadoop-common-project/hadoop-common |
| | Class
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
slfan1989 commented on code in PR #4406: URL: https://github.com/apache/hadoop/pull/4406#discussion_r931704265

## hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/web/webhdfs/TestDataNodeUGIProvider.java:

@@ -246,9 +246,9 @@ private WebHdfsFileSystem getWebHdfsFileSystem(UserGroupInformation ugi,
     DelegationTokenSecretManager dtSecretManager = new DelegationTokenSecretManager(
         8640, 8640, 8640, 8640, namesystem);
     dtSecretManager.startThreads();
-    Token token1 = new Token(
+    Token token1 = new Token<>(

Review Comment: I will fix it.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
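The one-character change under review swaps a raw `Token` construction for the diamond operator (`Token<>`). As a generic illustration of the difference (using `ArrayList` rather than Hadoop's `Token`, whose type parameters are not shown in this thread):

```java
import java.util.ArrayList;
import java.util.List;

public class DiamondDemo {
  public static void main(String[] args) {
    // Raw type: compiles, but the compiler can no longer check element types
    // and emits an "unchecked" warning at the call site.
    List raw = new ArrayList();

    // Diamond operator (<>): the type argument is inferred from the
    // declaration, so the collection stays type-safe with no extra verbosity.
    List<String> typed = new ArrayList<>();
    typed.add("token-kind");

    System.out.println(typed.get(0)); // prints: token-kind
  }
}
```

The diamond form keeps the compile-time checking of the fully parameterized type, which is why reviewers flag raw-type constructors even in test code.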
[GitHub] [hadoop] slfan1989 commented on a diff in pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
slfan1989 commented on code in PR #4406: URL: https://github.com/apache/hadoop/pull/4406#discussion_r931703979

## hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/web/TestDatanodeHttpXFrame.java:

@@ -69,7 +68,7 @@ public void testNameNodeXFrameOptionsDisabled() throws Exception {
     cluster = createCluster(xFrameEnabled, null);
     HttpURLConnection conn = getConn(cluster);
     String xfoHeader = conn.getHeaderField("X-FRAME-OPTIONS");
-    Assert.assertTrue("unexpected X-FRAME-OPTION in header", xfoHeader == null);
+    Assert.assertNull(xfoHeader);

Review Comment: @jojochuang Thank you very much for helping to review the code, I will modify the code!
[jira] [Work logged] (HADOOP-18355) Update previous index properly while validating overlapping ranges.

[ https://issues.apache.org/jira/browse/HADOOP-18355?focusedWorklogId=795865&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795865 ]

ASF GitHub Bot logged work on HADOOP-18355: Author: ASF GitHub Bot Created on: 28/Jul/22 01:24 Start Date: 28/Jul/22 01:24 Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4647: URL: https://github.com/apache/hadoop/pull/4647#issuecomment-1197545407

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 1m 12s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 41m 22s | | trunk passed |
| +1 :green_heart: | compile | 25m 23s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 21m 57s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 30s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 59s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 33s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 6s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 7s | | trunk passed |
| +1 :green_heart: | shadedclient | 26m 17s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 6s | | the patch passed |
| +1 :green_heart: | compile | 24m 29s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 24m 29s | | the patch passed |
| +1 :green_heart: | compile | 21m 52s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 21m 52s | | the patch passed |
| +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 28s | [/results-checkstyle-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4647/1/artifact/out/results-checkstyle-hadoop-common-project_hadoop-common.txt) | hadoop-common-project/hadoop-common: The patch generated 2 new + 0 unchanged - 0 fixed = 2 total (was 0) |
| +1 :green_heart: | mvnsite | 1m 55s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 22s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 7s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 0s | | the patch passed |
| +1 :green_heart: | shadedclient | 25m 57s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 18m 29s | | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 1m 17s | | The patch does not generate ASF License warnings. |
| | | 229m 26s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4647/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4647 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 63e3e48497f0 4.15.0-166-generic #174-Ubuntu SMP Wed Dec 8 19:07:44 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 6ca38f8db737dc8bc0927c4d1fd6b536493751b1 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4647/1/testReport/ |
| Max. process+thread count | 2333 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4647/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] slfan1989 commented on pull request #4632: YARN-5871. [RESERVATION] Add support for reservation-based routing.
slfan1989 commented on PR #4632: URL: https://github.com/apache/hadoop/pull/4632#issuecomment-1197527754 > @slfan1989 this is a lot to review. Can we split it? Probably cleanup of the existing code and then one or two for the federation part. @goiri Thank you very much for your help reviewing the code, I will split it.
[GitHub] [hadoop] ZanderXu commented on pull request #4561: HDFS-16660. Improve Code With Lambda in IPCLoggerChannel class
ZanderXu commented on PR #4561: URL: https://github.com/apache/hadoop/pull/4561#issuecomment-1197526360 @goiri Hi, master, can you help me merge it into the trunk? I will move to improve codes with Lambda in the whole hadoop-hdfs module.
[GitHub] [hadoop] ZanderXu commented on pull request #4565: HDFS-16661. Improve Code With Lambda in AsyncLoggerSet class
ZanderXu commented on PR #4565: URL: https://github.com/apache/hadoop/pull/4565#issuecomment-1197526280 @goiri Hi, master, can you help me merge it into the trunk? I will move to improve codes with Lambda in the whole hadoop-hdfs module.
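The diffs for these lambda-cleanup PRs are not quoted in this thread, but the kind of change the HDFS-16660/HDFS-16661 titles describe generally replaces anonymous inner classes with lambda expressions. A hypothetical before/after sketch (names and values are illustrative, not taken from IPCLoggerChannel or AsyncLoggerSet):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class LambdaRefactorDemo {
  public static void main(String[] args) throws Exception {
    ExecutorService executor = Executors.newSingleThreadExecutor();

    // Pre-lambda style: an anonymous Callable class, five lines of ceremony
    // around one line of logic.
    Future<Integer> before = executor.submit(new Callable<Integer>() {
      @Override
      public Integer call() {
        return 21;
      }
    });

    // The same submission as a lambda: behavior is identical, intent is clearer.
    Future<Integer> after = executor.submit(() -> 21 * 2);

    System.out.println(before.get() + " " + after.get()); // prints: 21 42
    executor.shutdown();
  }
}
```

Since `Callable` is a functional interface, the compiler infers the same `call()` implementation from the lambda; nothing about scheduling or exception handling changes.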
[GitHub] [hadoop] ZanderXu commented on pull request #4606: HDFS-16678. RBF should supports disable getNodeUsage() in RBFMetrics
ZanderXu commented on PR #4606: URL: https://github.com/apache/hadoop/pull/4606#issuecomment-1197525467 @goiri Hi, master, can you help me merge it into the trunk?
[jira] [Work logged] (HADOOP-18227) Add input stream IOStats for vectored IO api in S3A.

[ https://issues.apache.org/jira/browse/HADOOP-18227?focusedWorklogId=795856&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795856 ]

ASF GitHub Bot logged work on HADOOP-18227: Author: ASF GitHub Bot Created on: 28/Jul/22 00:19 Start Date: 28/Jul/22 00:19 Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4636: URL: https://github.com/apache/hadoop/pull/4636#issuecomment-1197511877

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 43s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 5 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 12s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 25m 26s | | trunk passed |
| +1 :green_heart: | compile | 23m 6s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 20m 41s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 4m 20s | | trunk passed |
| +1 :green_heart: | mvnsite | 3m 45s | | trunk passed |
| +1 :green_heart: | javadoc | 3m 2s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 2m 43s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 5m 4s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 29s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 46s | | the patch passed |
| +1 :green_heart: | compile | 22m 17s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 22m 17s | | the patch passed |
| +1 :green_heart: | compile | 20m 43s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 20m 43s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 4m 17s | | the patch passed |
| +1 :green_heart: | mvnsite | 3m 43s | | the patch passed |
| +1 :green_heart: | javadoc | 2m 51s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 2m 43s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 5m 13s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 23s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 18m 47s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 3m 26s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 38s | | The patch does not generate ASF License warnings. |
| | | 242m 10s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4636/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4636 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux fe0f4d6db424 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / f5a69a381d99ac1ca61a609a65544ed58e7d0a67 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4636/2/testReport/ |
| Max. process+thread count | 1259 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4636/2/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] jojochuang commented on a diff in pull request #4406: HDFS-16619. Fix HttpHeaders.Values And HttpHeaders.Names Deprecated Import
jojochuang commented on code in PR #4406: URL: https://github.com/apache/hadoop/pull/4406#discussion_r931673249

## hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/web/TestDatanodeHttpXFrame.java:

@@ -69,7 +68,7 @@ public void testNameNodeXFrameOptionsDisabled() throws Exception {
     cluster = createCluster(xFrameEnabled, null);
     HttpURLConnection conn = getConn(cluster);
     String xfoHeader = conn.getHeaderField("X-FRAME-OPTIONS");
-    Assert.assertTrue("unexpected X-FRAME-OPTION in header", xfoHeader == null);
+    Assert.assertNull(xfoHeader);

Review Comment:
```suggestion
    Assert.assertNull("unexpected X-FRAME-OPTION in header", xfoHeader);
```

## hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/web/webhdfs/TestDataNodeUGIProvider.java:

@@ -246,9 +246,9 @@ private WebHdfsFileSystem getWebHdfsFileSystem(UserGroupInformation ugi,
     DelegationTokenSecretManager dtSecretManager = new DelegationTokenSecretManager(
         8640, 8640, 8640, 8640, namesystem);
     dtSecretManager.startThreads();
-    Token token1 = new Token(
+    Token token1 = new Token<>(

Review Comment: Unrelated change.
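The suggestion above relies on JUnit 4's two-argument `Assert.assertNull(String message, Object object)` overload, where the failure message comes first, so the original diagnostic is preserved rather than dropped. A minimal stand-in (hand-rolled here so the snippet runs without JUnit on the classpath) shows the behavior:

```java
public class AssertNullDemo {
  // Minimal stand-in mirroring org.junit.Assert.assertNull(String, Object):
  // message first, value under test second.
  static void assertNull(String message, Object object) {
    if (object != null) {
      throw new AssertionError(message + " (was: " + object + ")");
    }
  }

  public static void main(String[] args) {
    String xfoHeader = null; // header absent, as the test expects
    assertNull("unexpected X-FRAME-OPTION in header", xfoHeader);

    try {
      assertNull("unexpected X-FRAME-OPTION in header", "DENY");
    } catch (AssertionError expected) {
      // A failure now reports *why* the assertion tripped, not just that it did.
      System.out.println(expected.getMessage());
      // prints: unexpected X-FRAME-OPTION in header (was: DENY)
    }
  }
}
```

Keeping the message is the point of the review comment: `assertNull(xfoHeader)` alone would fail with no context about which header leaked into the response.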
[GitHub] [hadoop] goiri commented on pull request #4632: YARN-5871. [RESERVATION] Add support for reservation-based routing.
goiri commented on PR #4632: URL: https://github.com/apache/hadoop/pull/4632#issuecomment-1197452003 @slfan1989 this is a lot to review. Can we split it? Probably cleanup of the existing code and then one or two for the federation part.
[GitHub] [hadoop] goiri commented on a diff in pull request #4632: YARN-5871. [RESERVATION] Add support for reservation-based routing.
goiri commented on code in PR #4632: URL: https://github.com/apache/hadoop/pull/4632#discussion_r931635892

## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/src/main/java/org/apache/hadoop/yarn/server/federation/policies/router/HashBasedRouterPolicy.java:

@@ -50,53 +48,12 @@ public void reinitialize(
     setPolicyContext(federationPolicyContext);
   }

-  /**
-   * Simply picks from alphabetically-sorted active subclusters based on the
-   * hash of quey name. Jobs of the same queue will all be routed to the same
-   * sub-cluster, as far as the number of active sub-cluster and their names
-   * remain the same.
-   *
-   * @param appSubmissionContext the {@link ApplicationSubmissionContext} that
-   *          has to be routed to an appropriate subCluster for execution.
-   *
-   * @param blackListSubClusters the list of subClusters as identified by
-   *          {@link SubClusterId} to blackList from the selection of the home
-   *          subCluster.
-   *
-   * @return a hash-based chosen {@link SubClusterId} that will be the "home"
-   *         for this application.
-   *
-   * @throws YarnException if there are no active subclusters.
-   */
   @Override
-  public SubClusterId getHomeSubcluster(
-      ApplicationSubmissionContext appSubmissionContext,
-      List blackListSubClusters) throws YarnException {
-
-    // throws if no active subclusters available
-    Map activeSubclusters =
-        getActiveSubclusters();
-
-    FederationPolicyUtils.validateSubClusterAvailability(
-        new ArrayList(activeSubclusters.keySet()),
-        blackListSubClusters);
-
-    if (blackListSubClusters != null) {
-      // Remove from the active SubClusters from StateStore the blacklisted ones
-      for (SubClusterId scId : blackListSubClusters) {
-        activeSubclusters.remove(scId);
-      }
-    }
-
-    validate(appSubmissionContext);
-
-    int chosenPosition = Math.abs(
-        appSubmissionContext.getQueue().hashCode() % activeSubclusters.size());
-
-    List list = new ArrayList<>(activeSubclusters.keySet());
+  protected SubClusterId chooseSubCluster(String queue,

Review Comment: Can we have a javadoc in the parent?

## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/src/main/java/org/apache/hadoop/yarn/server/federation/policies/router/WeightedRandomRouterPolicy.java:

@@ -37,34 +35,19 @@ public class WeightedRandomRouterPolicy extends AbstractRouterPolicy {

   @Override
-  public SubClusterId getHomeSubcluster(
-      ApplicationSubmissionContext appSubmissionContext,
-      List blacklist) throws YarnException {
-
-    // null checks and default-queue behavior
-    validate(appSubmissionContext);
-
-    Map activeSubclusters =
-        getActiveSubclusters();
-
-    FederationPolicyUtils.validateSubClusterAvailability(
-        new ArrayList(activeSubclusters.keySet()), blacklist);
-
+  protected SubClusterId chooseSubCluster(
+      String queue, Map preSelectSubClusters) throws YarnException {
     // note: we cannot pre-compute the weights, as the set of activeSubcluster
     // changes dynamically (and this would unfairly spread the load to
     // sub-clusters adjacent to an inactive one), hence we need to count/scan
     // the list and based on weight pick the next sub-cluster.
     Map weights = getPolicyInfo().getRouterPolicyWeights();
     ArrayList weightList = new ArrayList<>();
     ArrayList scIdList = new ArrayList<>();
     for (Map.Entry entry : weights.entrySet()) {
-      if (blacklist != null && blacklist.contains(entry.getKey().toId())) {
-        continue;
-      }
-      if (entry.getKey() != null
-          && activeSubclusters.containsKey(entry.getKey().toId())) {
+      if (entry.getKey() != null && preSelectSubClusters.containsKey(entry.getKey().toId())) {

Review Comment: getValue() possibly too.

## hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/src/main/java/org/apache/hadoop/yarn/server/federation/policies/router/LoadBasedRouterPolicy.java:

@@ -64,29 +60,24 @@ public void reinitialize(FederationPolicyInitializationContext policyContext)
     }
   }

-  @Override
-  public SubClusterId getHomeSubcluster(
-      ApplicationSubmissionContext appSubmissionContext,
-      List blacklist) throws YarnException {
-
-    // null checks and default-queue behavior
-    validate(appSubmissionContext);
-
-    Map activeSubclusters =
-        getActiveSubclusters();
-
-    FederationPolicyUtils.validateSubClusterAvailability(
-        new ArrayList(activeSubclusters.keySet()), blacklist);
+  private long getAvailableMemory(SubClusterInfo value) throws YarnException {
+    try {
+      long mem = -1;
+      JSONObject obj = new JSONObject(value.getCapability());
+      mem =
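The removed `getHomeSubcluster` body in the HashBasedRouterPolicy diff above shows the core of the hash-based policy: index the sorted active sub-cluster ids by `Math.abs(queue.hashCode() % size)`, so every job in a queue lands on the same sub-cluster while the active set is stable. A self-contained sketch of just that selection step (simplified to plain strings; `chooseSubCluster` here is a stand-in, not the actual Hadoop method):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class HashRoutingDemo {
  // Hypothetical stand-in for the hash-based routing step: pick a position in
  // the alphabetically-sorted active sub-cluster ids by hashing the queue name.
  static String chooseSubCluster(String queue, List<String> activeSubClusters) {
    List<String> sorted = new ArrayList<>(activeSubClusters);
    Collections.sort(sorted); // stable ordering => stable routing
    int chosenPosition = Math.abs(queue.hashCode() % sorted.size());
    return sorted.get(chosenPosition);
  }

  public static void main(String[] args) {
    List<String> clusters = List.of("sc-2", "sc-0", "sc-1");
    String home = chooseSubCluster("default", clusters);
    // Deterministic: the same queue always maps to the same sub-cluster
    // as long as the active membership is unchanged.
    System.out.println(home.equals(chooseSubCluster("default", clusters))); // prints: true
  }
}
```

The sort is what makes the mapping independent of enumeration order; if a sub-cluster joins or leaves, the modulus changes and queues may be re-homed, which the original javadoc calls out as the policy's stability caveat.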
[GitHub] [hadoop] hadoop-yetus commented on pull request #4311: HDFS-13522: IPC changes to support observer reads through routers.
hadoop-yetus commented on PR #4311: URL: https://github.com/apache/hadoop/pull/4311#issuecomment-1197410192

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|:--------|:-------:|:-------:|
| +0 :ok: | reexec | 0m 0s | | Docker mode activated. |
| -1 :x: | patch | 0m 46s | | https://github.com/apache/hadoop/pull/4311 does not apply to trunk. Rebase required? Wrong Branch? See https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help. |

| Subsystem | Report/Notes |
|----------:|:-------------|
| GITHUB PR | https://github.com/apache/hadoop/pull/4311 |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4311/17/console |
| versions | git=2.17.1 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4644: HDFS-16698. Add a metric to sense possible MaxDirectoryItemsExceededException in time.
hadoop-yetus commented on PR #4644: URL: https://github.com/apache/hadoop/pull/4644#issuecomment-1197398610 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 50s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 7s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 26m 21s | | trunk passed | | +1 :green_heart: | compile | 6m 40s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | compile | 6m 10s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | checkstyle | 1m 38s | | trunk passed | | +1 :green_heart: | mvnsite | 2m 52s | | trunk passed | | +1 :green_heart: | javadoc | 2m 25s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 3m 7s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 5m 10s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 5s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 34s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 9s | | the patch passed | | +1 :green_heart: | compile | 5m 56s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javac | 5m 56s | | the patch passed | | +1 :green_heart: | compile | 5m 39s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | javac | 5m 39s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | -0 :warning: | checkstyle | 1m 18s | [/results-checkstyle-hadoop-hdfs-project.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4644/1/artifact/out/results-checkstyle-hadoop-hdfs-project.txt) | hadoop-hdfs-project: The patch generated 1 new + 207 unchanged - 0 fixed = 208 total (was 207) | | +1 :green_heart: | mvnsite | 2m 15s | | the patch passed | | +1 :green_heart: | javadoc | 1m 43s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 | | +1 :green_heart: | javadoc | 2m 36s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | +1 :green_heart: | spotbugs | 4m 51s | | the patch passed | | +1 :green_heart: | shadedclient | 20m 32s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 238m 21s | | hadoop-hdfs in the patch passed. | | +1 :green_heart: | unit | 22m 38s | | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 1m 19s | | The patch does not generate ASF License warnings. 
| | | | 405m 25s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4644/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4644 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets | | uname | Linux 3dd0bd8d9cab 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / a2fef1296fa912321d9bebef1d013564fe4aac73 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4644/1/testReport/ | | Max. process+thread count | 3046 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project | | Console output
[jira] [Updated] (HADOOP-18355) Update previous index properly while validating overlapping ranges.
[ https://issues.apache.org/jira/browse/HADOOP-18355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-18355: Labels: pull-request-available (was: ) > Update previous index properly while validating overlapping ranges. > > > Key: HADOOP-18355 > URL: https://issues.apache.org/jira/browse/HADOOP-18355 > Project: Hadoop Common > Issue Type: Sub-task > Components: common, fs/s3 >Reporter: Mukund Thakur >Assignee: Mukund Thakur >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > [https://github.com/apache/hadoop/blob/a55ace7bc0c173f609b51e46cb0d4d8bcda3d79d/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/VectoredReadUtils.java#L201] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18355) Update previous index properly while validating overlapping ranges.
[ https://issues.apache.org/jira/browse/HADOOP-18355?focusedWorklogId=795830=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795830 ] ASF GitHub Bot logged work on HADOOP-18355: --- Author: ASF GitHub Bot Created on: 27/Jul/22 21:32 Start Date: 27/Jul/22 21:32 Worklog Time Spent: 10m Work Description: mukund-thakur opened a new pull request, #4647: URL: https://github.com/apache/hadoop/pull/4647 part of HADOOP-18103. ### Description of PR ### How was this patch tested? Added a new UT and re-ran the vectored io related test suites. ### For code changes: - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? Issue Time Tracking --- Worklog Id: (was: 795830) Remaining Estimate: 0h Time Spent: 10m > Update previous index properly while validating overlapping ranges. > > > Key: HADOOP-18355 > URL: https://issues.apache.org/jira/browse/HADOOP-18355 > Project: Hadoop Common > Issue Type: Sub-task > Components: common, fs/s3 >Reporter: Mukund Thakur >Assignee: Mukund Thakur >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > [https://github.com/apache/hadoop/blob/a55ace7bc0c173f609b51e46cb0d4d8bcda3d79d/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/VectoredReadUtils.java#L201]
[GitHub] [hadoop] mukund-thakur opened a new pull request, #4647: HADOOP-18355. Update previous index properly while validating overlapping ranges.
mukund-thakur opened a new pull request, #4647: URL: https://github.com/apache/hadoop/pull/4647 part of HADOOP-18103. ### Description of PR ### How was this patch tested? Added a new UT and re-ran the vectored io related test suites. ### For code changes: - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
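HADOOP-18355 is about validating that sorted vectored-read ranges do not overlap: each range must be compared against its immediate predecessor, and the "previous" index must advance on every iteration rather than stay stale. A minimal sketch of such a check, using a stand-in `Range` type rather than Hadoop's actual FileRange API:

```java
import java.util.*;

public class RangeCheck {
    // Stand-in for a file read range; nested records are implicitly static.
    record Range(long offset, int length) {}

    // Returns true iff no two ranges overlap. After sorting by offset,
    // the predecessor index is always i - 1, so it advances with i.
    static boolean isDisjoint(List<Range> ranges) {
        List<Range> sorted = new ArrayList<>(ranges);
        sorted.sort(Comparator.comparingLong(Range::offset));
        for (int i = 1; i < sorted.size(); i++) {
            Range prev = sorted.get(i - 1);
            Range cur = sorted.get(i);
            if (cur.offset() < prev.offset() + prev.length()) {
                return false; // cur starts before prev ends -> overlap
            }
        }
        return true;
    }

    public static void main(String[] args) {
        List<Range> ok = List.of(new Range(0, 10), new Range(10, 5), new Range(100, 4));
        List<Range> bad = List.of(new Range(0, 10), new Range(5, 10));
        System.out.println(isDisjoint(ok));  // true: adjacent ranges may touch
        System.out.println(isDisjoint(bad)); // false: [5,15) overlaps [0,10)
    }
}
```

The bug class the title describes is comparing every range against a fixed earlier index instead of the moving `i - 1`, which lets later overlaps slip through validation.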
[jira] [Work logged] (HADOOP-18357) Retarget solution file to VS2019
[ https://issues.apache.org/jira/browse/HADOOP-18357?focusedWorklogId=795823=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795823 ] ASF GitHub Bot logged work on HADOOP-18357: --- Author: ASF GitHub Bot Created on: 27/Jul/22 20:25 Start Date: 27/Jul/22 20:25 Worklog Time Spent: 10m Work Description: goiri commented on code in PR #4616: URL: https://github.com/apache/hadoop/pull/4616#discussion_r931549627 ## hadoop-common-project/hadoop-common/src/main/native/native.vcxproj: ## @@ -39,12 +37,14 @@ false true Unicode +v142 Review Comment: Take a look at HADOOP-14667. Specifically: `MSBuild Solution files are converted to the version of VS at build time` If this makes sense, we go with this. Issue Time Tracking --- Worklog Id: (was: 795823) Time Spent: 2.5h (was: 2h 20m) > Retarget solution file to VS2019 > > > Key: HADOOP-18357 > URL: https://issues.apache.org/jira/browse/HADOOP-18357 > Project: Hadoop Common > Issue Type: Bug > Components: common >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: libhdfscpp, pull-request-available > Time Spent: 2.5h > Remaining Estimate: 0h > > The Visual Studio version used by winutils and native components in Hadoop > common are quite old. We need to retarget the solution and vcxproj files to > use the latest version (Visual Studio 2019 as of this writing).
[GitHub] [hadoop] goiri commented on a diff in pull request #4616: HADOOP-18357. Retarget solution file to VS2019
goiri commented on code in PR #4616: URL: https://github.com/apache/hadoop/pull/4616#discussion_r931549627 ## hadoop-common-project/hadoop-common/src/main/native/native.vcxproj: ## @@ -39,12 +37,14 @@ false true Unicode +v142 Review Comment: Take a look at HADOOP-14667. Specifically: `MSBuild Solution files are converted to the version of VS at build time` If this makes sense, we go with this.
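For context, the `+v142` line in the diff above adds a `<PlatformToolset>` element to the project's configuration `PropertyGroup`; `v142` selects the Visual Studio 2019 toolset. A rough sketch of how such a fragment looks in a `.vcxproj` — illustrative only, not the exact Hadoop file:

```xml
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
  <ConfigurationType>DynamicLibrary</ConfigurationType>
  <UseDebugLibraries>false</UseDebugLibraries>
  <WholeProgramOptimization>true</WholeProgramOptimization>
  <CharacterSet>Unicode</CharacterSet>
  <!-- v142 = Visual Studio 2019; omitting this lets MSBuild fall back to its default toolset -->
  <PlatformToolset>v142</PlatformToolset>
</PropertyGroup>
```

The HADOOP-14667 reference in the comment is the counterpoint: if solution files are already converted to the installed VS version at build time, pinning the toolset in the checked-in project file may be unnecessary.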
[jira] [Work logged] (HADOOP-18357) Retarget solution file to VS2019
[ https://issues.apache.org/jira/browse/HADOOP-18357?focusedWorklogId=795822=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795822 ] ASF GitHub Bot logged work on HADOOP-18357: --- Author: ASF GitHub Bot Created on: 27/Jul/22 20:24 Start Date: 27/Jul/22 20:24 Worklog Time Spent: 10m Work Description: goiri commented on code in PR #4616: URL: https://github.com/apache/hadoop/pull/4616#discussion_r931549627 ## hadoop-common-project/hadoop-common/src/main/native/native.vcxproj: ## @@ -39,12 +37,14 @@ false true Unicode +v142 Review Comment: Take a look at HADOOP-14667 If this makes sense, we go with this. Issue Time Tracking --- Worklog Id: (was: 795822) Time Spent: 2h 20m (was: 2h 10m) > Retarget solution file to VS2019 > > > Key: HADOOP-18357 > URL: https://issues.apache.org/jira/browse/HADOOP-18357 > Project: Hadoop Common > Issue Type: Bug > Components: common >Affects Versions: 3.4.0 > Environment: Windows 10 >Reporter: Gautham Banasandra >Assignee: Gautham Banasandra >Priority: Major > Labels: libhdfscpp, pull-request-available > Time Spent: 2h 20m > Remaining Estimate: 0h > > The Visual Studio version used by winutils and native components in Hadoop > common are quite old. We need to retarget the solution and vcxproj files to > use the latest version (Visual Studio 2019 as of this writing).
[GitHub] [hadoop] goiri commented on a diff in pull request #4616: HADOOP-18357. Retarget solution file to VS2019
goiri commented on code in PR #4616: URL: https://github.com/apache/hadoop/pull/4616#discussion_r931549627 ## hadoop-common-project/hadoop-common/src/main/native/native.vcxproj: ## @@ -39,12 +37,14 @@ false true Unicode +v142 Review Comment: Take a look at HADOOP-14667 If this makes sense, we go with this.
[GitHub] [hadoop] Samrat002 commented on pull request #4587: YARN-11200 numa support in branch-2.10
Samrat002 commented on PR #4587: URL: https://github.com/apache/hadoop/pull/4587#issuecomment-1197291860 @PrabhuJoseph please review! Checkstyle warnings are fixed.
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795806=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795806 ] ASF GitHub Bot logged work on HADOOP-18344: --- Author: ASF GitHub Bot Created on: 27/Jul/22 19:13 Start Date: 27/Jul/22 19:13 Worklog Time Spent: 10m Work Description: mukund-thakur commented on code in PR #4637: URL: https://github.com/apache/hadoop/pull/4637#discussion_r931496072 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/testing.md: ## @@ -1274,6 +1282,21 @@ bin/hadoop s3guard markers -clean -verbose $BUCKET # expect success and exit code of 0 bin/hadoop s3guard markers -audit -verbose $BUCKET +# --- +# Copy to from local +# --- + +time bin/hadoop fs -copyFromLocal -t 10 share/hadoop/tools/lib/*aws*jar $BUCKET/ + +# expect the iostatistics object_list_request value to be O(directories) +bin/hadoop fs -ls -R $BUCKET/ + +# expect the iostatistics object_list_request and op_get_content_summary values to be 1 Review Comment: Yes, there is an extra space here at the end, as pointed out by Yetus. Issue Time Tracking --- Worklog Id: (was: 795806) Time Spent: 3h 10m (was: 3h) > AWS SDK update to 1.12.262 to address jackson CVE-2018-7489 > > > Key: HADOOP-18344 > URL: https://issues.apache.org/jira/browse/HADOOP-18344 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.4.0, 3.3.4 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 3h 10m > Remaining Estimate: 0h > > yet another jackson CVE in aws sdk > https://github.com/apache/hadoop/pull/4491/commits/5496816b472473eb7a9c174b7d3e69b6eee1e271 > maybe we need to have a list of all shaded jackson's we get on the CP and > have a process of upgrading them all at the same time
[GitHub] [hadoop] mukund-thakur commented on a diff in pull request #4637: HADOOP-18344. Upgrade AWS SDK to 1.12.262
mukund-thakur commented on code in PR #4637: URL: https://github.com/apache/hadoop/pull/4637#discussion_r931496072 ## hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/testing.md: ## @@ -1274,6 +1282,21 @@ bin/hadoop s3guard markers -clean -verbose $BUCKET # expect success and exit code of 0 bin/hadoop s3guard markers -audit -verbose $BUCKET +# --- +# Copy to from local +# --- + +time bin/hadoop fs -copyFromLocal -t 10 share/hadoop/tools/lib/*aws*jar $BUCKET/ + +# expect the iostatistics object_list_request value to be O(directories) +bin/hadoop fs -ls -R $BUCKET/ + +# expect the iostatistics object_list_request and op_get_content_summary values to be 1 Review Comment: Yes, there is an extra space here at the end, as pointed out by Yetus.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4587: YARN-11200 numa support in branch-2.10
hadoop-yetus commented on PR #4587: URL: https://github.com/apache/hadoop/pull/4587#issuecomment-1197251385 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 44s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 0s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ branch-2.10 Compile Tests _ | | +0 :ok: | mvndep | 3m 30s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 14m 4s | | branch-2.10 passed | | +1 :green_heart: | compile | 7m 41s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 7m 0s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 1m 40s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 3m 32s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 3m 27s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 3m 7s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 6m 16s | | branch-2.10 passed | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 26s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 56s | | the patch passed | | +1 :green_heart: | compile | 6m 55s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 6m 55s | | the patch passed | | +1 :green_heart: | compile | 6m 59s | | the patch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 6m 59s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 33s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 20s | | the patch passed | | +1 :green_heart: | javadoc | 3m 6s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 2m 54s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 6m 16s | | the patch passed | _ Other Tests _ | | +1 :green_heart: | unit | 1m 12s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 3m 51s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 16m 6s | | hadoop-yarn-server-nodemanager in the patch passed. | | +1 :green_heart: | asflicense | 1m 8s | | The patch does not generate ASF License warnings. | | | | 115m 16s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/9/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4587 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux e35044e8cdac 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / c51ce6d41e660b9dfda0a31ff470accaa27b115b | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/9/testReport/ | | Max. 
process+thread count | 185 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager U: hadoop-yarn-project/hadoop-yarn | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/9/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
[GitHub] [hadoop] hadoop-yetus commented on pull request #4587: YARN-11200 numa support in branch-2.10
hadoop-yetus commented on PR #4587: URL: https://github.com/apache/hadoop/pull/4587#issuecomment-1197251327 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 9s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 1s | | codespell was not available. | | +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. | | +0 :ok: | xmllint | 0m 1s | | xmllint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. | _ branch-2.10 Compile Tests _ | | +0 :ok: | mvndep | 3m 27s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 14m 1s | | branch-2.10 passed | | +1 :green_heart: | compile | 7m 41s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 6m 55s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | checkstyle | 1m 34s | | branch-2.10 passed | | +1 :green_heart: | mvnsite | 3m 27s | | branch-2.10 passed | | +1 :green_heart: | javadoc | 3m 20s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 3m 8s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 6m 10s | | branch-2.10 passed | _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 25s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 52s | | the patch passed | | +1 :green_heart: | compile | 6m 48s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 6m 48s | | the patch passed | | +1 :green_heart: | compile | 7m 0s | | the patch passed with JDK Private 
Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | javac | 7m 0s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 31s | | the patch passed | | +1 :green_heart: | mvnsite | 3m 20s | | the patch passed | | +1 :green_heart: | javadoc | 3m 8s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 2m 55s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | +1 :green_heart: | spotbugs | 6m 15s | | the patch passed | _ Other Tests _ | | +1 :green_heart: | unit | 1m 12s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 3m 50s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 16m 6s | | hadoop-yarn-server-nodemanager in the patch passed. | | +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. | | | | 113m 22s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/10/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4587 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint | | uname | Linux 65b77e696592 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / c51ce6d41e660b9dfda0a31ff470accaa27b115b | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/10/testReport/ | | Max. 
process+thread count | 179 (vs. ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-api hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager U: hadoop-yarn-project/hadoop-yarn | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/10/console | | versions | git=2.17.1 maven=3.6.0 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795803=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795803 ] ASF GitHub Bot logged work on HADOOP-18344: --- Author: ASF GitHub Bot Created on: 27/Jul/22 19:02 Start Date: 27/Jul/22 19:02 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on PR #4646: URL: https://github.com/apache/hadoop/pull/4646#issuecomment-1197247524 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 11m 52s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | _ branch-3.3.4 Compile Tests _ | | +0 :ok: | mvndep | 4m 59s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 35m 6s | | branch-3.3.4 passed | | +1 :green_heart: | compile | 19m 52s | | branch-3.3.4 passed | | +1 :green_heart: | mvnsite | 2m 7s | | branch-3.3.4 passed | | +1 :green_heart: | javadoc | 2m 8s | | branch-3.3.4 passed | | +1 :green_heart: | shadedclient | 90m 54s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 47s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 1m 2s | | the patch passed | | +1 :green_heart: | compile | 18m 56s | | the patch passed | | +1 :green_heart: | javac | 18m 56s | | the patch passed | | -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply | | +1 :green_heart: | mvnsite | 2m 2s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 1m 51s | | the patch passed | | +1 :green_heart: | shadedclient | 28m 55s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 0m 49s | | hadoop-project in the patch passed. | | +1 :green_heart: | unit | 2m 44s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 1m 17s | | The patch does not generate ASF License warnings. | | | | 162m 42s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/4646 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml markdownlint | | uname | Linux c59ed72f9b24 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-3.3.4 / a5e73f562b70d2f292c415d852a6668060a72151 | | Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/testReport/ | | Max. process+thread count | 515 (vs. 
ulimit of 5500) | | modules | C: hadoop-project hadoop-tools/hadoop-aws U: . | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/console | | versions | git=2.17.1 maven=3.6.0 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. Issue Time Tracking --- Worklog Id: (was: 795803) Time Spent: 3h (was: 2h 50m) > AWS SDK update to 1.12.262 to address jackson CVE-2018-7489 > > > Key: HADOOP-18344 > URL: https://issues.apache.org/jira/browse/HADOOP-18344 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.4.0, 3.3.4 >
[GitHub] [hadoop] hadoop-yetus commented on pull request #4646: HADOOP-18344. Upgrade AWS SDK to 1.12.262
hadoop-yetus commented on PR #4646: URL: https://github.com/apache/hadoop/pull/4646#issuecomment-1197247524

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 11m 52s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ branch-3.3.4 Compile Tests _ |
| +0 :ok: | mvndep | 4m 59s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 35m 6s | | branch-3.3.4 passed |
| +1 :green_heart: | compile | 19m 52s | | branch-3.3.4 passed |
| +1 :green_heart: | mvnsite | 2m 7s | | branch-3.3.4 passed |
| +1 :green_heart: | javadoc | 2m 8s | | branch-3.3.4 passed |
| +1 :green_heart: | shadedclient | 90m 54s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 47s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 2s | | the patch passed |
| +1 :green_heart: | compile | 18m 56s | | the patch passed |
| +1 :green_heart: | javac | 18m 56s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | mvnsite | 2m 2s | | the patch passed |
| +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | javadoc | 1m 51s | | the patch passed |
| +1 :green_heart: | shadedclient | 28m 55s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 49s | | hadoop-project in the patch passed. |
| +1 :green_heart: | unit | 2m 44s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 17s | | The patch does not generate ASF License warnings. |
| | | 162m 42s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4646 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml markdownlint |
| uname | Linux c59ed72f9b24 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3.4 / a5e73f562b70d2f292c415d852a6668060a72151 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/testReport/ |
| Max. process+thread count | 515 (vs. ulimit of 5500) |
| modules | C: hadoop-project hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4646/1/console |
| versions | git=2.17.1 maven=3.6.0 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795801&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795801 ]

ASF GitHub Bot logged work on HADOOP-18344:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 27/Jul/22 18:54
Start Date: 27/Jul/22 18:54
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4645: URL: https://github.com/apache/hadoop/pull/4645#issuecomment-1197235449 :broken_heart: **-1 overall** (full Yetus report in the GitHub comment below)

Issue Time Tracking
-------------------
Worklog Id: (was: 795801)
Time Spent: 2h 50m (was: 2h 40m)

> AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
>
> Key: HADOOP-18344
> URL: https://issues.apache.org/jira/browse/HADOOP-18344
> Project: Hadoop Common
> Issue Type: Sub-task
>
[GitHub] [hadoop] hadoop-yetus commented on pull request #4645: HADOOP-18344. Upgrade AWS SDK to 1.12.262
hadoop-yetus commented on PR #4645: URL: https://github.com/apache/hadoop/pull/4645#issuecomment-1197235449

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 39s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ branch-3.3 Compile Tests _ |
| +0 :ok: | mvndep | 14m 31s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 27m 34s | | branch-3.3 passed |
| +1 :green_heart: | compile | 21m 2s | | branch-3.3 passed |
| +1 :green_heart: | mvnsite | 2m 23s | | branch-3.3 passed |
| +1 :green_heart: | javadoc | 2m 12s | | branch-3.3 passed |
| +1 :green_heart: | shadedclient | 94m 50s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 49s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 8s | | the patch passed |
| +1 :green_heart: | compile | 19m 39s | | the patch passed |
| +1 :green_heart: | javac | 19m 39s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4645/1/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | mvnsite | 2m 23s | | the patch passed |
| +1 :green_heart: | javadoc | 2m 2s | | the patch passed |
| +1 :green_heart: | shadedclient | 33m 43s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 59s | | hadoop-project in the patch passed. |
| +1 :green_heart: | unit | 3m 1s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 24s | | The patch does not generate ASF License warnings. |
| | | 159m 5s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4645/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4645 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint markdownlint |
| uname | Linux b2b2d27bc426 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-3.3 / 1993b21a560db74204efdbf39dc1977520d2e846 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4645/1/testReport/ |
| Max. process+thread count | 551 (vs. ulimit of 5500) |
| modules | C: hadoop-project hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4645/1/console |
| versions | git=2.17.1 maven=3.6.0 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795797&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795797 ]

ASF GitHub Bot logged work on HADOOP-18344:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 27/Jul/22 18:51
Start Date: 27/Jul/22 18:51
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on PR #4637: URL: https://github.com/apache/hadoop/pull/4637#issuecomment-1197231733 :broken_heart: **-1 overall** (full Yetus report in the GitHub comment below)
[GitHub] [hadoop] hadoop-yetus commented on pull request #4637: HADOOP-18344. Upgrade AWS SDK to 1.12.262
hadoop-yetus commented on PR #4637: URL: https://github.com/apache/hadoop/pull/4637#issuecomment-1197231733

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 5s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 0s | | xmllint was not available. |
| +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 4s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 2s | | trunk passed |
| +1 :green_heart: | compile | 26m 24s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 21m 47s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | mvnsite | 2m 21s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 52s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 2m 9s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | shadedclient | 119m 5s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 47s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 3s | | the patch passed |
| +1 :green_heart: | compile | 22m 27s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 22m 27s | | the patch passed |
| +1 :green_heart: | compile | 20m 55s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 20m 55s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4637/3/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | mvnsite | 2m 39s | | the patch passed |
| +1 :green_heart: | javadoc | 2m 21s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 2m 28s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | shadedclient | 33m 5s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 1m 9s | | hadoop-project in the patch passed. |
| +1 :green_heart: | unit | 3m 22s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 37s | | The patch does not generate ASF License warnings. |
| | | 206m 49s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4637/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4637 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint markdownlint |
| uname | Linux 0a1dfd256d92 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 7e1b620d2fea3a4003a80cdddc5aa14f0812e388 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4637/3/testReport/ |
| Max. process+thread count | 648 (vs. ulimit of 5500) |
| modules | C: hadoop-project hadoop-tools/hadoop-aws U: . |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4637/3/console |
| versions | git=2.25.1
[jira] [Work logged] (HADOOP-18345) Enhance client protocol to propagate last seen state IDs for multiple nameservices.
[ https://issues.apache.org/jira/browse/HADOOP-18345?focusedWorklogId=795794&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795794 ]

ASF GitHub Bot logged work on HADOOP-18345:
-------------------------------------------
Author: ASF GitHub Bot
Created on: 27/Jul/22 18:43
Start Date: 27/Jul/22 18:43
Worklog Time Spent: 10m
Work Description: simbadzina commented on code in PR #4584: URL: https://github.com/apache/hadoop/pull/4584#discussion_r931470957 (the review comment is quoted in the GitHub message below)

Issue Time Tracking
-------------------
Worklog Id: (was: 795794)
Time Spent: 1h 50m (was: 1h 40m)

> Enhance client protocol to propagate last seen state IDs for multiple nameservices.
>
> Key: HADOOP-18345
> URL: https://issues.apache.org/jira/browse/HADOOP-18345
> Project: Hadoop Common
> Issue Type: New Feature
> Reporter: Simbarashe Dzinamarira
> Assignee: Simbarashe Dzinamarira
> Priority: Major
> Labels: pull-request-available
> Time Spent: 1h 50m
> Remaining Estimate: 0h
>
> The RPCHeader in the client protocol currently contains a single value to
> indicate the last seen state ID for a namenode.
> {noformat}
> optional int64 stateId = 8; // The last seen Global State ID
> {noformat}
> When there are multiple namenodes, such as in router based federation, the
> headers need to carry the state IDs for each of these nameservices that are
> part of the federation.
> This change is a prerequisite for HDFS-13522: RBF: Support observer node from
> Router-Based Federation

--
This message was sent by Atlassian Jira (v8.20.10#820010)

To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-15066) Spurious error stopping secure datanode
[ https://issues.apache.org/jira/browse/HADOOP-15066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17572083#comment-17572083 ]

Sergey Shevchenko commented on HADOOP-15066:
--------------------------------------------
For some reason this fix for hadoop-functions.sh is missing from apache-hadoop-3.3.3

> Spurious error stopping secure datanode
>
> Key: HADOOP-15066
> URL: https://issues.apache.org/jira/browse/HADOOP-15066
> Project: Hadoop Common
> Issue Type: Bug
> Components: scripts
> Affects Versions: 3.0.0
> Reporter: Arpit Agarwal
> Assignee: Bharat Viswanadham
> Priority: Major
> Attachments: HADOOP-15066.00.patch, HADOOP-15066.01.patch
>
> There is a spurious error when stopping a secure datanode.
> {code}
> # hdfs --daemon stop datanode
> cat: /var/run/hadoop/hdfs//hadoop-hdfs-root-datanode.pid: No such file or directory
> WARNING: pid has changed for datanode, skip deleting pid file
> cat: /var/run/hadoop/hdfs//hadoop-hdfs-root-datanode.pid: No such file or directory
> WARNING: daemon pid has changed for datanode, skip deleting daemon pid file
> {code}
> The error appears benign. The service was stopped correctly.
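The quoted `{code}` block shows `cat` failing on a pid file that no longer exists. A minimal sketch of the guard pattern the HADOOP-15066 patch applies in hadoop-functions.sh (the function name and paths below are illustrative, not Hadoop's actual names):

```shell
#!/usr/bin/env bash
# Only read a pid file if it is present, so that stopping an
# already-stopped daemon does not print a spurious
# "cat: ...: No such file or directory" error.
# read_pidfile is a hypothetical helper for illustration.

read_pidfile() {
  local pidfile="$1"
  if [[ -f "${pidfile}" ]]; then
    cat "${pidfile}"
  fi
  # the if-statement returns 0 when the file is absent, so callers
  # can treat "no output" as "no running daemon" without an error
}

pidfile="/tmp/demo-datanode.pid"
echo 12345 > "${pidfile}"
read_pidfile "${pidfile}"   # prints 12345

rm -f "${pidfile}"
read_pidfile "${pidfile}"   # prints nothing and emits no error message
```

The unguarded version (`pid=$(cat "${pidfile}")`) produces exactly the `cat: ... No such file or directory` noise quoted in the issue when the daemon was already stopped.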
[GitHub] [hadoop] simbadzina commented on a diff in pull request #4584: HADOOP-18345: Enhance client protocol to propagate last seen state IDs for multiple nameservices.
simbadzina commented on code in PR #4584: URL: https://github.com/apache/hadoop/pull/4584#discussion_r931470957

## hadoop-common-project/hadoop-common/src/main/proto/RpcHeader.proto:
@@ -157,6 +158,7 @@ message RpcResponseHeaderProto {
   optional bytes clientId = 7; // Globally unique client ID
   optional sint32 retryCount = 8 [default = -1];
   optional int64 stateId = 9; // The last written Global State ID
+  map<string, int64> nameserviceStateIds = 10; // Last seen state IDs for multiple nameservices.

Review Comment: Yes, it should be optional. Thanks for spotting the error. I'll fix that. I'm also going to make this a bytearray since the client doesn't need to parse it. This will give us the flexibility to evolve the contents in the routers.
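The review thread weighs a per-nameservice map against an opaque byte array. Both options can be sketched in proto form; this is a reading of the discussion, not the committed Hadoop header, and the byte-array field name is invented for illustration:

```proto
syntax = "proto2";

message RpcResponseHeaderProto {
  // Existing fields from the diff under review:
  optional bytes clientId = 7;                  // globally unique client ID
  optional sint32 retryCount = 8 [default = -1];
  optional int64 stateId = 9;                   // last written global state ID

  // Option discussed in the PR: an explicit per-nameservice map,
  // which every client would have to parse.
  // map<string, int64> nameserviceStateIds = 10;

  // Direction favored in the comment: an opaque byte array interpreted
  // only by the routers, leaving room to evolve its contents without
  // changing the client protocol. (Hypothetical field name.)
  optional bytes federatedNamespaceStateIds = 10;
}
```

The byte-array choice trades client-side visibility for forward compatibility: routers can change the serialized payload format without another proto revision.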
[jira] [Resolved] (HADOOP-18079) Upgrade Netty to 4.1.77.Final
[ https://issues.apache.org/jira/browse/HADOOP-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wei-Chiu Chuang resolved HADOOP-18079.
--------------------------------------
Resolution: Fixed

> Upgrade Netty to 4.1.77.Final
>
> Key: HADOOP-18079
> URL: https://issues.apache.org/jira/browse/HADOOP-18079
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Affects Versions: 3.3.3
> Reporter: Renukaprasad C
> Assignee: Wei-Chiu Chuang
> Priority: Major
> Labels: pull-request-available
> Fix For: 3.4.0, 3.3.4, 3.2.5
> Time Spent: 5h 20m
> Remaining Estimate: 0h
>
> h4. Netty version - 4.1.71 has fix some CVEs.
> CVE-2019-20444, CVE-2019-20445, CVE-2022-24823
> Upgrade to latest version.
[GitHub] [hadoop] hadoop-yetus commented on pull request #4632: YARN-5871. [RESERVATION] Add support for reservation-based routing.
hadoop-yetus commented on PR #4632: URL: https://github.com/apache/hadoop/pull/4632#issuecomment-1197186289

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 10s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 10 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 59s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 31m 12s | | trunk passed |
| +1 :green_heart: | compile | 4m 18s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 3m 32s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 30s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 4s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 53s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 37s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 52s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 58s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 28s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 34s | | the patch passed |
| +1 :green_heart: | compile | 4m 0s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | cc | 4m 0s | | the patch passed |
| -1 :x: | javac | 4m 0s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/6/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 3 new + 447 unchanged - 0 fixed = 450 total (was 447) |
| +1 :green_heart: | compile | 3m 18s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | cc | 3m 18s | | the patch passed |
| -1 :x: | javac | 3m 18s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/6/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 3 new + 371 unchanged - 0 fixed = 374 total (was 371) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 15s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/6/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server: The patch generated 18 new + 12 unchanged - 1 fixed = 30 total (was 13) |
| +1 :green_heart: | mvnsite | 1m 42s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 24s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 19s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 43s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 35s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 2m 52s |
[jira] [Commented] (HADOOP-18079) Upgrade Netty to 4.1.77.Final
[ https://issues.apache.org/jira/browse/HADOOP-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17572078#comment-17572078 ]

Wei-Chiu Chuang commented on HADOOP-18079:
------------------------------------------
Added into 3.3.4.

> Upgrade Netty to 4.1.77.Final
>
> Key: HADOOP-18079
> URL: https://issues.apache.org/jira/browse/HADOOP-18079
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Affects Versions: 3.3.3
> Reporter: Renukaprasad C
> Assignee: Wei-Chiu Chuang
> Priority: Major
> Labels: pull-request-available
> Fix For: 3.4.0, 3.3.4, 3.2.5
> Time Spent: 5h 20m
> Remaining Estimate: 0h
>
> h4. Netty version - 4.1.71 has fix some CVEs.
> CVE-2019-20444, CVE-2019-20445, CVE-2022-24823
> Upgrade to latest version.
[jira] [Updated] (HADOOP-18079) Upgrade Netty to 4.1.77.Final
[ https://issues.apache.org/jira/browse/HADOOP-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wei-Chiu Chuang updated HADOOP-18079:
-------------------------------------
Fix Version/s: 3.3.4
               (was: 3.3.9)

> Upgrade Netty to 4.1.77.Final
>
> Key: HADOOP-18079
> URL: https://issues.apache.org/jira/browse/HADOOP-18079
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Affects Versions: 3.3.3
> Reporter: Renukaprasad C
> Assignee: Wei-Chiu Chuang
> Priority: Major
> Labels: pull-request-available
> Fix For: 3.4.0, 3.3.4, 3.2.5
> Time Spent: 5h 20m
> Remaining Estimate: 0h
>
> h4. Netty version - 4.1.71 has fix some CVEs.
> CVE-2019-20444, CVE-2019-20445, CVE-2022-24823
> Upgrade to latest version.
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795785=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795785 ]

ASF GitHub Bot logged work on HADOOP-18344:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 27/Jul/22 18:14
            Start Date: 27/Jul/22 18:14
    Worklog Time Spent: 10m

Work Description: mukund-thakur commented on PR #4637:
URL: https://github.com/apache/hadoop/pull/4637#issuecomment-1197127350

   Okay looks good. running the tests once complete will give a +1

Issue Time Tracking
-------------------

    Worklog Id:     (was: 795785)
    Time Spent: 2.5h  (was: 2h 20m)

> AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
> -----------------------------------------------------------
>
>                 Key: HADOOP-18344
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18344
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0, 3.3.4
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> yet another jackson CVE in aws sdk
> https://github.com/apache/hadoop/pull/4491/commits/5496816b472473eb7a9c174b7d3e69b6eee1e271
> maybe we need to have a list of all shaded jackson's we get on the CP and
> have a process of upgrading them all at the same time

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18079) Upgrade Netty to 4.1.77.Final
[ https://issues.apache.org/jira/browse/HADOOP-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17572053#comment-17572053 ]

Wei-Chiu Chuang commented on HADOOP-18079:
------------------------------------------

Sure. Patch applies cleanly. I'll push up after making sure the build does not break.

> Upgrade Netty to 4.1.77.Final
> -----------------------------
>
>                 Key: HADOOP-18079
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18079
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 3.3.3
>            Reporter: Renukaprasad C
>            Assignee: Wei-Chiu Chuang
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 3.4.0, 3.3.9, 3.2.5
>
>          Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> h4. Netty version 4.1.71 has fixes for some CVEs:
> CVE-2019-20444,
> CVE-2019-20445,
> CVE-2022-24823
> Upgrade to the latest version.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17572044#comment-17572044 ]

Steve Loughran commented on HADOOP-18344:
-----------------------------------------

had just seen it myself. thank you for fixing in HADOOP-18372.

> AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
> -----------------------------------------------------------
>
>                 Key: HADOOP-18344
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18344
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0, 3.3.4
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 2h 20m
>  Remaining Estimate: 0h
>
> yet another jackson CVE in aws sdk
> https://github.com/apache/hadoop/pull/4491/commits/5496816b472473eb7a9c174b7d3e69b6eee1e271
> maybe we need to have a list of all shaded jackson's we get on the CP and
> have a process of upgrading them all at the same time

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-18227) Add input stream IOstats for vectored IO api in S3A.
[ https://issues.apache.org/jira/browse/HADOOP-18227?focusedWorklogId=795770=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795770 ]

ASF GitHub Bot logged work on HADOOP-18227:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 27/Jul/22 17:27
            Start Date: 27/Jul/22 17:27
    Worklog Time Spent: 10m

Work Description: steveloughran commented on code in PR #4636:
URL: https://github.com/apache/hadoop/pull/4636#discussion_r931322342

## hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/contract/AbstractContractVectoredReadTest.java: ##

@@ -60,9 +60,9 @@ public abstract class AbstractContractVectoredReadTest extends AbstractFSContractTestBase
   protected static final byte[] DATASET = ContractTestUtils.dataset(DATASET_LEN, 'a', 32);
   protected static final String VECTORED_READ_FILE_NAME = "vectored_file.txt";
-  private final IntFunction<ByteBuffer> allocate;
+  protected final IntFunction<ByteBuffer> allocate;

Review Comment:
   keep private and add getters

## hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3AInstrumentation.java: ##

@@ -835,6 +839,10 @@ private InputStreamStatistics(
       StreamStatisticNames.STREAM_READ_SEEK_BYTES_DISCARDED,
       StreamStatisticNames.STREAM_READ_SEEK_BYTES_SKIPPED,
       StreamStatisticNames.STREAM_READ_TOTAL_BYTES,
+      StreamStatisticNames.STREAM_READ_VECTORED_OPERATIONS,

Review Comment:
   can you keep in alphabetical order

Issue Time Tracking
-------------------

    Worklog Id:     (was: 795770)
    Time Spent: 40m  (was: 0.5h)

> Add input stream IOstats for vectored IO api in S3A.
> ----------------------------------------------------
>
>                 Key: HADOOP-18227
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18227
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>            Reporter: Mukund Thakur
>            Assignee: Mukund Thakur
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 40m
>  Remaining Estimate: 0h
>

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
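The "keep private and add getters" suggestion is the usual way to give subclasses read access to a test-fixture field without widening the field's own visibility. A minimal sketch of that pattern in plain Java — the class below is a hypothetical stand-in, not the actual Hadoop contract-test code:

```java
import java.nio.ByteBuffer;
import java.util.function.IntFunction;

// Hypothetical stand-in for the contract test base class under review.
public class VectoredReadTestBase {
  // The field stays private, as the reviewer asked...
  private final IntFunction<ByteBuffer> allocate;

  public VectoredReadTestBase(IntFunction<ByteBuffer> allocate) {
    this.allocate = allocate;
  }

  // ...and subclasses read it through a protected accessor instead.
  protected IntFunction<ByteBuffer> getAllocate() {
    return allocate;
  }

  public static void main(String[] args) {
    VectoredReadTestBase base = new VectoredReadTestBase(ByteBuffer::allocate);
    // Subclasses would call getAllocate() rather than touching the field.
    System.out.println(base.getAllocate().apply(64).capacity()); // prints 64
  }
}
```

The accessor keeps the field immutable from the subclass's point of view, which is why reviewers generally prefer it to a `protected final` field.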
[jira] [Work logged] (HADOOP-18340) deleteOnExit does not work with S3AFileSystem
[ https://issues.apache.org/jira/browse/HADOOP-18340?focusedWorklogId=795768=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795768 ]

ASF GitHub Bot logged work on HADOOP-18340:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 27/Jul/22 17:21
            Start Date: 27/Jul/22 17:21
    Worklog Time Spent: 10m

Work Description: huaxiangsun commented on PR #4608:
URL: https://github.com/apache/hadoop/pull/4608#issuecomment-1197071839

   Hi @steveloughran, can you take a look at the patch? Thanks.

Issue Time Tracking
-------------------

    Worklog Id:     (was: 795768)
    Time Spent: 1h  (was: 50m)

> deleteOnExit does not work with S3AFileSystem
> ---------------------------------------------
>
>                 Key: HADOOP-18340
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18340
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.3.3
>            Reporter: Huaxiang Sun
>            Priority: Minor
>              Labels: pull-request-available
>          Time Spent: 1h
>  Remaining Estimate: 0h
>
> When deleteOnExit is set on some paths, they are not removed when the file
> system object is closed. The following exception is logged when printing out
> the exception in the info log.
> {code:java}
> 2022-07-15 19:29:12,552 [main] INFO fs.FileSystem (FileSystem.java:processDeleteOnExit(1810)) - Ignoring failure to deleteOnExit for path /file, exception {}
> java.io.IOException: s3a://mock-bucket: FileSystem is closed!
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.checkNotClosed(S3AFileSystem.java:3887)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2333)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2355)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.exists(S3AFileSystem.java:4402)
>     at org.apache.hadoop.fs.FileSystem.processDeleteOnExit(FileSystem.java:1805)
>     at org.apache.hadoop.fs.FileSystem.close(FileSystem.java:2669)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.close(S3AFileSystem.java:3830)
>     at org.apache.hadoop.fs.s3a.TestS3AGetFileStatus.testFile(TestS3AGetFileStatus.java:87)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:258)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>     at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>     at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>     at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>     at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
>     at
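The stack trace above shows the ordering problem at the heart of HADOOP-18340: the close path marks the filesystem closed before the base class walks the deleteOnExit set, so every `exists()` probe inside `processDeleteOnExit()` trips the "FileSystem is closed" guard. A stripped-down reproduction of that ordering with stand-in classes (these are illustrative, not the real Hadoop `FileSystem`/`S3AFileSystem` classes):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Minimal stand-in for the base FileSystem behavior.
class BaseFs {
  final Set<String> deleteOnExit = new HashSet<>();
  final List<String> failures = new ArrayList<>();

  // Walks the deleteOnExit set; mirrors FileSystem.processDeleteOnExit.
  void processDeleteOnExit() {
    for (String path : deleteOnExit) {
      try {
        if (exists(path)) {
          System.out.println("deleted " + path);
        }
      } catch (IllegalStateException e) {
        // Mirrors the "Ignoring failure to deleteOnExit" log line in the trace.
        failures.add(path);
        System.out.println("ignoring failure to deleteOnExit for " + path
            + ": " + e.getMessage());
      }
    }
    deleteOnExit.clear();
  }

  boolean exists(String path) { return true; }
}

// Stand-in for an FS whose exists() refuses to run once the FS is closed.
public class S3LikeFs extends BaseFs {
  private boolean closed;

  @Override
  boolean exists(String path) {
    if (closed) {
      throw new IllegalStateException("FileSystem is closed!");
    }
    return true;
  }

  void close() {
    closed = true;          // the closed flag is flipped first...
    processDeleteOnExit();  // ...so every exists() probe now fails
  }

  public static void main(String[] args) {
    S3LikeFs fs = new S3LikeFs();
    fs.deleteOnExit.add("/file");
    fs.close(); // prints the "ignoring failure" line instead of deleting /file
  }
}
```

The fix direction discussed on the PR is to run the deleteOnExit processing before the filesystem marks itself closed, so the existence checks and deletes can still go through.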
[GitHub] [hadoop] hadoop-yetus commented on pull request #4632: YARN-5871. [RESERVATION] Add support for reservation-based routing.
hadoop-yetus commented on PR #4632:
URL: https://github.com/apache/hadoop/pull/4632#issuecomment-1197070874

   :broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 58s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 10 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 48s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 21s | | trunk passed |
| +1 :green_heart: | compile | 4m 13s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 3m 25s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 30s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 2s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 50s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 35s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 48s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 53s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 30s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 46s | | the patch passed |
| +1 :green_heart: | compile | 4m 48s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | cc | 4m 48s | | the patch passed |
| -1 :x: | javac | 4m 48s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/5/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 3 new + 444 unchanged - 0 fixed = 447 total (was 444) |
| +1 :green_heart: | compile | 3m 31s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | cc | 3m 31s | | the patch passed |
| -1 :x: | javac | 3m 31s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/5/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 3 new + 368 unchanged - 0 fixed = 371 total (was 368) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 13s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/5/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server: The patch generated 17 new + 13 unchanged - 0 fixed = 30 total (was 13) |
| +1 :green_heart: | mvnsite | 1m 41s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 23s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 16s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 39s | | the patch passed |
| +1 :green_heart: | shadedclient | 24m 30s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 2m 48s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit
[jira] [Commented] (HADOOP-18079) Upgrade Netty to 4.1.77.Final
[ https://issues.apache.org/jira/browse/HADOOP-18079?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17572041#comment-17572041 ]

Steve Loughran commented on HADOOP-18079:
-----------------------------------------

can you do this for branch 3.3.4? with this and the aws sdk update I could do the next RC

> Upgrade Netty to 4.1.77.Final
> -----------------------------
>
>                 Key: HADOOP-18079
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18079
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 3.3.3
>            Reporter: Renukaprasad C
>            Assignee: Wei-Chiu Chuang
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 3.4.0, 3.3.9, 3.2.5
>
>          Time Spent: 5h 20m
>  Remaining Estimate: 0h
>
> h4. Netty version 4.1.71 has fixes for some CVEs:
> CVE-2019-20444,
> CVE-2019-20445,
> CVE-2022-24823
> Upgrade to the latest version.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4597: HDFS-16671. RBF: RouterRpcFairnessPolicyController supports configurable permit acquire timeout
hadoop-yetus commented on PR #4597:
URL: https://github.com/apache/hadoop/pull/4597#issuecomment-1197060602

   :confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 16s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 41m 55s | | trunk passed |
| +1 :green_heart: | compile | 1m 3s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 0m 57s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 0m 47s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 57s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 7s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 13s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 2m 3s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 20s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 41s | | the patch passed |
| +1 :green_heart: | compile | 0m 44s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 0m 44s | | the patch passed |
| +1 :green_heart: | compile | 0m 40s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 0m 40s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 26s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 43s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 41s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 0m 58s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 1m 31s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 18s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 22m 11s | | hadoop-hdfs-rbf in the patch passed. |
| +1 :green_heart: | asflicense | 0m 47s | | The patch does not generate ASF License warnings. |
| | | 126m 54s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4597/6/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4597 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux 58ab2853a7b1 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / e5436d5a832fb785127e8ab33519820efdb8b564 |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4597/6/testReport/ |
| Max. process+thread count | 2242 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4597/6/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |

This message was automatically generated.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795763=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795763 ]

ASF GitHub Bot logged work on HADOOP-18344:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 27/Jul/22 17:09
            Start Date: 27/Jul/22 17:09
    Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on PR #4637:
URL: https://github.com/apache/hadoop/pull/4637#issuecomment-1197057816

   :broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 3s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +0 :ok: | markdownlint | 0m 1s | | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 14m 52s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 28m 10s | | trunk passed |
| +1 :green_heart: | compile | 24m 36s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 20m 46s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | mvnsite | 2m 41s | | trunk passed |
| +1 :green_heart: | javadoc | 2m 24s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 2m 27s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | shadedclient | 117m 43s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 50s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 0s | | the patch passed |
| +1 :green_heart: | compile | 24m 6s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javac | 24m 6s | | the patch passed |
| +1 :green_heart: | compile | 22m 46s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | javac | 22m 46s | | the patch passed |
| -1 :x: | blanks | 0m 0s | [/blanks-eol.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4637/2/artifact/out/blanks-eol.txt) | The patch has 1 line(s) that end in blanks. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | mvnsite | 2m 19s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 57s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 2m 9s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | shadedclient | 32m 22s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 1m 3s | | hadoop-project in the patch passed. |
| +1 :green_heart: | unit | 3m 21s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 1m 17s | | The patch does not generate ASF License warnings. |
| | | 207m 35s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4637/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4637 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint markdownlint |
| uname | Linux 4d5009939df0 4.15.0-65-generic #74-Ubuntu SMP Tue Sep 17 17:06:04 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 9800dd2f1d5ee5d6fd728aae748f6dd7d7897faa |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
[jira] [Commented] (HADOOP-18372) ILoadTestS3ABulkDeleteThrottling failing
[ https://issues.apache.org/jira/browse/HADOOP-18372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17572035#comment-17572035 ]

Steve Loughran commented on HADOOP-18372:
-----------------------------------------

fixed. FWIW I am surprised this ever worked. Probably not been run for a while, which is what comes from a test you have to remember to manually invoke.

> ILoadTestS3ABulkDeleteThrottling failing
> ----------------------------------------
>
>                 Key: HADOOP-18372
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18372
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3, test
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Assignee: Ahmar Suhail
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 3.3.9
>
>          Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> the test ILoadTestS3ABulkDeleteThrottling; looks like the fs config is being
> set up too late in the test suite. it should be moved from setup to createConf

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Resolved] (HADOOP-18372) ILoadTestS3ABulkDeleteThrottling failing
[ https://issues.apache.org/jira/browse/HADOOP-18372?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran resolved HADOOP-18372.
-------------------------------------
    Fix Version/s: 3.3.9
       Resolution: Fixed
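The root cause given in HADOOP-18372 is an initialization-order bug: per-test configuration was applied in `setup()`, after the filesystem had already been created from the configuration returned by `createConf`. The sketch below illustrates that ordering with plain Java stand-ins; `BaseSuite`, `initialize()`, and the map-based "configuration" are hypothetical simplifications, not the actual Hadoop contract-test classes.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for a contract-test base class. The framework "binds" the
// filesystem from the configuration returned by createConf() BEFORE setup()
// runs, mirroring the ordering that caused the test failure.
class BaseSuite {
    protected Map<String, String> conf;
    protected Map<String, String> boundFsConf;   // what the filesystem actually sees

    protected Map<String, String> createConf() {
        return new HashMap<>();
    }

    // Lifecycle: conf is created and snapshotted first, then setup() runs.
    final void initialize() {
        conf = createConf();
        boundFsConf = new HashMap<>(conf);       // later edits to conf are ignored
        setup();
    }

    protected void setup() { }
}

// Too late: mutating the conf in setup() never reaches the bound filesystem.
class BrokenThrottlingTest extends BaseSuite {
    @Override protected void setup() {
        conf.put("fs.s3a.bulk.delete.page.size", "50");
    }
}

// The fix: override createConf() so the option is present at bind time.
class FixedThrottlingTest extends BaseSuite {
    @Override protected Map<String, String> createConf() {
        Map<String, String> c = super.createConf();
        c.put("fs.s3a.bulk.delete.page.size", "50");
        return c;
    }
}

public class CreateConfTiming {
    public static void main(String[] args) {
        BrokenThrottlingTest broken = new BrokenThrottlingTest();
        broken.initialize();
        FixedThrottlingTest fixed = new FixedThrottlingTest();
        fixed.initialize();
        System.out.println("broken sees option: "
            + broken.boundFsConf.containsKey("fs.s3a.bulk.delete.page.size"));
        System.out.println("fixed sees option:  "
            + fixed.boundFsConf.containsKey("fs.s3a.bulk.delete.page.size"));
    }
}
```

Running the sketch shows the broken variant's filesystem never sees the throttling option while the fixed one does, which is why the patch moved the settings into the conf-creation override.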
[jira] [Work logged] (HADOOP-18372) ILoadTestS3ABulkDeleteThrottling failing
[ https://issues.apache.org/jira/browse/HADOOP-18372?focusedWorklogId=795762&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795762 ]

ASF GitHub Bot logged work on HADOOP-18372:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 27/Jul/22 17:04
            Start Date: 27/Jul/22 17:04
    Worklog Time Spent: 10m
      Work Description: steveloughran commented on PR #4642:
URL: https://github.com/apache/hadoop/pull/4642#issuecomment-1197051767

   tested on branch-3.3; all good so merging there too

Issue Time Tracking
-------------------
    Worklog Id:     (was: 795762)
    Time Spent: 1h 50m  (was: 1h 40m)
[GitHub] [hadoop] steveloughran commented on pull request #4642: HADOOP-18372. ILoadTestS3ABulkDeleteThrottling failing.
steveloughran commented on PR #4642: URL: https://github.com/apache/hadoop/pull/4642#issuecomment-1197051767 tested on branch-3.3; all good so merging there too -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #4587: YARN-11200 numa support in branch-2.10
hadoop-yetus commented on PR #4587:
URL: https://github.com/apache/hadoop/pull/4587#issuecomment-1197043183

   :broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|::|--:|:|::|:---:|
| +0 :ok: | reexec | 8m 34s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | xmllint | 0m 1s | | xmllint was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
|||| _ branch-2.10 Compile Tests _ |
| +0 :ok: | mvndep | 3m 30s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 13m 23s | | branch-2.10 passed |
| +1 :green_heart: | compile | 7m 35s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 |
| +1 :green_heart: | compile | 6m 40s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| +1 :green_heart: | checkstyle | 1m 39s | | branch-2.10 passed |
| +1 :green_heart: | mvnsite | 3m 43s | | branch-2.10 passed |
| +1 :green_heart: | javadoc | 3m 38s | | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 |
| +1 :green_heart: | javadoc | 3m 17s | | branch-2.10 passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| +1 :green_heart: | spotbugs | 6m 5s | | branch-2.10 passed |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 29s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 52s | | the patch passed |
| +1 :green_heart: | compile | 6m 47s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 |
| +1 :green_heart: | javac | 6m 47s | | the patch passed |
| +1 :green_heart: | compile | 6m 35s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| +1 :green_heart: | javac | 6m 35s | | the patch passed |
| +1 :green_heart: | blanks | 0m 1s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 29s | | the patch passed |
| +1 :green_heart: | mvnsite | 3m 17s | | the patch passed |
| +1 :green_heart: | javadoc | 3m 10s | | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 |
| +1 :green_heart: | javadoc | 2m 54s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| +1 :green_heart: | spotbugs | 5m 52s | | the patch passed |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 1m 11s | | hadoop-yarn-api in the patch passed. |
| +1 :green_heart: | unit | 3m 50s | | hadoop-yarn-common in the patch passed. |
| -1 :x: | unit | 16m 6s | [/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/8/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager.txt) | hadoop-yarn-server-nodemanager in the patch passed. |
| +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. |
| | | | 120m 11s | | |

| Reason | Tests |
|---:|:--|
| Failed junit tests | hadoop.yarn.server.nodemanager.amrmproxy.TestFederationInterceptor |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/8/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/4587 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint |
| uname | Linux 90c6fa5a29fc 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | branch-2.10 / c12f934b31c0f181aa425f12730455065b0a012d |
| Default Java | Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~18.04-b07 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4587/8/testReport/ |
| Max. process+thread count | 178 (vs. ulimit of 5500) |
| modules | C:
[jira] [Created] (HADOOP-18374) DistCP: Aggregate IOStatistics Counters in MapReduce Counters
Steve Loughran created HADOOP-18374:
---------------------------------------

             Summary: DistCP: Aggregate IOStatistics Counters in MapReduce Counters
                 Key: HADOOP-18374
                 URL: https://issues.apache.org/jira/browse/HADOOP-18374
             Project: Hadoop Common
          Issue Type: Sub-task
          Components: tools/distcp
    Affects Versions: 3.3.9
            Reporter: Steve Loughran
            Assignee: Mehakmeet Singh


Distcp can collect IOStatisticsContext counter values and report them to the console. It can't do the timings in min/mean/max though, as there's no way to aggregate them properly.

# Publish statistics to MapReduce counters in the tasks within CopyMapper.copyFileWithRetry().
# The counters will be automatically logged in Job.monitorAndPrintJob() when DistCp is executed with the -verbose option; no need for changes there.
# We could also publish the IOStatistics means by publishing sample count and total sum as two separate counters.
# In AbstractContractDistCpTest, add an override point for subclasses to list which metrics they will issue; assert that values are generated.
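Point 3 of the proposal works because sums and counts aggregate cleanly across map tasks while means do not (the mean of per-task means is wrong when sample counts differ). A self-contained sketch of that idea, using plain maps instead of the real MapReduce `Counter` or IOStatistics APIs; all names here are illustrative:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: publish a mean statistic as two counters (total sum, sample count).
// MapReduce adds counters with the same name across tasks, so the job-level
// aggregation is a plain per-key sum, and the true mean is recovered at the end.
public class MeanAsTwoCounters {

    // Per-task "counter" publication: emit sum and count as separate counters.
    static void publish(Map<String, Long> counters, String stat, long sum, long samples) {
        counters.merge(stat + ".sum", sum, Long::sum);
        counters.merge(stat + ".count", samples, Long::sum);
    }

    // Job-level aggregation: merge per-task counter maps by summing each key.
    static Map<String, Long> aggregate(List<Map<String, Long>> perTask) {
        Map<String, Long> total = new HashMap<>();
        for (Map<String, Long> task : perTask) {
            task.forEach((k, v) -> total.merge(k, v, Long::sum));
        }
        return total;
    }

    static double mean(Map<String, Long> total, String stat) {
        return (double) total.get(stat + ".sum") / total.get(stat + ".count");
    }

    public static void main(String[] args) {
        // Task 1: 4 uploads totalling 800ms; task 2: 1 upload taking 100ms.
        Map<String, Long> t1 = new HashMap<>();
        publish(t1, "upload.duration", 800, 4);
        Map<String, Long> t2 = new HashMap<>();
        publish(t2, "upload.duration", 100, 1);

        Map<String, Long> job = aggregate(List.of(t1, t2));
        // True mean over all 5 samples is 900 / 5 = 180.0; averaging the two
        // per-task means would wrongly give (200 + 100) / 2 = 150.
        System.out.println(mean(job, "upload.duration"));
    }
}
```

This is why the JIRA proposes two counters per timed statistic rather than a single "mean" counter: min and max remain non-aggregatable this way, matching the caveat in the issue description.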
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795747&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795747 ]

ASF GitHub Bot logged work on HADOOP-18344:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 27/Jul/22 16:26
            Start Date: 27/Jul/22 16:26
    Worklog Time Spent: 10m
      Work Description: steveloughran commented on PR #4646:
URL: https://github.com/apache/hadoop/pull/4646#issuecomment-1196979143

   test failure due to the markdown test fix not backported
   ```
   [ERROR] testRunLimitedLandsatAudit(org.apache.hadoop.fs.s3a.tools.ITestMarkerTool)  Time elapsed: 2.484 s <<< FAILURE!
   java.lang.AssertionError: Expected an exception of type class org.apache.hadoop.util.ExitUtil$ExitException
      at org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:409)
      at org.apache.hadoop.fs.s3a.s3guard.S3GuardToolTestHelper.runS3GuardCommandToFailure(S3GuardToolTestHelper.java:163)
      at org.apache.hadoop.fs.s3a.tools.AbstractMarkerToolTest.runToFailure(AbstractMarkerToolTest.java:276)
      at org.apache.hadoop.fs.s3a.tools.ITestMarkerTool.testRunLimitedLandsatAudit(ITestMarkerTool.java:320)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   ```

Issue Time Tracking
-------------------
    Worklog Id:     (was: 795747)
    Time Spent: 2h 10m  (was: 2h)

> AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
> -----------------------------------------------------------
>
>                 Key: HADOOP-18344
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18344
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0, 3.3.4
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> yet another jackson CVE in aws sdk
> https://github.com/apache/hadoop/pull/4491/commits/5496816b472473eb7a9c174b7d3e69b6eee1e271
> maybe we need to have a list of all shaded jackson's we get on the CP and
> have a process of upgrading them all at the same time
[GitHub] [hadoop] steveloughran commented on pull request #4646: HADOOP-18344. Upgrade AWS SDK to 1.12.262
steveloughran commented on PR #4646:
URL: https://github.com/apache/hadoop/pull/4646#issuecomment-1196979143

   test failure due to the markdown test fix not backported
   ```
   [ERROR] testRunLimitedLandsatAudit(org.apache.hadoop.fs.s3a.tools.ITestMarkerTool)  Time elapsed: 2.484 s <<< FAILURE!
   java.lang.AssertionError: Expected an exception of type class org.apache.hadoop.util.ExitUtil$ExitException
      at org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:409)
      at org.apache.hadoop.fs.s3a.s3guard.S3GuardToolTestHelper.runS3GuardCommandToFailure(S3GuardToolTestHelper.java:163)
      at org.apache.hadoop.fs.s3a.tools.AbstractMarkerToolTest.runToFailure(AbstractMarkerToolTest.java:276)
      at org.apache.hadoop.fs.s3a.tools.ITestMarkerTool.testRunLimitedLandsatAudit(ITestMarkerTool.java:320)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   ```
[GitHub] [hadoop] hadoop-yetus commented on pull request #4632: YARN-5871. [RESERVATION] Add support for reservation-based routing.
hadoop-yetus commented on PR #4632:
URL: https://github.com/apache/hadoop/pull/4632#issuecomment-1196976712

   :broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|::|--:|:|::|:---:|
| +0 :ok: | reexec | 0m 51s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 1s | | detect-secrets was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +0 :ok: | buf | 0m 1s | | buf was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 10 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 15m 9s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 25m 21s | | trunk passed |
| +1 :green_heart: | compile | 4m 3s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | compile | 3m 28s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | checkstyle | 1m 30s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 18s | | trunk passed |
| +1 :green_heart: | javadoc | 2m 8s | | trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 54s | | trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 53s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 43s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 31s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 40s | | the patch passed |
| +1 :green_heart: | compile | 3m 47s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | cc | 3m 47s | | the patch passed |
| -1 :x: | javac | 3m 47s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/4/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 generated 3 new + 444 unchanged - 0 fixed = 447 total (was 444) |
| +1 :green_heart: | compile | 3m 17s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | cc | 3m 17s | | the patch passed |
| -1 :x: | javac | 3m 17s | [/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/4/artifact/out/results-compile-javac-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07.txt) | hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkPrivateBuild-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 generated 3 new + 368 unchanged - 0 fixed = 371 total (was 368) |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 14s | [/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4632/4/artifact/out/results-checkstyle-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server.txt) | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server: The patch generated 16 new + 13 unchanged - 0 fixed = 29 total (was 13) |
| +1 :green_heart: | mvnsite | 1m 49s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 29s | | the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 |
| +1 :green_heart: | javadoc | 1m 26s | | the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07 |
| +1 :green_heart: | spotbugs | 3m 40s | | the patch passed |
| +1 :green_heart: | shadedclient | 21m 15s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 3m 2s | | hadoop-yarn-server-common in the patch passed. |
| +1 :green_heart: | unit
[jira] [Work logged] (HADOOP-18372) ILoadTestS3ABulkDeleteThrottling failing
[ https://issues.apache.org/jira/browse/HADOOP-18372?focusedWorklogId=795741&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795741 ]

ASF GitHub Bot logged work on HADOOP-18372:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 27/Jul/22 16:19
            Start Date: 27/Jul/22 16:19
    Worklog Time Spent: 10m
      Work Description: steveloughran commented on PR #4642:
URL: https://github.com/apache/hadoop/pull/4642#issuecomment-1196970560

   thanks

Issue Time Tracking
-------------------
    Worklog Id:     (was: 795741)
    Time Spent: 1.5h  (was: 1h 20m)
[jira] [Work logged] (HADOOP-18372) ILoadTestS3ABulkDeleteThrottling failing
[ https://issues.apache.org/jira/browse/HADOOP-18372?focusedWorklogId=795742&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795742 ]

ASF GitHub Bot logged work on HADOOP-18372:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 27/Jul/22 16:19
            Start Date: 27/Jul/22 16:19
    Worklog Time Spent: 10m
      Work Description: steveloughran merged PR #4642:
URL: https://github.com/apache/hadoop/pull/4642

Issue Time Tracking
-------------------
    Worklog Id:     (was: 795742)
    Time Spent: 1h 40m  (was: 1.5h)
[GitHub] [hadoop] steveloughran merged pull request #4642: HADOOP-18372. ILoadTestS3ABulkDeleteThrottling failing.
steveloughran merged PR #4642:
URL: https://github.com/apache/hadoop/pull/4642
[GitHub] [hadoop] steveloughran commented on pull request #4642: HADOOP-18372. ILoadTestS3ABulkDeleteThrottling failing.
steveloughran commented on PR #4642:
URL: https://github.com/apache/hadoop/pull/4642#issuecomment-1196970560

   thanks
[jira] [Work logged] (HADOOP-18344) AWS SDK update to 1.12.262 to address jackson CVE-2018-7489
[ https://issues.apache.org/jira/browse/HADOOP-18344?focusedWorklogId=795740&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-795740 ]

ASF GitHub Bot logged work on HADOOP-18344:
-------------------------------------------
                Author: ASF GitHub Bot
            Created on: 27/Jul/22 16:18
            Start Date: 27/Jul/22 16:18
    Worklog Time Spent: 10m
      Work Description: steveloughran opened a new pull request, #4646:
URL: https://github.com/apache/hadoop/pull/4646

   Fixes CVE-2018-7489 in shaded jackson.
   +Add more commands in testing.md to the CLI tests needed when qualifying a release

   ### Description of PR

   #4637 / #4645 on branch-3.3.3

   ### How was this patch tested?

   tests in progress

   ### For code changes:

   - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
Issue Time Tracking
-------------------
    Worklog Id:     (was: 795740)
    Time Spent: 2h  (was: 1h 50m)
[GitHub] [hadoop] steveloughran opened a new pull request, #4646: HADOOP-18344. Upgrade AWS SDK to 1.12.262
steveloughran opened a new pull request, #4646:
URL: https://github.com/apache/hadoop/pull/4646

   Fixes CVE-2018-7489 in shaded jackson.
   +Add more commands in testing.md to the CLI tests needed when qualifying a release

   ### Description of PR

   #4637 / #4645 on branch-3.3.3

   ### How was this patch tested?

   tests in progress

   ### For code changes:

   - [ ] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?