[GitHub] [hadoop] hadoop-yetus commented on pull request #848: YARN-9579: the property of sharedcache in mapred-default.xml
hadoop-yetus commented on pull request #848:
URL: https://github.com/apache/hadoop/pull/848#issuecomment-937473819

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 14s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 45m 11s | | trunk passed |
| +1 :green_heart: | compile | 0m 39s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 0m 35s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | mvnsite | 0m 41s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 24s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 20s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | shadedclient | 70m 33s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 0m 36s | | the patch passed |
| +1 :green_heart: | compile | 0m 34s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 0m 34s | | the patch passed |
| +1 :green_heart: | compile | 0m 29s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 0m 29s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | mvnsite | 0m 35s | | the patch passed |
| +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | javadoc | 0m 15s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 15s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | shadedclient | 22m 24s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 5m 43s | | hadoop-mapreduce-client-core in the patch passed. |
| +1 :green_heart: | asflicense | 0m 28s | | The patch does not generate ASF License warnings. |
| | | 104m 18s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-848/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/848 |
| JIRA Issue | YARN-9579 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml |
| uname | Linux 28115a0a5dc3 4.15.0-142-generic #146-Ubuntu SMP Tue Apr 13 01:11:19 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 8d27fad76ea3e8b5689a87ebcc97177d280d0753 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-848/1/testReport/ |
| Max. process+thread count | 1082 (vs. ulimit of 5500) |
| modules | C: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core U: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-848/1/console |
| versions | git=2.25.1 maven=3.6.3 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
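For context on the property this PR documents: the YARN shared cache for MapReduce job resources is controlled by `mapreduce.job.sharedcache.mode` in mapred-site.xml (with its default documented in mapred-default.xml). A minimal illustrative fragment — the value list shown here is an example, not the patch's content:

```xml
<!-- Illustrative mapred-site.xml fragment: opt job jars and libjars into
     the YARN shared cache. The value is "disabled" (the default),
     "enabled", or a comma-separated list of resource categories. -->
<property>
  <name>mapreduce.job.sharedcache.mode</name>
  <value>jobjar,libjars</value>
</property>
```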
[jira] [Work logged] (HADOOP-17915) ABFS AbfsDelegationTokenManager to generate canonicalServiceName if DT plugin doesn't
[ https://issues.apache.org/jira/browse/HADOOP-17915?focusedWorklogId=661323&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661323 ]

ASF GitHub Bot logged work on HADOOP-17915:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 07/Oct/21 05:22
            Start Date: 07/Oct/21 05:22
    Worklog Time Spent: 10m
      Work Description: hadoop-yetus commented on pull request #3442:
URL: https://github.com/apache/hadoop/pull/3442#issuecomment-937460425

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 21m 36s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 50m 45s | | trunk passed |
| +1 :green_heart: | compile | 0m 48s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 0m 39s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 0m 27s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 44s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 34s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 40s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 29s | | trunk passed |
| +1 :green_heart: | shadedclient | 24m 49s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 0m 35s | | the patch passed |
| +1 :green_heart: | compile | 0m 31s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 0m 31s | | the patch passed |
| +1 :green_heart: | compile | 0m 26s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 0m 26s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 18s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 33s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 22s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 20s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 4s | | the patch passed |
| +1 :green_heart: | shadedclient | 22m 10s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 1m 58s | | hadoop-azure in the patch passed. |
| +1 :green_heart: | asflicense | 0m 28s | | The patch does not generate ASF License warnings. |
| | | 132m 3s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3442/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3442 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux eea89016fee6 4.15.0-147-generic #151-Ubuntu SMP Fri Jun 18 19:21:19 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 42132967798fdc82b6609796da78220432381eb2 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3442/1/testReport/ |
| Max. process+thread count | 560 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3442/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[jira] [Updated] (HADOOP-17950) Provide replacement for deprecated APIs of commons-io IOUtils
[ https://issues.apache.org/jira/browse/HADOOP-17950?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Akira Ajisaka updated HADOOP-17950:
-----------------------------------
    Fix Version/s: 3.3.2
                   3.4.0
       Resolution: Fixed
           Status: Resolved  (was: Patch Available)

Committed to trunk and branch-3.3.

> Provide replacement for deprecated APIs of commons-io IOUtils
> -------------------------------------------------------------
>
>                 Key: HADOOP-17950
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17950
>             Project: Hadoop Common
>          Issue Type: Task
>            Reporter: Viraj Jasani
>            Assignee: Viraj Jasani
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 3.4.0, 3.3.2
>
>          Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> Replace deprecated API usage of commons-io IOUtils after we have upgraded
> commons-io as part of HADOOP-17683.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
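The replacement described in HADOOP-17950 is largely mechanical: commons-io IOUtils overloads that rely on the platform default charset (deprecated since commons-io 2.5) are swapped for overloads that take an explicit `Charset`. A hedged JDK-only sketch of the same idiom — `readUtf8` is an illustrative helper, not a Hadoop method, and IOUtils itself is not needed to show the pattern:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

public class ReplaceDeprecatedIoUtils {
    // Before (deprecated in commons-io): IOUtils.toString(in) — decodes with
    // the platform default charset, so results can differ across JVMs.
    // After: pass the charset explicitly, e.g. IOUtils.toString(in, UTF_8).
    // The JDK-only equivalent of the charset-explicit form:
    static String readUtf8(InputStream in) {
        try {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        InputStream in = new ByteArrayInputStream(
                "café".getBytes(StandardCharsets.UTF_8));
        System.out.println(readUtf8(in)); // decodes correctly on any JVM
    }
}
```

The same reasoning applies to the other deprecated charset-less overloads (`toByteArray`, `write`, and friends): the fix is always to name the charset rather than inherit it from the JVM.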
[GitHub] [hadoop] hadoop-yetus commented on pull request #3527: HDFS-16262 Async refresh of cached locations in DFSInputStream
hadoop-yetus commented on pull request #3527:
URL: https://github.com/apache/hadoop/pull/3527#issuecomment-937438221

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 43s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 1s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 2 new or modified test files. |
| _ trunk Compile Tests _ | | | | |
| +0 :ok: | mvndep | 12m 51s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 57s | | trunk passed |
| +1 :green_heart: | compile | 4m 56s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 4m 38s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 1m 15s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 22s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 40s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 2m 6s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 5m 30s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 3s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| +0 :ok: | mvndep | 0m 27s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 0s | | the patch passed |
| +1 :green_heart: | compile | 5m 2s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 5m 2s | | the patch passed |
| +1 :green_heart: | compile | 4m 37s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 4m 37s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 1m 9s | [/results-checkstyle-hadoop-hdfs-project.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3527/1/artifact/out/results-checkstyle-hadoop-hdfs-project.txt) | hadoop-hdfs-project: The patch generated 15 new + 105 unchanged - 0 fixed = 120 total (was 105) |
| +1 :green_heart: | mvnsite | 2m 4s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 23s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 1m 53s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 5m 39s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 59s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 2m 22s | | hadoop-hdfs-client in the patch passed. |
| -1 :x: | unit | 231m 45s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3527/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 46s | | The patch does not generate ASF License warnings. |
| | | 356m 40s | | |

| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.tools.TestHdfsConfigFields |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3527/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3527 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 3b643776302b 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / a0c230195c21d66b648768406ebd1042a9ed3558 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3527/1/testReport/ |
| Max. process+thread count | 3667
[GitHub] [hadoop] hadoop-yetus commented on pull request #1997: YARN-10250 - find: File system loop detected - list debug force true
hadoop-yetus commented on pull request #1997:
URL: https://github.com/apache/hadoop/pull/1997#issuecomment-937433118

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 48s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 32m 10s | | trunk passed |
| +1 :green_heart: | compile | 1m 30s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 1m 26s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 0m 34s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 48s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 44s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 36s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 21s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 59s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 0m 39s | | the patch passed |
| +1 :green_heart: | compile | 1m 26s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 1m 26s | | the patch passed |
| +1 :green_heart: | compile | 1m 22s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 1m 22s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 24s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 39s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 31s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 32s | | the patch passed |
| +1 :green_heart: | shadedclient | 21m 1s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 23m 8s | | hadoop-yarn-server-nodemanager in the patch passed. |
| +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. |
| | | 113m 0s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1997/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/1997 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 2239fadec39f 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 1dadb816117d77e19c3aada9159ef8077ac276f6 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1997/1/testReport/ |
| Max. process+thread count | 718 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-1997/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[jira] [Work logged] (HADOOP-17559) S3Guard import can OOM on large imports
[ https://issues.apache.org/jira/browse/HADOOP-17559?focusedWorklogId=661302&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661302 ]

ASF GitHub Bot logged work on HADOOP-17559:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 07/Oct/21 04:18
            Start Date: 07/Oct/21 04:18
    Worklog Time Spent: 10m
      Work Description: hadoop-yetus commented on pull request #2734:
URL: https://github.com/apache/hadoop/pull/2734#issuecomment-937431834

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 6s | | Docker mode activated. |
| _ Prechecks _ | | | | |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
| _ trunk Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 36m 2s | | trunk passed |
| +1 :green_heart: | compile | 0m 45s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 0m 36s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 0m 27s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 43s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 22s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 31s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 12s | | trunk passed |
| +1 :green_heart: | shadedclient | 22m 46s | | branch has no errors when building and testing our client artifacts. |
| _ Patch Compile Tests _ | | | | |
| +1 :green_heart: | mvninstall | 0m 35s | | the patch passed |
| +1 :green_heart: | compile | 0m 38s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 0m 38s | | the patch passed |
| +1 :green_heart: | compile | 0m 30s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 0m 30s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 0m 19s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 2 new + 3 unchanged - 0 fixed = 5 total (was 3) |
| +1 :green_heart: | mvnsite | 0m 37s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 26s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 29s | | the patch passed |
| +1 :green_heart: | shadedclient | 23m 36s | | patch has no errors when building and testing our client artifacts. |
| _ Other Tests _ | | | | |
| +1 :green_heart: | unit | 3m 6s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. |
| | | 97m 0s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2734 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 2286e6527b7f 4.15.0-143-generic #147-Ubuntu SMP Wed Apr 14 16:10:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / f697764225646391520cb452124a609f77ef2c7f |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/testReport/ |
| Max. process+thread count | 607 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #2734: HADOOP-17559. S3guard import OOM.
hadoop-yetus commented on pull request #2734: URL: https://github.com/apache/hadoop/pull/2734#issuecomment-937431834 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 6s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 36m 2s | | trunk passed | | +1 :green_heart: | compile | 0m 45s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 0m 36s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | checkstyle | 0m 27s | | trunk passed | | +1 :green_heart: | mvnsite | 0m 43s | | trunk passed | | +1 :green_heart: | javadoc | 0m 22s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 0m 31s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 1m 12s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 46s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 35s | | the patch passed | | +1 :green_heart: | compile | 0m 38s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javac | 0m 38s | | the patch passed | | +1 :green_heart: | compile | 0m 30s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | javac | 0m 30s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | -0 :warning: | checkstyle | 0m 19s | [/results-checkstyle-hadoop-tools_hadoop-aws.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/artifact/out/results-checkstyle-hadoop-tools_hadoop-aws.txt) | hadoop-tools/hadoop-aws: The patch generated 2 new + 3 unchanged - 0 fixed = 5 total (was 3) | | +1 :green_heart: | mvnsite | 0m 37s | | the patch passed | | +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 0m 26s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 1m 29s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 36s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 6s | | hadoop-aws in the patch passed. | | +1 :green_heart: | asflicense | 0m 42s | | The patch does not generate ASF License warnings. | | | | 97m 0s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2734 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux 2286e6527b7f 4.15.0-143-generic #147-Ubuntu SMP Wed Apr 14 16:10:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / f697764225646391520cb452124a609f77ef2c7f | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/testReport/ | | Max. 
process+thread count | 607 (vs. ulimit of 5500) | | modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2734/1/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #881: YARN-2774. support secure clusters in shared cache manager
hadoop-yetus commented on pull request #881: URL: https://github.com/apache/hadoop/pull/881#issuecomment-937428176 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 9s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +0 :ok: | markdownlint | 0m 0s | | markdownlint was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 12m 37s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 23m 30s | | trunk passed | | +1 :green_heart: | compile | 23m 27s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 19m 43s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | checkstyle | 3m 50s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 41s | | trunk passed | | +1 :green_heart: | javadoc | 3m 54s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 4m 20s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 8m 28s | | trunk passed | | +1 :green_heart: | shadedclient | 22m 38s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 22s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 3m 1s | | the patch passed | | +1 :green_heart: | compile | 22m 26s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javac | 22m 26s | | the patch passed | | +1 :green_heart: | compile | 19m 46s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | javac | 19m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 3m 49s | | the patch passed | | +1 :green_heart: | mvnsite | 4m 40s | | the patch passed | | +1 :green_heart: | xml | 0m 1s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 3m 54s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 4m 18s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 9m 30s | | the patch passed | | +1 :green_heart: | shadedclient | 23m 27s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 17m 55s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 1m 9s | | hadoop-yarn-api in the patch passed. | | +1 :green_heart: | unit | 5m 8s | | hadoop-yarn-common in the patch passed. | | +1 :green_heart: | unit | 3m 12s | | hadoop-yarn-server-common in the patch passed. | | +1 :green_heart: | unit | 0m 51s | | hadoop-yarn-server-sharedcachemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 53s | | The patch does not generate ASF License warnings. 
| | | | 256m 59s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-881/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/881 | | Optional Tests | dupname asflicense mvnsite codespell markdownlint compile javac javadoc mvninstall unit shadedclient spotbugs checkstyle xml | | uname | Linux 62bb401fda4f 4.15.0-143-generic #147-Ubuntu SMP Wed Apr 14 16:10:11 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / 8b67a904d429eaf6c7f72a31d6e8f56cc7de961f | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-881/2/testReport/ | | Max. process+thread count | 1385 (vs. ulimit of 5500) | | modules | C:
[GitHub] [hadoop] hadoop-yetus commented on pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
hadoop-yetus commented on pull request #3524: URL: https://github.com/apache/hadoop/pull/3524#issuecomment-937425783 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Comment | |::|--:|:|:| | +0 :ok: | reexec | 9m 30s | Docker mode activated. | ||| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | ||| _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 15m 25s | branch-2.10 passed | | +1 :green_heart: | compile | 0m 38s | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 0m 29s | branch-2.10 passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | checkstyle | 0m 20s | branch-2.10 passed | | +1 :green_heart: | mvnsite | 0m 37s | branch-2.10 passed | | +1 :green_heart: | javadoc | 0m 50s | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 33s | branch-2.10 passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +0 :ok: | spotbugs | 3m 36s | Both FindBugs and SpotBugs are enabled, using SpotBugs. 
| | +1 :green_heart: | spotbugs | 1m 9s | branch-2.10 passed | ||| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 29s | the patch passed | | +1 :green_heart: | compile | 0m 29s | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 0m 29s | the patch passed | | +1 :green_heart: | compile | 0m 25s | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | javac | 0m 25s | the patch passed | | +1 :green_heart: | checkstyle | 0m 14s | the patch passed | | +1 :green_heart: | mvnsite | 0m 30s | the patch passed | | +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. | | +1 :green_heart: | javadoc | 0m 41s | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 30s | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | spotbugs | 1m 10s | the patch passed | ||| _ Other Tests _ | | +1 :green_heart: | unit | 17m 10s | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 0m 28s | The patch does not generate ASF License warnings. 
| | | | 55m 42s | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/4/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3524 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle | | uname | Linux bca894c0206c 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / dc03afc | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/4/testReport/ | | Max. process+thread count | 1228 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/4/console | | versions | git=2.7.4 maven=3.3.9 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #438: YARN-9009: Fix flaky test TestEntityGroupFSTimelineStore.testCleanLogs
hadoop-yetus commented on pull request #438: URL: https://github.com/apache/hadoop/pull/438#issuecomment-937418928 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 0s | | Docker mode activated. | | -1 :x: | patch | 0m 20s | | https://github.com/apache/hadoop/pull/438 does not apply to trunk. Rebase required? Wrong Branch? See https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute for help. | | Subsystem | Report/Notes | |--:|:-| | GITHUB PR | https://github.com/apache/hadoop/pull/438 | | JIRA Issue | YARN-9009 | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-438/1/console | | versions | git=2.17.1 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
hadoop-yetus commented on pull request #3524: URL: https://github.com/apache/hadoop/pull/3524#issuecomment-937412397 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Comment | |::|--:|:|:| | +0 :ok: | reexec | 0m 39s | Docker mode activated. | ||| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | ||| _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 15m 16s | branch-2.10 passed | | +1 :green_heart: | compile | 0m 33s | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 0m 27s | branch-2.10 passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | checkstyle | 0m 20s | branch-2.10 passed | | +1 :green_heart: | mvnsite | 0m 39s | branch-2.10 passed | | +1 :green_heart: | javadoc | 0m 49s | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 33s | branch-2.10 passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +0 :ok: | spotbugs | 3m 41s | Both FindBugs and SpotBugs are enabled, using SpotBugs. 
| | +1 :green_heart: | spotbugs | 1m 12s | branch-2.10 passed | ||| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 31s | the patch passed | | +1 :green_heart: | compile | 0m 31s | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 0m 31s | the patch passed | | +1 :green_heart: | compile | 0m 24s | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | javac | 0m 24s | the patch passed | | +1 :green_heart: | checkstyle | 0m 15s | the patch passed | | +1 :green_heart: | mvnsite | 0m 28s | the patch passed | | +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. | | +1 :green_heart: | javadoc | 0m 44s | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 31s | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | spotbugs | 1m 11s | the patch passed | ||| _ Other Tests _ | | +1 :green_heart: | unit | 17m 15s | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 0m 26s | The patch does not generate ASF License warnings. 
| | | | 47m 4s | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3524 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle | | uname | Linux 3d82906a0f67 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / dc03afc | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/3/testReport/ | | Max. process+thread count | 1445 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/3/console | | versions | git=2.7.4 maven=3.3.9 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus commented on pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
hadoop-yetus commented on pull request #3524: URL: https://github.com/apache/hadoop/pull/3524#issuecomment-937409811 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Comment | |::|--:|:|:| | +0 :ok: | reexec | 9m 33s | Docker mode activated. | ||| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | ||| _ branch-2.10 Compile Tests _ | | +1 :green_heart: | mvninstall | 15m 11s | branch-2.10 passed | | +1 :green_heart: | compile | 0m 35s | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | compile | 0m 28s | branch-2.10 passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | checkstyle | 0m 21s | branch-2.10 passed | | +1 :green_heart: | mvnsite | 0m 37s | branch-2.10 passed | | +1 :green_heart: | javadoc | 0m 49s | branch-2.10 passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 34s | branch-2.10 passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +0 :ok: | spotbugs | 3m 37s | Both FindBugs and SpotBugs are enabled, using SpotBugs. 
| | +1 :green_heart: | spotbugs | 1m 7s | branch-2.10 passed | ||| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 28s | the patch passed | | +1 :green_heart: | compile | 0m 31s | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javac | 0m 31s | the patch passed | | +1 :green_heart: | compile | 0m 24s | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | javac | 0m 24s | the patch passed | | +1 :green_heart: | checkstyle | 0m 14s | the patch passed | | +1 :green_heart: | mvnsite | 0m 28s | the patch passed | | +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. | | +1 :green_heart: | javadoc | 0m 41s | the patch passed with JDK Azul Systems, Inc.-1.7.0_262-b10 | | +1 :green_heart: | javadoc | 0m 29s | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | +1 :green_heart: | spotbugs | 1m 11s | the patch passed | ||| _ Other Tests _ | | +1 :green_heart: | unit | 17m 10s | hadoop-hdfs-rbf in the patch passed. | | +1 :green_heart: | asflicense | 0m 50s | The patch does not generate ASF License warnings. 
| | | | 56m 0s | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/2/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3524 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle | | uname | Linux 356e4376ce9f 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | branch-2.10 / dc03afc | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | Multi-JDK versions | /usr/lib/jvm/zulu-7-amd64:Azul Systems, Inc.-1.7.0_262-b10 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~16.04.1-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/2/testReport/ | | Max. process+thread count | 1092 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-rbf U: hadoop-hdfs-project/hadoop-hdfs-rbf | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3524/2/console | | versions | git=2.7.4 maven=3.3.9 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated.
[jira] [Updated] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Takanobu Asanuma updated HADOOP-17952: -- Fix Version/s: 3.4.0 Resolution: Fixed Status: Resolved (was: Patch Available) > Replace Guava VisibleForTesting by Hadoop's own annotation in > hadoop-common-project modules > --- > > Key: HADOOP-17952 > URL: https://issues.apache.org/jira/browse/HADOOP-17952 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 4h 50m > Remaining Estimate: 0h > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=661284&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661284 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 07/Oct/21 02:43 Start Date: 07/Oct/21 02:43 Worklog Time Spent: 10m Work Description: tasanuma commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-937398236 I'm still not sure if VisibleForTesting should be replaced in the lower branches. If we do it, I think it's ok to do the replacements in all modules at once for the lower branches, instead of proceeding with the replacements in each individual module. Issue Time Tracking --- Worklog Id: (was: 661284) Time Spent: 4h 50m (was: 4h 40m)
[GitHub] [hadoop] tasanuma commented on pull request #3503: HADOOP-17952. Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
tasanuma commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-937398236 I'm still not sure if VisibleForTesting should be replaced in the lower branches. If we do it, I think it's ok to do the replacements in all modules at once for the lower branches, instead of proceeding with the replacements in each individual module.
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=661282&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661282 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 07/Oct/21 02:24 Start Date: 07/Oct/21 02:24 Worklog Time Spent: 10m Work Description: tasanuma commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-937392105 Merged it. Thanks for your contribution, @virajjasani. Thanks for your review, @amahussein. Issue Time Tracking --- Worklog Id: (was: 661282) Time Spent: 4h 40m (was: 4.5h)
[GitHub] [hadoop] tasanuma commented on pull request #3503: HADOOP-17952. Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
tasanuma commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-937392105 Merged it. Thanks for your contribution, @virajjasani. Thanks for your review, @amahussein.
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=661281&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661281 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 07/Oct/21 02:23 Start Date: 07/Oct/21 02:23 Worklog Time Spent: 10m Work Description: tasanuma merged pull request #3503: URL: https://github.com/apache/hadoop/pull/3503 Issue Time Tracking --- Worklog Id: (was: 661281) Time Spent: 4.5h (was: 4h 20m)
[GitHub] [hadoop] tasanuma merged pull request #3503: HADOOP-17952. Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
tasanuma merged pull request #3503: URL: https://github.com/apache/hadoop/pull/3503
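The replacement merged above is mechanical at call sites: only the import changes, from Guava's annotation to Hadoop's own. As a hedged sketch (the annotation name below is a local stand-in; Hadoop's real annotation lives in its own annotations module and its retention policy may differ from the RUNTIME retention used here so the demo can observe it):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// A project-local stand-in for Guava's @VisibleForTesting.
// It carries no behavior; it only documents that a member is
// non-private solely so that tests can reach it.
@Retention(RetentionPolicy.RUNTIME) // RUNTIME only so the demo below can see it via reflection
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD, ElementType.CONSTRUCTOR})
@interface VisibleForTesting {
}

public class Demo {
  // Call sites keep this exact usage; only the import line changes:
  //   - import com.google.common.annotations.VisibleForTesting;
  //   + import <hadoop's own annotation package>.VisibleForTesting;
  @VisibleForTesting
  static int internalCounter() {
    return 42;
  }

  public static void main(String[] args) throws Exception {
    boolean annotated = Demo.class
        .getDeclaredMethod("internalCounter")
        .isAnnotationPresent(VisibleForTesting.class);
    System.out.println(annotated); // prints: true
  }
}
```

Because the annotation is purely documentary, swapping it drops the Guava dependency from annotated classes without any behavioral change.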
[GitHub] [hadoop] symious commented on a change in pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
symious commented on a change in pull request #3524: URL: https://github.com/apache/hadoop/pull/3524#discussion_r72379 ## File path: hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/resolver/MountTableResolver.java ## @@ -138,6 +138,8 @@ public MountTableResolver(Configuration conf, Router routerService, FEDERATION_MOUNT_TABLE_MAX_CACHE_SIZE, FEDERATION_MOUNT_TABLE_MAX_CACHE_SIZE_DEFAULT); this.locationCache = CacheBuilder.newBuilder() + // To workaround guava bug https://github.com/google/guava/issues/1055 Review comment: Updated, please help to check.
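For readers without the full diff: the patch's idea is to presize the bounded location cache (via Guava's `CacheBuilder.initialCapacity(...)`) instead of letting the backing table start small. A rough JDK-only analogue of that pattern, presizing a size-bounded LRU map to its maximum up front (illustrative only; the actual HDFS-16257 change is a one-line `CacheBuilder` tweak, and the cited Guava issue concerns `CacheBuilder` internals this sketch does not model):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative size-bounded LRU cache that is presized to its maximum
// capacity up front, so the backing hash table never needs to resize.
public class PresizedLruCache<K, V> extends LinkedHashMap<K, V> {
  private final int maxEntries;

  public PresizedLruCache(int maxEntries) {
    // Presize for maxEntries at load factor 0.75, and use
    // access-order iteration so the eldest entry is the LRU one.
    super((int) Math.ceil(maxEntries / 0.75) + 1, 0.75f, true);
    this.maxEntries = maxEntries;
  }

  @Override
  protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
    return size() > maxEntries; // evict once the bound is exceeded
  }

  public static void main(String[] args) {
    PresizedLruCache<String, Integer> cache = new PresizedLruCache<>(2);
    cache.put("a", 1);
    cache.put("b", 2);
    cache.get("a");     // touch "a" so "b" becomes the eldest entry
    cache.put("c", 3);  // exceeds the bound of 2, evicting "b"
    System.out.println(cache.keySet()); // prints: [a, c]
  }
}
```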
[jira] [Updated] (HADOOP-17929) implement non-guava Precondition checkArgument
[ https://issues.apache.org/jira/browse/HADOOP-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Takanobu Asanuma updated HADOOP-17929: -- Fix Version/s: 3.2.4 > implement non-guava Precondition checkArgument > -- > > Key: HADOOP-17929 > URL: https://issues.apache.org/jira/browse/HADOOP-17929 > Project: Hadoop Common > Issue Type: Sub-task >Affects Versions: 3.4.0, 3.2.3, 3.3.2 >Reporter: Ahmed Hussein >Assignee: Ahmed Hussein >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2, 3.2.4 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > In order to replace Guava Preconditions, we need to implement our own > versions of the API. > This Jira is to add the implementation {{checkArgument}} to the existing > class {{org.apache.hadoop.util.Preconditions}} > +The plan is as follows+ > * implement {{org.apache.hadoop.util.Preconditions.checkArgument}} with the > minimum set of interface used in the current hadoop repo. > * we can replace {{guava.Preconditions}} by > {{org.apache.hadoop.util.Preconditions}} once all the interfaces have been > implemented. > * We need the change to be easily backported in 3.x. > A previous jira HADOOP-17126 was created to replace CheckNotNull. > HADOOP-17930 is created to implement checkState. > CC: [~ste...@apache.org], [~vjasani]
[jira] [Commented] (HADOOP-17929) implement non-guava Precondition checkArgument
[ https://issues.apache.org/jira/browse/HADOOP-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17425317#comment-17425317 ] Takanobu Asanuma commented on HADOOP-17929: --- Cherry-picked to branch-3.2. > implement non-guava Precondition checkArgument > -- > > Key: HADOOP-17929 > URL: https://issues.apache.org/jira/browse/HADOOP-17929 > Project: Hadoop Common > Issue Type: Sub-task >Affects Versions: 3.4.0, 3.2.3, 3.3.2 >Reporter: Ahmed Hussein >Assignee: Ahmed Hussein >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > In order to replace Guava Preconditions, we need to implement our own > versions of the API. > This Jira is to add the implementation {{checkArgument}} to the existing > class {{org.apache.hadoop.util.Preconditions}} > +The plan is as follows+ > * implement {{org.apache.hadoop.util.Preconditions.checkArgument}} with the > minimum set of interface used in the current hadoop repo. > * we can replace {{guava.Preconditions}} by > {{org.apache.hadoop.util.Preconditions}} once all the interfaces have been > implemented. > * We need the change to be easily backported in 3.x. > A previous jira HADOOP-17126 was created to replace CheckNotNull. > HADOOP-17930 is created to implement checkState. > CC: [~ste...@apache.org], [~vjasani]
[jira] [Resolved] (HADOOP-17930) implement non-guava Precondition checkState
[ https://issues.apache.org/jira/browse/HADOOP-17930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Takanobu Asanuma resolved HADOOP-17930. --- Fix Version/s: 3.2.4 3.3.2 3.4.0 Resolution: Fixed > implement non-guava Precondition checkState > --- > > Key: HADOOP-17930 > URL: https://issues.apache.org/jira/browse/HADOOP-17930 > Project: Hadoop Common > Issue Type: Sub-task >Affects Versions: 3.4.0, 3.2.3, 3.3.2 >Reporter: Ahmed Hussein >Assignee: Ahmed Hussein >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2, 3.2.4 > > Time Spent: 1h > Remaining Estimate: 0h > > In order to replace Guava Preconditions, we need to implement our own > versions of the API. > This Jira adds the implementation of {{checkState}} to the existing class > {{org.apache.hadoop.util.Preconditions}}. > +The plan is as follows+ > * implement {{org.apache.hadoop.util.Preconditions.checkState}} with the > minimum set of interfaces used in the current hadoop repo. > * we can replace {{guava.Preconditions}} with > {{org.apache.hadoop.util.Preconditions}} once all the interfaces have been > implemented (both this jira and HADOOP-17929 are complete). > * the change must be easy to backport to 3.x. > previous jiras: > * HADOOP-17126 was created to implement checkNotNull. > * HADOOP-17929 implements checkArgument. > CC: [~ste...@apache.org], [~vjasani] -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
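HADOOP-17930 is the companion to checkArgument: same lazy-formatting shape, but a failed state check signals IllegalStateException rather than IllegalArgumentException. A rough sketch under that assumption, with illustrative names rather than the actual Hadoop class:

```java
// Hypothetical sketch of a Guava-free checkState (HADOOP-17930).
// Not the real org.apache.hadoop.util.Preconditions code.
final class CheckStateSketch {
    private CheckStateSketch() {}

    // checkState guards the receiver's own state (e.g. "stream not closed"),
    // so a failure is IllegalStateException, not IllegalArgumentException.
    static void checkState(boolean expression, String template, Object... args) {
        if (!expression) {
            throw new IllegalStateException(String.format(template, args));
        }
    }
}
```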
[jira] [Work logged] (HADOOP-17950) Provide replacement for deprecated APIs of commons-io IOUtils
[ https://issues.apache.org/jira/browse/HADOOP-17950?focusedWorklogId=661277=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661277 ] ASF GitHub Bot logged work on HADOOP-17950: --- Author: ASF GitHub Bot Created on: 07/Oct/21 01:58 Start Date: 07/Oct/21 01:58 Worklog Time Spent: 10m Work Description: aajisaka merged pull request #3515: URL: https://github.com/apache/hadoop/pull/3515 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 661277) Time Spent: 1h 40m (was: 1.5h) > Provide replacement for deprecated APIs of commons-io IOUtils > - > > Key: HADOOP-17950 > URL: https://issues.apache.org/jira/browse/HADOOP-17950 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 1h 40m > Remaining Estimate: 0h > > Replace deprecated API usage of commons-io IOUtils after we have upgraded > commons-io as part of HADOOP-17683. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] aajisaka merged pull request #3515: HADOOP-17950. Provide replacement for deprecated APIs of commons-io IOUtils
aajisaka merged pull request #3515: URL: https://github.com/apache/hadoop/pull/3515 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17950) Provide replacement for deprecated APIs of commons-io IOUtils
[ https://issues.apache.org/jira/browse/HADOOP-17950?focusedWorklogId=661275=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661275 ] ASF GitHub Bot logged work on HADOOP-17950: --- Author: ASF GitHub Bot Created on: 07/Oct/21 01:57 Start Date: 07/Oct/21 01:57 Worklog Time Spent: 10m Work Description: aajisaka commented on pull request #3515: URL: https://github.com/apache/hadoop/pull/3515#issuecomment-937381828 > For trunk, shall we replace all default encoding usages (in addition to what we have in this PR) and replace all with UTF-8? Yes. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 661275) Time Spent: 1.5h (was: 1h 20m) > Provide replacement for deprecated APIs of commons-io IOUtils > - > > Key: HADOOP-17950 > URL: https://issues.apache.org/jira/browse/HADOOP-17950 > Project: Hadoop Common > Issue Type: Task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 1.5h > Remaining Estimate: 0h > > Replace deprecated API usage of commons-io IOUtils after we have upgraded > commons-io as part of HADOOP-17683. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] aajisaka commented on pull request #3515: HADOOP-17950. Provide replacement for deprecated APIs of commons-io IOUtils
aajisaka commented on pull request #3515: URL: https://github.com/apache/hadoop/pull/3515#issuecomment-937381828 > For trunk, shall we replace all default encoding usages (in addition to what we have in this PR) and replace all with UTF-8? Yes. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
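The review exchange above is about removing platform-default-encoding usage: decoding bytes with the JVM default charset is environment-dependent, which is exactly why the no-charset commons-io IOUtils overloads were deprecated in favor of charset-taking ones. A plain-JDK illustration of the replacement pattern (the helper name is made up; real code would pass StandardCharsets.UTF_8 to the commons-io overloads instead):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Reading a stream as text with an explicit UTF-8 charset, instead of
// relying on the platform default. This is the idea behind migrating off
// the deprecated no-charset IOUtils.toString(InputStream) overload.
final class Utf8ReadSketch {
    private Utf8ReadSketch() {}

    // Deterministic: decodes as UTF-8 regardless of the JVM's locale/charset.
    static String readUtf8(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        return new String(buf.toByteArray(), StandardCharsets.UTF_8);
    }
}
```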
[jira] [Work logged] (HADOOP-17930) implement non-guava Precondition checkState
[ https://issues.apache.org/jira/browse/HADOOP-17930?focusedWorklogId=661274=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661274 ] ASF GitHub Bot logged work on HADOOP-17930: --- Author: ASF GitHub Bot Created on: 07/Oct/21 01:55 Start Date: 07/Oct/21 01:55 Worklog Time Spent: 10m Work Description: tasanuma commented on pull request #3522: URL: https://github.com/apache/hadoop/pull/3522#issuecomment-937381082 Merged it. Thanks for your contribution, @amahussein. Thanks for your review, @virajjasani. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 661274) Time Spent: 1h (was: 50m) > implement non-guava Precondition checkState > --- -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17930) implement non-guava Precondition checkState
[ https://issues.apache.org/jira/browse/HADOOP-17930?focusedWorklogId=661273=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661273 ] ASF GitHub Bot logged work on HADOOP-17930: --- Author: ASF GitHub Bot Created on: 07/Oct/21 01:55 Start Date: 07/Oct/21 01:55 Worklog Time Spent: 10m Work Description: tasanuma merged pull request #3522: URL: https://github.com/apache/hadoop/pull/3522 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 661273) Time Spent: 50m (was: 40m) > implement non-guava Precondition checkState > --- -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tasanuma commented on pull request #3522: HADOOP-17930. implement non-guava Precondition checkState
tasanuma commented on pull request #3522: URL: https://github.com/apache/hadoop/pull/3522#issuecomment-937381082 Merged it. Thanks for your contribution, @amahussein. Thanks for your review, @virajjasani. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] tasanuma merged pull request #3522: HADOOP-17930. implement non-guava Precondition checkState
tasanuma merged pull request #3522: URL: https://github.com/apache/hadoop/pull/3522 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17922) Lookup old S3 encryption configs for JCEKS
[ https://issues.apache.org/jira/browse/HADOOP-17922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425306#comment-17425306 ] Dongjoon Hyun commented on HADOOP-17922: This seems to land at branch-3.3. Could you update the Fix Version, please? > Lookup old S3 encryption configs for JCEKS > -- > > Key: HADOOP-17922 > URL: https://issues.apache.org/jira/browse/HADOOP-17922 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 8h > Remaining Estimate: 0h > > HADOOP-17871 introduces new set of S3 encryption configs which are replaced > by old property names during look-up. We need to look-up for both the > properties since either could be set in a JCEKS file. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
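The dual-lookup behavior described above (HADOOP-17871 renamed the S3 encryption properties, but either name may still be stored in a JCEKS file) boils down to: try the new key, then fall back to the deprecated one. A generic sketch; the key names below are placeholders, not the real fs.s3a.* encryption constants, and the real code reads from a Hadoop Configuration/credential provider rather than a Map.

```java
import java.util.Map;

// Prefer the new property name; fall back to its deprecated predecessor.
// Key names here are illustrative only.
final class DualKeyLookupSketch {
    private DualKeyLookupSketch() {}

    static String lookup(Map<String, String> conf, String newKey, String oldKey) {
        String v = conf.get(newKey);
        // Old deployments may only have the pre-rename key set, e.g. in a
        // JCEKS credential file that was never migrated.
        return (v != null) ? v : conf.get(oldKey);
    }
}
```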
[jira] [Commented] (HADOOP-17817) HADOOP-17817. S3A to raise IOE if both S3-CSE and S3Guard enabled
[ https://issues.apache.org/jira/browse/HADOOP-17817?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425304#comment-17425304 ] Dongjoon Hyun commented on HADOOP-17817: This landed at branch-3.3. Could you update the Fix Version, please? > HADOOP-17817. S3A to raise IOE if both S3-CSE and S3Guard enabled > - > > Key: HADOOP-17817 > URL: https://issues.apache.org/jira/browse/HADOOP-17817 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 2.5h > Remaining Estimate: 0h > > Throw an exception if S3Guard and S3 Client-side encryption are enabled on a > bucket. Follow-up to HADOOP-13887. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
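The fail-fast behavior described in HADOOP-17817, refusing to start when two mutually exclusive features are both enabled, reduces to a small init-time check. A minimal sketch with illustrative parameter names, not the actual S3AFileSystem code:

```java
import java.io.IOException;

// Reject an incompatible option pair at initialization time, in the
// spirit of HADOOP-17817 (S3 client-side encryption plus S3Guard).
final class IncompatibleOptionsSketch {
    private IncompatibleOptionsSketch() {}

    static void checkCompatible(boolean cseEnabled, boolean s3GuardEnabled) throws IOException {
        if (cseEnabled && s3GuardEnabled) {
            // Failing loudly here beats subtle corruption or confusing
            // errors deep inside the read/write path later.
            throw new IOException("S3 client-side encryption cannot be used with S3Guard");
        }
    }
}
```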
[jira] [Commented] (HADOOP-17871) S3A CSE: minor tuning
[ https://issues.apache.org/jira/browse/HADOOP-17871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425305#comment-17425305 ] Dongjoon Hyun commented on HADOOP-17871: This landed at branch-3.3. Do we need to revise the Fix Version? > S3A CSE: minor tuning > - > > Key: HADOOP-17871 > URL: https://issues.apache.org/jira/browse/HADOOP-17871 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Steve Loughran >Assignee: Mehakmeet Singh >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.0 > > Time Spent: 3.5h > Remaining Estimate: 0h > > Some minor tuning to the CSE encryption support before backporting to 3.3.x > and so shipping this year > * LogExactlyOnce an "please ignore the warning" message to a new log > ("org.apache.hadoop.fs.s3a.encryption") which can be set to ERROR if you get > bored of the message. > * Extend testing_s3a.md and the SDK upgrade runbook: test CSE always > * change property name of encryption key (maybe: fs.s3a.encryption) and add > mapping in S3AFileSystem.addDeprecatedKeys ... docs will need updating too. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Comment Edited] (HADOOP-13887) Encrypt S3A data client-side with AWS SDK (S3-CSE)
[ https://issues.apache.org/jira/browse/HADOOP-13887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425303#comment-17425303 ] Dongjoon Hyun edited comment on HADOOP-13887 at 10/7/21, 1:35 AM: -- This seems to land at branch-3.3. Could you update the Fix Version, please? was (Author: dongjoon): This seems to land at branch-3.2. Could you update the Fix Version, please? > Encrypt S3A data client-side with AWS SDK (S3-CSE) > -- > > Key: HADOOP-13887 > URL: https://issues.apache.org/jira/browse/HADOOP-13887 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 2.8.0 >Reporter: Jeeyoung Kim >Assignee: Mehakmeet Singh >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.0 > > Attachments: HADOOP-13887-002.patch, HADOOP-13887-007.patch, > HADOOP-13887-branch-2-003.patch, HADOOP-13897-branch-2-004.patch, > HADOOP-13897-branch-2-005.patch, HADOOP-13897-branch-2-006.patch, > HADOOP-13897-branch-2-008.patch, HADOOP-13897-branch-2-009.patch, > HADOOP-13897-branch-2-010.patch, HADOOP-13897-branch-2-012.patch, > HADOOP-13897-branch-2-014.patch, HADOOP-13897-trunk-011.patch, > HADOOP-13897-trunk-013.patch, HADOOP-14171-001.patch, S3-CSE Proposal.pdf > > Time Spent: 14h > Remaining Estimate: 0h > > Expose the client-side encryption option documented in Amazon S3 > documentation - > http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingClientSideEncryption.html > When backporting, include HADOOP-17817 -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-13887) Encrypt S3A data client-side with AWS SDK (S3-CSE)
[ https://issues.apache.org/jira/browse/HADOOP-13887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425303#comment-17425303 ] Dongjoon Hyun commented on HADOOP-13887: This seems to land at branch-3.2. Could you update the Fix Version, please? > Encrypt S3A data client-side with AWS SDK (S3-CSE) > -- > > Key: HADOOP-13887 > URL: https://issues.apache.org/jira/browse/HADOOP-13887 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 2.8.0 >Reporter: Jeeyoung Kim >Assignee: Mehakmeet Singh >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.0 > > Attachments: HADOOP-13887-002.patch, HADOOP-13887-007.patch, > HADOOP-13887-branch-2-003.patch, HADOOP-13897-branch-2-004.patch, > HADOOP-13897-branch-2-005.patch, HADOOP-13897-branch-2-006.patch, > HADOOP-13897-branch-2-008.patch, HADOOP-13897-branch-2-009.patch, > HADOOP-13897-branch-2-010.patch, HADOOP-13897-branch-2-012.patch, > HADOOP-13897-branch-2-014.patch, HADOOP-13897-trunk-011.patch, > HADOOP-13897-trunk-013.patch, HADOOP-14171-001.patch, S3-CSE Proposal.pdf > > Time Spent: 14h > Remaining Estimate: 0h > > Expose the client-side encryption option documented in Amazon S3 > documentation - > http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingClientSideEncryption.html > When backporting, include HADOOP-17817 -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17934) NullPointerException when no HTTP response set on AbfsRestOperation
[ https://issues.apache.org/jira/browse/HADOOP-17934?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425302#comment-17425302 ] Dongjoon Hyun commented on HADOOP-17934: Hi, [~elserj]. Maybe, `Fix Version` is `3.3.2`? > NullPointerException when no HTTP response set on AbfsRestOperation > --- > > Key: HADOOP-17934 > URL: https://issues.apache.org/jira/browse/HADOOP-17934 > Project: Hadoop Common > Issue Type: Bug > Components: fs/azure >Affects Versions: 3.3.1 >Reporter: Josh Elser >Assignee: Josh Elser >Priority: Major > Labels: pull-request-available > Fix For: 3.3.3 > > Time Spent: 3h 50m > Remaining Estimate: 0h > > Seen when running HBase 2.2 on top of ABFS with Hadoop 3.1ish: > {noformat} > Caused by: java.lang.NullPointerException > at > org.apache.hadoop.fs.azurebfs.services.AbfsClient.renameIdempotencyCheckOp(AbfsClient.java:382) > at > org.apache.hadoop.fs.azurebfs.services.AbfsClient.renamePath(AbfsClient.java:348) > at > org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.rename(AzureBlobFileSystemStore.java:722) > at > org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.rename(AzureBlobFileSystem.java:327) > at > org.apache.hadoop.fs.FilterFileSystem.rename(FilterFileSystem.java:249) > at > org.apache.hadoop.hbase.regionserver.HRegionFileSystem.rename(HRegionFileSystem.java:1115) > {noformat} > Digging in, it looks like the {{AbfsHttpOperation}} inside of > {{AbfsRestOperation}} may sometimes be null, but {{AbfsClient}} will try to > unwrap it (and read the status code from the HTTP call). I'm not sure why we > sometimes _get_ the null HttpOperation but it seems pretty straightforward to > not get the NPE. > HBase got wedged after this, but I'm not sure if it's because of this NPE or > (perhaps) we weren't getting any responses from ABFS itself (i.e. there was > some ABFS outage/unavailability or the node itself couldn't talk to ABFS). 
-- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
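The NPE described in HADOOP-17934 comes from unconditionally dereferencing an HTTP result object that is sometimes null. The straightforward fix pattern is a null guard before reading the status code. A sketch with stand-in types, not the actual AbfsRestOperation/AbfsHttpOperation classes:

```java
// Guard against a missing HTTP response instead of dereferencing it,
// in the spirit of the HADOOP-17934 fix. Types here are stand-ins.
final class SafeStatusSketch {
    private SafeStatusSketch() {}

    static final int NO_STATUS = -1;

    // A null result means no HTTP response was ever recorded (e.g. the
    // request never went out); report a sentinel rather than NPE.
    static int statusOf(HttpResult result) {
        return (result == null) ? NO_STATUS : result.statusCode;
    }

    static final class HttpResult {
        final int statusCode;
        HttpResult(int statusCode) { this.statusCode = statusCode; }
    }
}
```

Callers then treat the sentinel as "retry or surface an IO error" instead of crashing, which also makes the underlying cause (no response at all) visible in logs.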
[jira] [Work logged] (HADOOP-15327) Upgrade MR ShuffleHandler to use Netty4
[ https://issues.apache.org/jira/browse/HADOOP-15327?focusedWorklogId=661258=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661258 ] ASF GitHub Bot logged work on HADOOP-15327: --- Author: ASF GitHub Bot Created on: 07/Oct/21 00:23 Start Date: 07/Oct/21 00:23 Worklog Time Spent: 10m Work Description: jasonwzs commented on pull request #3259: URL: https://github.com/apache/hadoop/pull/3259#issuecomment-937343212 Is there any update on this PR? @szilard-nemeth We are also waiting for the change of using netty4 to replace netty3 due to its vulnerability issues. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 661258) Time Spent: 5h 10m (was: 5h) > Upgrade MR ShuffleHandler to use Netty4 > --- > > Key: HADOOP-15327 > URL: https://issues.apache.org/jira/browse/HADOOP-15327 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Xiaoyu Yao >Assignee: Szilard Nemeth >Priority: Major > Labels: pull-request-available > Attachments: HADOOP-15327.001.patch, HADOOP-15327.002.patch, > HADOOP-15327.003.patch, HADOOP-15327.004.patch, HADOOP-15327.005.patch, > HADOOP-15327.005.patch, > getMapOutputInfo_BlockingOperationException_awaitUninterruptibly.log, > testfailure-testMapFileAccess-emptyresponse.zip, > testfailure-testReduceFromPartialMem.zip > > Time Spent: 5h 10m > Remaining Estimate: 0h > > This way, we can remove the dependencies on the netty3 (jboss.netty) -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] jasonwzs commented on pull request #3259: HADOOP-15327. Upgrade MR ShuffleHandler to use Netty4
jasonwzs commented on pull request #3259: URL: https://github.com/apache/hadoop/pull/3259#issuecomment-937343212 Is there any update on this PR? @szilard-nemeth We are also waiting for the change of using netty4 to replace netty3 due to its vulnerability issues. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-15327) Upgrade MR ShuffleHandler to use Netty4
[ https://issues.apache.org/jira/browse/HADOOP-15327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425285#comment-17425285 ] Jason Wen commented on HADOOP-15327: What's the current status of the PR? Will the fix be included in any upcoming release? > Upgrade MR ShuffleHandler to use Netty4 > --- -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] bbeaudreault opened a new pull request #3527: HDFS-16262 Async refresh of cached locations in DFSInputStream
bbeaudreault opened a new pull request #3527: URL: https://github.com/apache/hadoop/pull/3527 ### Description of PR Refactor refreshing of cached block locations so that it happens as part of an async process, with rate limiting. Add the ability to limit to only refresh DFSInputStreams if necessary. This defaults to false to preserve backwards compatibility with the old behavior from https://issues.apache.org/jira/browse/HDFS-15119 See https://issues.apache.org/jira/browse/HDFS-16262 ### How was this patch tested? I added a new test class TestLocatedBlocksRefresher. I am in the process of deploying this internally on one of our hadoop-3.3 clusters, will report back. ### For code changes: - [x] Does the title or this PR starts with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
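The PR description above combines two ideas: rate limiting the refresh of cached block locations, and an opt-in mode that refreshes only streams flagged as needing it. The deterministic core of that gate can be sketched as a pure predicate; the names below are illustrative, not the HDFS-16262 patch itself.

```java
// Decision core of a rate-limited, optionally lazy cache refresh,
// sketched after the HDFS-16262 description. Names are illustrative.
final class RefreshGateSketch {
    private RefreshGateSketch() {}

    static boolean shouldRefresh(long nowMs, long lastRefreshMs, long intervalMs,
                                 boolean onlyIfNecessary, boolean markedStale) {
        // Opt-in backwards-compatible mode: skip streams nothing has
        // flagged as stale (e.g. no dead-node observation).
        if (onlyIfNecessary && !markedStale) {
            return false;
        }
        // Rate limit: at most one refresh per interval per stream.
        return nowMs - lastRefreshMs >= intervalMs;
    }
}
```

An async refresher thread would evaluate this per open stream on a schedule, so location lookups never block the read path.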
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=661151=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661151 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 19:47 Start Date: 06/Oct/21 19:47 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-937000831 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 13s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 20m 8s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 27m 9s | | trunk passed | | +1 :green_heart: | compile | 29m 48s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 25m 20s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | checkstyle | 1m 40s | | trunk passed | | +1 :green_heart: | mvnsite | 4m 27s | | trunk passed | | +1 :green_heart: | javadoc | 3m 43s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 3m 49s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 7m 16s | | trunk passed | | +1 :green_heart: | shadedclient | 25m 37s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 45s | | the patch passed | | +1 :green_heart: | compile | 28m 55s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javac | 28m 55s | | the patch passed | | +1 :green_heart: | compile | 24m 50s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | javac | 24m 50s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 33s | | hadoop-common-project: The patch generated 0 new + 2501 unchanged - 3 fixed = 2501 total (was 2504) | | +1 :green_heart: | mvnsite | 4m 42s | | the patch passed | | +1 :green_heart: | xml | 0m 9s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 3m 52s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 4m 5s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 7m 50s | | the patch passed | | +1 :green_heart: | shadedclient | 25m 47s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 45s | | hadoop-auth in the patch passed. | | +1 :green_heart: | unit | 17m 11s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 0m 50s | | hadoop-nfs in the patch passed. | | +1 :green_heart: | unit | 3m 36s | | hadoop-kms in the patch passed. | | +1 :green_heart: | unit | 1m 15s | | hadoop-registry in the patch passed. | | +1 :green_heart: | asflicense | 0m 50s | | The patch does not generate ASF License warnings. 
| | | | 286m 54s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/10/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3503 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml spotbugs checkstyle | | uname | Linux 04a44df73f32 4.15.0-147-generic #151-Ubuntu SMP Fri Jun 18 19:21:19 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / fafbd50133e9e43569e92a3c9e694dfd29527be2 | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
[GitHub] [hadoop] hadoop-yetus commented on pull request #3503: HADOOP-17952. Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
hadoop-yetus commented on pull request #3503:
URL: https://github.com/apache/hadoop/pull/3503#issuecomment-937000831

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 13s | | Docker mode activated. |
|||| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. |
|||| _ trunk Compile Tests _ ||
| +0 :ok: | mvndep | 20m 8s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 27m 9s | | trunk passed |
| +1 :green_heart: | compile | 29m 48s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 25m 20s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 1m 40s | | trunk passed |
| +1 :green_heart: | mvnsite | 4m 27s | | trunk passed |
| +1 :green_heart: | javadoc | 3m 43s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 3m 49s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 7m 16s | | trunk passed |
| +1 :green_heart: | shadedclient | 25m 37s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ ||
| +0 :ok: | mvndep | 0m 24s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 45s | | the patch passed |
| +1 :green_heart: | compile | 28m 55s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 28m 55s | | the patch passed |
| +1 :green_heart: | compile | 24m 50s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 24m 50s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 33s | | hadoop-common-project: The patch generated 0 new + 2501 unchanged - 3 fixed = 2501 total (was 2504) |
| +1 :green_heart: | mvnsite | 4m 42s | | the patch passed |
| +1 :green_heart: | xml | 0m 9s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | javadoc | 3m 52s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 4m 5s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 7m 50s | | the patch passed |
| +1 :green_heart: | shadedclient | 25m 47s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ ||
| +1 :green_heart: | unit | 3m 45s | | hadoop-auth in the patch passed. |
| +1 :green_heart: | unit | 17m 11s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 0m 50s | | hadoop-nfs in the patch passed. |
| +1 :green_heart: | unit | 3m 36s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 1m 15s | | hadoop-registry in the patch passed. |
| +1 :green_heart: | asflicense | 0m 50s | | The patch does not generate ASF License warnings. |
| | | 286m 54s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/10/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3503 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml spotbugs checkstyle |
| uname | Linux 04a44df73f32 4.15.0-147-generic #151-Ubuntu SMP Fri Jun 18 19:21:19 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / fafbd50133e9e43569e92a3c9e694dfd29527be2 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/10/testReport/ |
| Max. process+thread count | 1239 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-auth |
[GitHub] [hadoop] goiri commented on a change in pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
goiri commented on a change in pull request #3524:
URL: https://github.com/apache/hadoop/pull/3524#discussion_r723600325

## File path: hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/resolver/MountTableResolver.java

## @@ -138,6 +138,8 @@ public MountTableResolver(Configuration conf, Router routerService,
         FEDERATION_MOUNT_TABLE_MAX_CACHE_SIZE, FEDERATION_MOUNT_TABLE_MAX_CACHE_SIZE_DEFAULT);
     this.locationCache = CacheBuilder.newBuilder()
+        // To warkaround guava bug https://github.com/google/guava/issues/1055

Review comment: mention in what version this is fixed

## File path: hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/resolver/MountTableResolver.java

## @@ -138,6 +138,8 @@ public MountTableResolver(Configuration conf, Router routerService,
         FEDERATION_MOUNT_TABLE_MAX_CACHE_SIZE, FEDERATION_MOUNT_TABLE_MAX_CACHE_SIZE_DEFAULT);
     this.locationCache = CacheBuilder.newBuilder()
+        // To warkaround guava bug https://github.com/google/guava/issues/1055

Review comment: work around

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
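The patch under review passes an explicit `initialCapacity` to Guava's `CacheBuilder` so the mount-table location cache does not start from an unsuitable default table size (the Guava performance issue referenced in the diff). The same sizing idea can be illustrated with the JDK alone: the sketch below is a hypothetical, stdlib-only bounded LRU cache (not the Hadoop code) built on `LinkedHashMap`, where the initial capacity is chosen by the caller instead of being left to the default, avoiding either over-allocation or repeated rehashing as the cache fills.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustration only: a size-bounded LRU cache whose backing table starts at an
// explicit initial capacity, analogous to CacheBuilder.initialCapacity(...).
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int initialCapacity, int maxEntries) {
        // accessOrder = true makes iteration (and eviction) follow LRU order.
        super(initialCapacity, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least-recently-used entry once the bound is exceeded.
        return size() > maxEntries;
    }

    public static void main(String[] args) {
        LruCache<String, String> cache = new LruCache<>(16, 2);
        cache.put("a", "1");
        cache.put("b", "2");
        cache.get("a");       // touch "a" so "b" becomes the eldest entry
        cache.put("c", "3");  // exceeds maxEntries, evicts "b"
        System.out.println(cache.containsKey("b")); // false
        System.out.println(cache.containsKey("a")); // true
    }
}
```

Here the two knobs are independent, mirroring the reviewed change: `maxEntries` bounds the cache logically, while `initialCapacity` only controls the up-front allocation of the backing table.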
[GitHub] [hadoop] hadoop-yetus commented on pull request #3526: YARN-6862. Nodemanager resource usage metrics should not show negativ…
hadoop-yetus commented on pull request #3526:
URL: https://github.com/apache/hadoop/pull/3526#issuecomment-936918368

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 56s | | Docker mode activated. |
|||| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 1s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. |
|||| _ trunk Compile Tests _ ||
| +1 :green_heart: | mvninstall | 37m 51s | | trunk passed |
| +1 :green_heart: | compile | 1m 51s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 1m 39s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 0m 39s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 58s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 45s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 38s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 45s | | trunk passed |
| +1 :green_heart: | shadedclient | 21m 2s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ ||
| +1 :green_heart: | mvninstall | 0m 38s | | the patch passed |
| +1 :green_heart: | compile | 1m 22s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 1m 22s | | the patch passed |
| +1 :green_heart: | compile | 1m 17s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 1m 17s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 24s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 37s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 31s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 29s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 25s | | the patch passed |
| +1 :green_heart: | shadedclient | 19m 19s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ ||
| +1 :green_heart: | unit | 23m 1s | | hadoop-yarn-server-nodemanager in the patch passed. |
| +1 :green_heart: | asflicense | 0m 33s | | The patch does not generate ASF License warnings. |
| | | 118m 16s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3526/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3526 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 042872e741d3 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 930d0eb1c456c6445a5597a343eaeed6ce4e7b98 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3526/2/testReport/ |
| Max. process+thread count | 625 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3526/2/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=661131&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661131 ]

ASF GitHub Bot logged work on HADOOP-17952:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 06/Oct/21 18:59
            Start Date: 06/Oct/21 18:59
    Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #3503:
URL: https://github.com/apache/hadoop/pull/3503#issuecomment-936912115

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 48s | | Docker mode activated. |
|||| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 4s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. |
|||| _ trunk Compile Tests _ ||
| +0 :ok: | mvndep | 20m 21s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 56s | | trunk passed |
| +1 :green_heart: | compile | 21m 30s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 18m 38s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 1m 34s | | trunk passed |
| +1 :green_heart: | mvnsite | 4m 35s | | trunk passed |
| +1 :green_heart: | javadoc | 3m 59s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 4m 24s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 6m 9s | | trunk passed |
| +1 :green_heart: | shadedclient | 19m 32s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ ||
| +0 :ok: | mvndep | 0m 28s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 2m 25s | | the patch passed |
| +1 :green_heart: | compile | 20m 37s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 20m 37s | | the patch passed |
| +1 :green_heart: | compile | 18m 39s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 18m 39s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 1m 31s | | hadoop-common-project: The patch generated 0 new + 2499 unchanged - 3 fixed = 2499 total (was 2502) |
| +1 :green_heart: | mvnsite | 4m 33s | | the patch passed |
| +1 :green_heart: | xml | 0m 7s | | The patch has no ill-formed XML file. |
| +1 :green_heart: | javadoc | 3m 52s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 4m 18s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 7m 5s | | the patch passed |
| +1 :green_heart: | shadedclient | 19m 50s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ ||
| +1 :green_heart: | unit | 3m 23s | | hadoop-auth in the patch passed. |
| +1 :green_heart: | unit | 17m 23s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 1m 0s | | hadoop-nfs in the patch passed. |
| +1 :green_heart: | unit | 3m 42s | | hadoop-kms in the patch passed. |
| +1 :green_heart: | unit | 1m 23s | | hadoop-registry in the patch passed. |
| +1 :green_heart: | asflicense | 0m 59s | | The patch does not generate ASF License warnings. |
| | | 238m 25s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/11/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3503 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml spotbugs checkstyle |
| uname | Linux 42739dd6d9d8 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / fafbd50133e9e43569e92a3c9e694dfd29527be2 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
[GitHub] [hadoop] hadoop-yetus commented on pull request #3526: YARN-6862. Nodemanager resource usage metrics should not show negativ…
hadoop-yetus commented on pull request #3526:
URL: https://github.com/apache/hadoop/pull/3526#issuecomment-936896998

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 1m 8s | | Docker mode activated. |
|||| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 1s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. |
|||| _ trunk Compile Tests _ ||
| +1 :green_heart: | mvninstall | 31m 34s | | trunk passed |
| +1 :green_heart: | compile | 1m 29s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 1m 25s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 0m 35s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 48s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 42s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 36s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 25s | | trunk passed |
| +1 :green_heart: | shadedclient | 19m 51s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ ||
| +1 :green_heart: | mvninstall | 0m 37s | | the patch passed |
| +1 :green_heart: | compile | 1m 21s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 1m 21s | | the patch passed |
| +1 :green_heart: | compile | 1m 15s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 1m 15s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 25s | | the patch passed |
| +1 :green_heart: | mvnsite | 0m 37s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 29s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 0m 28s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 1m 27s | | the patch passed |
| +1 :green_heart: | shadedclient | 19m 11s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ ||
| +1 :green_heart: | unit | 23m 13s | | hadoop-yarn-server-nodemanager in the patch passed. |
| +1 :green_heart: | asflicense | 0m 34s | | The patch does not generate ASF License warnings. |
| | | 109m 40s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3526/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3526 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux d2f61928dea3 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 930d0eb1c456c6445a5597a343eaeed6ce4e7b98 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3526/1/testReport/ |
| Max. process+thread count | 724 (vs. ulimit of 5500) |
| modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3526/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |

This message was automatically generated.
[jira] [Commented] (HADOOP-16532) TestViewFsTrash uses home directory of real fs; brittle
[ https://issues.apache.org/jira/browse/HADOOP-16532?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17425137#comment-17425137 ]

Konstantin Shvachko commented on HADOOP-16532:
----------------------------------------------

Well, unit tests are hacks by definition. We mock variables and methods, inject faults, set weird config values, etc., to achieve the desired "faulty" behavior. You can try using {{setWorkingDirectory()}} if this feels less hacky to you.

> TestViewFsTrash uses home directory of real fs; brittle
> -------------------------------------------------------
>
>                 Key: HADOOP-16532
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16532
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs, test
>    Affects Versions: 3.3.0
>            Reporter: Steve Loughran
>            Assignee: Xing Lin
>            Priority: Minor
>              Labels: pull-request-available
>          Time Spent: 50m
>  Remaining Estimate: 0h
>
> the test {{TestViewFsTrash}} uses the .Trash directory under the current user's home dir, so it:
> * fails in test setups which block writing to it (Jenkins)
> * fails when users have real trash in there
> * may fail if there are parallel test runs.
> The home dir should be under some test path of the build.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
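The fix direction the issue asks for, putting the trash directory under a per-run path in the build tree rather than the real home directory, can be sketched with plain `java.nio`. This is an illustrative stand-alone sketch, not the actual Hadoop test code; the `TrashDirSetup` class name and the `target/test-dir` fallback are assumptions, while `test.build.data` is the property Hadoop builds conventionally use for test data.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class TrashDirSetup {
    // Create a .Trash directory under the build's test data directory instead
    // of ${user.home}, so locked-down CI homes, pre-existing real trash, and
    // parallel test runs cannot interfere with the test.
    static Path testTrashDir() throws IOException {
        String base = System.getProperty("test.build.data", "target/test-dir");
        // A unique per-run "home" keeps parallel executions isolated.
        Path fakeHome = Paths.get(base, "home-" + System.nanoTime());
        Path trash = fakeHome.resolve(".Trash");
        Files.createDirectories(trash);
        return trash;
    }

    public static void main(String[] args) throws IOException {
        Path trash = testTrashDir();
        System.out.println(trash.endsWith(".Trash")); // true
    }
}
```

A test would then point the file system's working/home directory at `fakeHome` (e.g. via the `setWorkingDirectory()` approach suggested in the comment) before exercising trash behavior.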
[jira] [Work logged] (HADOOP-17953) S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket configuration for encryption algorithm
[ https://issues.apache.org/jira/browse/HADOOP-17953?focusedWorklogId=661081&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661081 ]

ASF GitHub Bot logged work on HADOOP-17953:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 06/Oct/21 17:11
            Start Date: 06/Oct/21 17:11
    Worklog Time Spent: 10m

Work Description: hadoop-yetus commented on pull request #3525:
URL: https://github.com/apache/hadoop/pull/3525#issuecomment-936714766

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:-------:|:-------:|
| +0 :ok: | reexec | 0m 44s | | Docker mode activated. |
|||| _ Prechecks _ ||
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. |
|||| _ trunk Compile Tests _ ||
| +0 :ok: | mvndep | 27m 0s | | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 21m 7s | | trunk passed |
| +1 :green_heart: | compile | 21m 23s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 18m 38s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 3m 37s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 33s | | trunk passed |
| +1 :green_heart: | javadoc | 1m 50s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 2m 31s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 3m 45s | | trunk passed |
| +1 :green_heart: | shadedclient | 20m 14s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ ||
| +0 :ok: | mvndep | 0m 27s | | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 32s | | the patch passed |
| +1 :green_heart: | compile | 20m 42s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 20m 42s | | the patch passed |
| +1 :green_heart: | compile | 18m 36s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 18m 36s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 3m 35s | | the patch passed |
| +1 :green_heart: | mvnsite | 2m 34s | | the patch passed |
| +1 :green_heart: | javadoc | 1m 48s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 2m 28s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 4m 5s | | the patch passed |
| +1 :green_heart: | shadedclient | 20m 8s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ ||
| +1 :green_heart: | unit | 17m 27s | | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 2m 30s | | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 59s | | The patch does not generate ASF License warnings. |
| | | 223m 48s | | |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3525/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3525 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 7c2ada2fd3d7 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 20980b41a3f923bd251a6f2ac1f8cc5a269b3d85 |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3525/1/testReport/ |
| Max. process+thread count | 1266 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . |
[GitHub] [hadoop] brumi1024 opened a new pull request #3526: YARN-6862. Nodemanager resource usage metrics should not show negative values.
brumi1024 opened a new pull request #3526: URL: https://github.com/apache/hadoop/pull/3526 ### Description of PR ### How was this patch tested? ### For code changes: - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')? - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17371) Bump Jetty to the latest version 9.4.35
[ https://issues.apache.org/jira/browse/HADOOP-17371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425093#comment-17425093 ] Brahma Reddy Battula commented on HADOOP-17371: --- {quote}Can we backport this to branch-3.2? {quote} Sure. [~weichiu] can we cherry-pick to branch-3.2..? > Bump Jetty to the latest version 9.4.35 > --- > > Key: HADOOP-17371 > URL: https://issues.apache.org/jira/browse/HADOOP-17371 > Project: Hadoop Common > Issue Type: Improvement >Affects Versions: 3.3.1, 3.4.0, 3.2.3 >Reporter: Wei-Chiu Chuang >Assignee: Wei-Chiu Chuang >Priority: Major > Labels: pull-request-available > Fix For: 3.3.1, 3.4.0 > > Time Spent: 5h 10m > Remaining Estimate: 0h > > The Hadoop 3 branches are on 9.4.20. We should update to the latest version: > 9.4.34 -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17834) Bump aliyun-sdk-oss to 3.13.0
[ https://issues.apache.org/jira/browse/HADOOP-17834?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425092#comment-17425092 ] Brahma Reddy Battula commented on HADOOP-17834: --- {quote}Can we backport this to branch-3.2? {quote} Sure. [~aajisaka] please let me know your thought on this. > Bump aliyun-sdk-oss to 3.13.0 > - > > Key: HADOOP-17834 > URL: https://issues.apache.org/jira/browse/HADOOP-17834 > Project: Hadoop Common > Issue Type: Task >Reporter: Siyao Meng >Assignee: Siyao Meng >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2 > > Time Spent: 1h > Remaining Estimate: 0h > > Bump aliyun-sdk-oss to 3.13.0 in order to remove transitive dependency on > jdom 1.1. > Ref: > https://issues.apache.org/jira/browse/HADOOP-17820?focusedCommentId=17390206=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17390206. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17236) Bump up snakeyaml to 1.26 to mitigate CVE-2017-18640
[ https://issues.apache.org/jira/browse/HADOOP-17236?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425091#comment-17425091 ] Brahma Reddy Battula commented on HADOOP-17236: --- [~ananysin] Sure, thanks..will create Jira to cherry-pick this. > Bump up snakeyaml to 1.26 to mitigate CVE-2017-18640 > > > Key: HADOOP-17236 > URL: https://issues.apache.org/jira/browse/HADOOP-17236 > Project: Hadoop Common > Issue Type: Bug >Reporter: Brahma Reddy Battula >Assignee: Brahma Reddy Battula >Priority: Major > Fix For: 3.3.1, 3.4.0 > > Attachments: HADOOP-17236-001-tempToRun.patch, HADOOP-17236-001.patch > > > Bump up snakeyaml to 1.26 to mitigate CVE-2017-18640 -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17254) Upgrade hbase to 1.4.13 on branch-2.10
[ https://issues.apache.org/jira/browse/HADOOP-17254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17425083#comment-17425083 ] Brahma Reddy Battula commented on HADOOP-17254: --- Thanks [~iwasakims] > Upgrade hbase to 1.4.13 on branch-2.10 > -- > > Key: HADOOP-17254 > URL: https://issues.apache.org/jira/browse/HADOOP-17254 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Masatake Iwasaki >Assignee: Masatake Iwasaki >Priority: Major > Labels: pull-request-available > Fix For: 2.10.1 > > Time Spent: 50m > Remaining Estimate: 0h > > hbase.version must be updated to address CVE-2018-8025 on branch-2.10. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17953) S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket configuration for encryption algorithm
[ https://issues.apache.org/jira/browse/HADOOP-17953?focusedWorklogId=661048=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-661048 ] ASF GitHub Bot logged work on HADOOP-17953: --- Author: ASF GitHub Bot Created on: 06/Oct/21 16:31 Start Date: 06/Oct/21 16:31 Worklog Time Spent: 10m Work Description: mehakmeet commented on pull request #3525: URL: https://github.com/apache/hadoop/pull/3525#issuecomment-936631689 CC : @steveloughran -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 661048) Time Spent: 20m (was: 10m) > S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket > configuration for encryption algorithm > - > > Key: HADOOP-17953 > URL: https://issues.apache.org/jira/browse/HADOOP-17953 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Labels: pull-request-available > Time Spent: 20m > Remaining Estimate: 0h > > ITestS3AFileContextStatistics uses the conf.get() method to get the global > configuration for encryption algorithm and keys, but the per-bucket > configuration would be ignored in this case. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] mehakmeet commented on pull request #3525: HADOOP-17953. S3A: Tests to lookup global or per-bucket configuration for encryption algorithm
mehakmeet commented on pull request #3525: URL: https://github.com/apache/hadoop/pull/3525#issuecomment-936631689 CC : @steveloughran -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
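The per-bucket override mechanism at issue in HADOOP-17953 follows S3A's `fs.s3a.bucket.BUCKET.OPTION` naming convention: a per-bucket key, when present, takes precedence over the global `fs.s3a.OPTION` key. A minimal sketch of that lookup order, using a plain map in place of Hadoop's `Configuration` (the `lookup` helper is illustrative, not the actual S3A utility method):

```java
import java.util.HashMap;
import java.util.Map;

public class BucketConfigSketch {
    /** Resolve a per-bucket option, falling back to the global key.
     *  Follows the fs.s3a.bucket.BUCKET.OPTION convention; this helper
     *  name is hypothetical, standing in for the real S3A resolution code. */
    static String lookup(Map<String, String> conf, String bucket, String option) {
        String perBucket = conf.get("fs.s3a.bucket." + bucket + "." + option);
        return perBucket != null ? perBucket : conf.get("fs.s3a." + option);
    }

    static Map<String, String> sampleConf() {
        Map<String, String> conf = new HashMap<>();
        // global default, plus an override for one bucket
        conf.put("fs.s3a.server-side-encryption-algorithm", "AES256");
        conf.put("fs.s3a.bucket.logs.server-side-encryption-algorithm", "SSE-KMS");
        return conf;
    }

    public static void main(String[] args) {
        Map<String, String> conf = sampleConf();
        // A plain conf.get() of the global key, as the test originally did,
        // would return AES256 for both buckets, silently ignoring the override.
        System.out.println(lookup(conf, "logs", "server-side-encryption-algorithm")); // SSE-KMS
        System.out.println(lookup(conf, "data", "server-side-encryption-algorithm")); // AES256
    }
}
```

This is why a test reading only the global key can pass or fail depending on whether the tester configured encryption globally or per bucket.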
[GitHub] [hadoop] goiri merged pull request #3518: HDFS-16254. Cleanup protobuf on exit of hdfs_allowSnapshot
goiri merged pull request #3518: URL: https://github.com/apache/hadoop/pull/3518 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #3459: YARN-10962. Do not extend from CapacitySchedulerTestBase when not needed.
hadoop-yetus commented on pull request #3459: URL: https://github.com/apache/hadoop/pull/3459#issuecomment-936541574 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 39s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 7 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 34m 49s | | trunk passed | | +1 :green_heart: | compile | 1m 4s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 0m 54s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | checkstyle | 0m 49s | | trunk passed | | +1 :green_heart: | mvnsite | 1m 0s | | trunk passed | | +1 :green_heart: | javadoc | 0m 49s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 0m 45s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 1m 52s | | trunk passed | | +1 :green_heart: | shadedclient | 20m 4s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 51s | | the patch passed | | +1 :green_heart: | compile | 0m 55s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javac | 0m 55s | | the patch passed | | +1 :green_heart: | compile | 0m 46s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | javac | 0m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | checkstyle | 0m 39s | | hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager: The patch generated 0 new + 148 unchanged - 2 fixed = 148 total (was 150) | | +1 :green_heart: | mvnsite | 0m 50s | | the patch passed | | +1 :green_heart: | javadoc | 0m 36s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 0m 34s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 1m 52s | | the patch passed | | +1 :green_heart: | shadedclient | 19m 44s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 95m 40s | | hadoop-yarn-server-resourcemanager in the patch passed. | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. | | | | 185m 34s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3459/11/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3459 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell | | uname | Linux d5488b232bf9 4.15.0-156-generic #163-Ubuntu SMP Thu Aug 19 23:31:58 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / f46e7308796b615bf19a3b3bdf1df273043353cb | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3459/11/testReport/ | | Max. process+thread count | 963 (vs. 
ulimit of 5500) | | modules | C: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager U: hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3459/11/console | | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please
[GitHub] [hadoop] symious commented on pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
symious commented on pull request #3524: URL: https://github.com/apache/hadoop/pull/3524#issuecomment-936503908 @ayushtkn Thanks for the review. Trunk uses guava from org.apache.hadoop.thirdparty.hadoop-shaded-guava:1.1.1; the guava cache there should have this bug fixed. Tested with the above dependency, the overhead is eliminated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] ayushtkn commented on pull request #3524: HDFS-16257. Set initialCapacity for guava cache to solve performance issue
ayushtkn commented on pull request #3524: URL: https://github.com/apache/hadoop/pull/3524#issuecomment-936419231 Can we have this change merged to trunk as well, and then backport it till 2.10? Any pointers, can this have some adverse effects on the trunk version of Guava? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
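The HDFS-16257 fix discussed above presizes the guava cache via `CacheBuilder.initialCapacity(n)`, so the cache's internal table is not repeatedly resized and rehashed while it fills. A dependency-free sketch of the same principle, using a JDK `ConcurrentHashMap` only as a stand-in for the Guava cache (the entry count and key names are illustrative, not taken from the patch):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class PresizedCacheSketch {
    /** Build a map presized for the expected entry count, so it is not
     *  repeatedly resized while warming up. Same idea as Guava's
     *  CacheBuilder.newBuilder().initialCapacity(n); the JDK map here
     *  only keeps the sketch free of third-party dependencies. */
    static ConcurrentMap<String, Long> newPresizedCache(int expectedEntries) {
        return new ConcurrentHashMap<>(expectedEntries);
    }

    public static void main(String[] args) {
        // presize for the known working-set size instead of the default
        ConcurrentMap<String, Long> cache = newPresizedCache(4096);
        for (int i = 0; i < 4096; i++) {
            cache.put("key-" + i, (long) i);
        }
        System.out.println(cache.size()); // 4096
    }
}
```

As the thread notes, on trunk the shaded guava (1.1.1) already contains the upstream cache fix, so the presizing mainly matters for the older guava on branch-2.10.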
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #2971: MAPREDUCE-7341. Intermediate Manifest Committer
hadoop-yetus removed a comment on pull request #2971: URL: https://github.com/apache/hadoop/pull/2971#issuecomment-924119867 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #2971: MAPREDUCE-7341. Intermediate Manifest Committer
hadoop-yetus removed a comment on pull request #2971: URL: https://github.com/apache/hadoop/pull/2971#issuecomment-918490144 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17945) JsonSerialization raises EOFException reading JSON data stored on google GCS
[ https://issues.apache.org/jira/browse/HADOOP-17945?focusedWorklogId=660963=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660963 ] ASF GitHub Bot logged work on HADOOP-17945: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:46 Start Date: 06/Oct/21 14:46 Worklog Time Spent: 10m Work Description: steveloughran commented on a change in pull request #3501: URL: https://github.com/apache/hadoop/pull/3501#discussion_r723354680 ## File path: hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestJsonSerialization.java ## @@ -181,5 +197,23 @@ public void testFileSystemEmptyPath() throws Throwable { } } + /** + * 0 byte file through the load(path, status) API will fail with an Review comment: probably -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 660963) Time Spent: 1.5h (was: 1h 20m) > JsonSerialization raises EOFException reading JSON data stored on google GCS > > > Key: HADOOP-17945 > URL: https://issues.apache.org/jira/browse/HADOOP-17945 > Project: Hadoop Common > Issue Type: Bug > Components: fs >Affects Versions: 3.3.1 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1.5h > Remaining Estimate: 0h > > The JsonSerialization<> load code doesn't work on gcs as it uses > "stream.available()" to fail with a meaningful message if the stream is empty. > But that method is meant to say how much data is available without blocking, > something we actually get wrong ourselves. 
Google GCS team didn't get it > wrong, so on a read(), if there's no local buffer, an EOFException is raised -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] steveloughran commented on a change in pull request #3501: HADOOP-17945. JsonSerialization raises EOFException reading JSON data stored on google GCS
steveloughran commented on a change in pull request #3501: URL: https://github.com/apache/hadoop/pull/3501#discussion_r723354680 ## File path: hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/util/TestJsonSerialization.java ## @@ -181,5 +197,23 @@ public void testFileSystemEmptyPath() throws Throwable { } } + /** + * 0 byte file through the load(path, status) API will fail with an Review comment: probably -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] steveloughran commented on a change in pull request #3501: HADOOP-17945. JsonSerialization raises EOFException reading JSON data stored on google GCS
steveloughran commented on a change in pull request #3501: URL: https://github.com/apache/hadoop/pull/3501#discussion_r723354403 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/JsonSerialization.java ## @@ -229,30 +235,44 @@ public T fromInstance(T instance) throws IOException { /** * Load from a Hadoop filesystem. - * There's a check for data availability after the file is open, by - * raising an EOFException if stream.available == 0. - * This allows for a meaningful exception without the round trip overhead - * of a getFileStatus call before opening the file. It may be brittle - * against an FS stream which doesn't return a value here, but the - * standard filesystems all do. - * JSON parsing and mapping problems - * are converted to IOEs. * @param fs filesystem * @param path path * @return a loaded object - * @throws IOException IO or JSON parse problems + * @throws PathIOException JSON parse problem + * @throws IOException IO problems */ public T load(FileSystem fs, Path path) throws IOException { -try (FSDataInputStream dataInputStream = fs.open(path)) { - // throw an EOF exception if there is no data available. - if (dataInputStream.available() == 0) { -throw new EOFException("No data in " + path); - } +return load(fs, path, null); + } + + /** + * Load from a Hadoop filesystem. + * If a file status is supplied, it's passed in to the openFile() + * call so that FS implementations can optimize their opening. + * @param fs filesystem + * @param path path + * @param status status of the file to open. 
+ * @return a loaded object + * @throws PathIOException JSON parse problem + * @throws EOFException file status references an empty file + * @throws IOException IO problems + */ + public T load(FileSystem fs, Path path, @Nullable FileStatus status) + throws IOException { + +if (status != null && status.getLen() == 0) { + throw new EOFException("No data in " + path); Review comment: it will read and whatever happens on the read "happens"; maybe a Json parse exception. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17945) JsonSerialization raises EOFException reading JSON data stored on google GCS
[ https://issues.apache.org/jira/browse/HADOOP-17945?focusedWorklogId=660961=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660961 ] ASF GitHub Bot logged work on HADOOP-17945: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:45 Start Date: 06/Oct/21 14:45 Worklog Time Spent: 10m Work Description: steveloughran commented on a change in pull request #3501: URL: https://github.com/apache/hadoop/pull/3501#discussion_r723354403 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/JsonSerialization.java ## @@ -229,30 +235,44 @@ public T fromInstance(T instance) throws IOException { /** * Load from a Hadoop filesystem. - * There's a check for data availability after the file is open, by - * raising an EOFException if stream.available == 0. - * This allows for a meaningful exception without the round trip overhead - * of a getFileStatus call before opening the file. It may be brittle - * against an FS stream which doesn't return a value here, but the - * standard filesystems all do. - * JSON parsing and mapping problems - * are converted to IOEs. * @param fs filesystem * @param path path * @return a loaded object - * @throws IOException IO or JSON parse problems + * @throws PathIOException JSON parse problem + * @throws IOException IO problems */ public T load(FileSystem fs, Path path) throws IOException { -try (FSDataInputStream dataInputStream = fs.open(path)) { - // throw an EOF exception if there is no data available. - if (dataInputStream.available() == 0) { -throw new EOFException("No data in " + path); - } +return load(fs, path, null); + } + + /** + * Load from a Hadoop filesystem. + * If a file status is supplied, it's passed in to the openFile() + * call so that FS implementations can optimize their opening. + * @param fs filesystem + * @param path path + * @param status status of the file to open. 
+ * @return a loaded object + * @throws PathIOException JSON parse problem + * @throws EOFException file status references an empty file + * @throws IOException IO problems + */ + public T load(FileSystem fs, Path path, @Nullable FileStatus status) + throws IOException { + +if (status != null && status.getLen() == 0) { + throw new EOFException("No data in " + path); Review comment: it will read and whatever happens on the read "happens"; maybe a Json parse exception. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 660961) Time Spent: 1h 20m (was: 1h 10m) > JsonSerialization raises EOFException reading JSON data stored on google GCS > > > Key: HADOOP-17945 > URL: https://issues.apache.org/jira/browse/HADOOP-17945 > Project: Hadoop Common > Issue Type: Bug > Components: fs >Affects Versions: 3.3.1 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1h 20m > Remaining Estimate: 0h > > The JsonSerialization<> load code doesn't work on gcs as it uses > "stream.available()" to fail with a meaningful message if the stream is empty. > But that method is meant to say how much data is available without blocking, > something we actually get wrong ourselves. Google GCS team didn't get it > wrong, so on a read(), if there's no local buffer, an EOFException is raised -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17945) JsonSerialization raises EOFException reading JSON data stored on google GCS
[ https://issues.apache.org/jira/browse/HADOOP-17945?focusedWorklogId=660960=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660960 ] ASF GitHub Bot logged work on HADOOP-17945: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:45 Start Date: 06/Oct/21 14:45 Worklog Time Spent: 10m Work Description: steveloughran commented on a change in pull request #3501: URL: https://github.com/apache/hadoop/pull/3501#discussion_r723354042 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/JsonSerialization.java ## @@ -229,30 +235,44 @@ public T fromInstance(T instance) throws IOException { /** * Load from a Hadoop filesystem. - * There's a check for data availability after the file is open, by - * raising an EOFException if stream.available == 0. - * This allows for a meaningful exception without the round trip overhead - * of a getFileStatus call before opening the file. It may be brittle - * against an FS stream which doesn't return a value here, but the - * standard filesystems all do. - * JSON parsing and mapping problems - * are converted to IOEs. * @param fs filesystem * @param path path * @return a loaded object - * @throws IOException IO or JSON parse problems + * @throws PathIOException JSON parse problem + * @throws IOException IO problems */ public T load(FileSystem fs, Path path) throws IOException { -try (FSDataInputStream dataInputStream = fs.open(path)) { - // throw an EOF exception if there is no data available. - if (dataInputStream.available() == 0) { Review comment: yes. Available is only reporting buffer size, not length of data -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org Issue Time Tracking --- Worklog Id: (was: 660960) Time Spent: 1h 10m (was: 1h) > JsonSerialization raises EOFException reading JSON data stored on google GCS > > > Key: HADOOP-17945 > URL: https://issues.apache.org/jira/browse/HADOOP-17945 > Project: Hadoop Common > Issue Type: Bug > Components: fs >Affects Versions: 3.3.1 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > The JsonSerialization<> load code doesn't work on gcs as it uses > "stream.available()" to fail with a meaningful message if the stream is empty. > But that method is meant to say how much data is available without blocking, > something we actually get wrong ourselves. Google GCS team didn't get it > wrong, so on a read(), if there's no local buffer, an EOFException is raised -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=660957&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660957 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:43 Start Date: 06/Oct/21 14:43 Worklog Time Spent: 10m Work Description: amahussein commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-936393905 I saw the error below but the Details link is not available. ``` Some checks were not successful 1 errored and 2 successful checks @asf-cloudbees-jenkins-ci-hadoop continuous-integration/jenkins/pr-merge — The build of this commit was aborted ``` Issue Time Tracking --- Worklog Id: (was: 660957) Time Spent: 4h (was: 3h 50m) > Replace Guava VisibleForTesting by Hadoop's own annotation in > hadoop-common-project modules > --- > > Key: HADOOP-17952 > URL: https://issues.apache.org/jira/browse/HADOOP-17952 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 4h > Remaining Estimate: 0h >
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=660955&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660955 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:40 Start Date: 06/Oct/21 14:40 Worklog Time Spent: 10m Work Description: virajjasani edited a comment on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-936374675 > Probably error on the build side. @virajjasani can you try another push? hopefully it works this time. @amahussein sorry, didn't get you. The latest build is already +1, it's with latest push only. https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/8/console Build#7 failed due to insufficient rebase but build#8 is the latest after proper rebase with trunk. Do we still want one more build? Issue Time Tracking --- Worklog Id: (was: 660955) Time Spent: 3h 50m (was: 3h 40m) > Replace Guava VisibleForTesting by Hadoop's own annotation in > hadoop-common-project modules > --- > > Key: HADOOP-17952 > URL: https://issues.apache.org/jira/browse/HADOOP-17952 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 3h 50m > Remaining Estimate: 0h >
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=660954&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660954 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:39 Start Date: 06/Oct/21 14:39 Worklog Time Spent: 10m Work Description: virajjasani edited a comment on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-936374675 > Probably error on the build side. @virajjasani can you try another push? hopefully it works this time. @amahussein sorry, didn't get you. The latest build is already +1, it's with latest push only. https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/8/console Build#7 failed due to insufficient rebase but build#8 is the latest. Do we still want one more build? Issue Time Tracking --- Worklog Id: (was: 660954) Time Spent: 3h 40m (was: 3.5h) > Replace Guava VisibleForTesting by Hadoop's own annotation in > hadoop-common-project modules > --- > > Key: HADOOP-17952 > URL: https://issues.apache.org/jira/browse/HADOOP-17952 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 3h 40m > Remaining Estimate: 0h >
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=660951&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660951 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:38 Start Date: 06/Oct/21 14:38 Worklog Time Spent: 10m Work Description: virajjasani commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-936374675 > Probably error on the build side. @virajjasani can you try another push? hopefully it works this time. @amahussein sorry, didn't get you. The latest build is already +1, it's with latest push only. https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/8/console Issue Time Tracking --- Worklog Id: (was: 660951) Time Spent: 3.5h (was: 3h 20m) > Replace Guava VisibleForTesting by Hadoop's own annotation in > hadoop-common-project modules > --- > > Key: HADOOP-17952 > URL: https://issues.apache.org/jira/browse/HADOOP-17952 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 3.5h > Remaining Estimate: 0h >
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=660936&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660936 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 14:21 Start Date: 06/Oct/21 14:21 Worklog Time Spent: 10m Work Description: amahussein commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-936327225 Probably error on the build side. @virajjasani can you try another push? hopefully it works this time. Issue Time Tracking --- Worklog Id: (was: 660936) Time Spent: 3h 20m (was: 3h 10m) > Replace Guava VisibleForTesting by Hadoop's own annotation in > hadoop-common-project modules > --- > > Key: HADOOP-17952 > URL: https://issues.apache.org/jira/browse/HADOOP-17952 > Project: Hadoop Common > Issue Type: Sub-task >Reporter: Viraj Jasani >Assignee: Viraj Jasani >Priority: Major > Labels: pull-request-available > Time Spent: 3h 20m > Remaining Estimate: 0h >
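The change this thread tracks swaps a third-party marker annotation for a project-local one, removing a compile-time dependency on Guava from call sites. A minimal sketch of such an annotation and one use of it — the definition below is illustrative, not the actual org.apache.hadoop.classification.VisibleForTesting source:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/**
 * Marks a member whose visibility was widened only so that tests can
 * reach it. Purely documentary: it carries no runtime behaviour.
 */
@Retention(RetentionPolicy.CLASS)
@Target({ElementType.TYPE, ElementType.METHOD,
         ElementType.FIELD, ElementType.CONSTRUCTOR})
@interface VisibleForTesting {
}

class Counter {
    private int count;

    void increment() { count++; }

    @VisibleForTesting
    int current() {  // package-private instead of private, purely for tests
        return count;
    }
}
```

With the annotation defined in-project, migrating a module is then a mechanical import swap from the Guava class to the local one, with no change in semantics.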
[jira] [Work started] (HADOOP-17953) S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket configuration for encryption algorithm
[ https://issues.apache.org/jira/browse/HADOOP-17953?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on HADOOP-17953 started by Mehakmeet Singh. > S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket > configuration for encryption algorithm > - > > Key: HADOOP-17953 > URL: https://issues.apache.org/jira/browse/HADOOP-17953 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > ITestS3AFileContextStatistics uses the conf.get() method to get the global > configuration for encryption algorithm and keys, but the per-bucket > configuration would be ignored in this case.
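The global-vs-per-bucket lookup the issue describes can be modelled with a plain map. A minimal sketch, assuming S3A's documented key pattern where `fs.s3a.bucket.<bucket>.<option>` overrides the global `fs.s3a.<option>` — the PerBucketConf class and its methods are illustrative, not the real Hadoop Configuration API:

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of per-bucket option resolution. A plain get() of the global
// key (what the test was doing) never sees the per-bucket override; a
// bucket-aware lookup checks the bucket-scoped key first.
class PerBucketConf {
    private final Map<String, String> props = new HashMap<>();

    void set(String key, String value) {
        props.put(key, value);
    }

    // Resolve "fs.s3a.<suffix>" for a bucket, preferring the per-bucket key.
    String getForBucket(String bucket, String suffix) {
        String perBucket = props.get("fs.s3a.bucket." + bucket + "." + suffix);
        return perBucket != null ? perBucket : props.get("fs.s3a." + suffix);
    }

    public static void main(String[] args) {
        PerBucketConf conf = new PerBucketConf();
        conf.set("fs.s3a.server-side-encryption-algorithm", "AES256");
        conf.set("fs.s3a.bucket.landsat-pds.server-side-encryption-algorithm",
                 "SSE-KMS");
        // Bucket with an override resolves to the per-bucket value:
        System.out.println(conf.getForBucket("landsat-pds",
                "server-side-encryption-algorithm")); // prints SSE-KMS
        // Any other bucket falls back to the global value:
        System.out.println(conf.getForBucket("other-bucket",
                "server-side-encryption-algorithm")); // prints AES256
    }
}
```

This is why a test that reads only the global key can pass or fail spuriously on a deployment configured per bucket: it asserts against a value the filesystem under test is not actually using.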
[jira] [Commented] (HADOOP-17254) Upgrade hbase to 1.4.13 on branch-2.10
[ https://issues.apache.org/jira/browse/HADOOP-17254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17425002#comment-17425002 ] Masatake Iwasaki commented on HADOOP-17254: --- I cherry-picked YARN-8936 to branch-3.2.3 too. > Upgrade hbase to 1.4.13 on branch-2.10 > -- > > Key: HADOOP-17254 > URL: https://issues.apache.org/jira/browse/HADOOP-17254 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Masatake Iwasaki >Assignee: Masatake Iwasaki >Priority: Major > Labels: pull-request-available > Fix For: 2.10.1 > > Time Spent: 50m > Remaining Estimate: 0h > > hbase.version must be updated to address CVE-2018-8025 on branch-2.10.
[jira] [Work logged] (HADOOP-17953) S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket configuration for encryption algorithm
[ https://issues.apache.org/jira/browse/HADOOP-17953?focusedWorklogId=660912&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660912 ] ASF GitHub Bot logged work on HADOOP-17953: --- Author: ASF GitHub Bot Created on: 06/Oct/21 13:26 Start Date: 06/Oct/21 13:26 Worklog Time Spent: 10m Work Description: mehakmeet opened a new pull request #3525: URL: https://github.com/apache/hadoop/pull/3525 Region: ap-south-1 ### CSE: `[WARNING] Tests run: 588, Failures: 0, Errors: 0, Skipped: 5` ```[ERROR] Errors: [ERROR] ITestS3AMiscOperationCost.testGetContentSummaryRoot:96->AbstractS3ACostTest.verifyMetrics:391->lambda$testGetContentSummaryRoot$1:96->getContentSummary:140 » TestTimedOut [ERROR] ITestS3AMiscOperationCost.testGetContentSummaryRoot:96->AbstractS3ACostTest.verifyMetrics:391->lambda$testGetContentSummaryRoot$1:96->getContentSummary:140 » TestTimedOut [INFO] [ERROR] Tests run: 1471, Failures: 0, Errors: 2, Skipped: 636 ``` ``` [ERROR] Errors: [ERROR] ITestS3AContractRootDir>AbstractContractRootDirectoryTest.testRecursiveRootListing:267 » TestTimedOut [INFO] [ERROR] Tests run: 151, Failures: 1, Errors: 1, Skipped: 28 ``` ### normal: `[WARNING] Tests run: 588, Failures: 0, Errors: 0, Skipped: 5` ``` [ERROR] Errors: [ERROR] ITestS3AMiscOperationCost.testGetContentSummaryRoot:96->AbstractS3ACostTest.verifyMetrics:391->lambda$testGetContentSummaryRoot$1:96->getContentSummary:140 » TestTimedOut [ERROR] ITestS3AMiscOperationCost.testGetContentSummaryRoot:96->AbstractS3ACostTest.verifyMetrics:391->lambda$testGetContentSummaryRoot$1:96->getContentSummary:140 » TestTimedOut [INFO] [ERROR] Tests run: 1471, Failures: 0, Errors: 2, Skipped: 466 ``` ``` [ERROR] ITestS3AContractRootDir>AbstractContractRootDirectoryTest.testRecursiveRootListing:267 » TestTimedOut [INFO] [ERROR] Tests run: 151, Failures: 1, Errors: 1, Skipped: 28 ``` ### s3guard `[WARNING] Tests run: 588, Failures: 0, Errors: 0, Skipped: 5` `[INFO] [ERROR] Tests run: 1471, Failures: 1,
Errors: 3, Skipped: 387` ` [INFO] [ERROR] Tests run: 11, Failures: 0, Errors: 3, Skipped: 0 ` ### CSE-S3Guard `[WARNING] Tests run: 588, Failures: 0, Errors: 0, Skipped: 5` `[WARNING] Tests run: 1471, Failures: 0, Errors: 0, Skipped: 1271` `[WARNING] Tests run: 151, Failures: 0, Errors: 0, Skipped: 92` Issue Time Tracking --- Worklog Id: (was: 660912) Remaining Estimate: 0h Time Spent: 10m > S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket > configuration for encryption algorithm > - > > Key: HADOOP-17953 > URL: https://issues.apache.org/jira/browse/HADOOP-17953 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > ITestS3AFileContextStatistics uses the conf.get() method to get the global > configuration for encryption algorithm and keys, but the per-bucket > configuration would be ignored in this case.
[jira] [Updated] (HADOOP-17953) S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket configuration for encryption algorithm
[ https://issues.apache.org/jira/browse/HADOOP-17953?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HADOOP-17953: Labels: pull-request-available (was: ) > S3A: ITestS3AFileContextStatistics test to lookup global or per-bucket > configuration for encryption algorithm > - > > Key: HADOOP-17953 > URL: https://issues.apache.org/jira/browse/HADOOP-17953 > Project: Hadoop Common > Issue Type: Bug > Components: fs/s3 >Affects Versions: 3.4.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > ITestS3AFileContextStatistics uses the conf.get() method to get the global > configuration for encryption algorithm and keys, but the per-bucket > configuration would be ignored in this case.
[jira] [Commented] (HADOOP-17254) Upgrade hbase to 1.4.13 on branch-2.10
[ https://issues.apache.org/jira/browse/HADOOP-17254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17424986#comment-17424986 ] Brahma Reddy Battula commented on HADOOP-17254: --- {quote} I think we should cherry-pick YARN-8936 to branch-3.2 rather than this (HADOOP-17254) if applicable. {quote} Yes {quote}[~brahmareddy] Are you OK to push it to branch-3.2.3 too? {quote} Yes, Please. > Upgrade hbase to 1.4.13 on branch-2.10 > -- > > Key: HADOOP-17254 > URL: https://issues.apache.org/jira/browse/HADOOP-17254 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Masatake Iwasaki >Assignee: Masatake Iwasaki >Priority: Major > Labels: pull-request-available > Fix For: 2.10.1 > > Time Spent: 50m > Remaining Estimate: 0h > > hbase.version must be updated to address CVE-2018-8025 on branch-2.10.
[jira] [Updated] (HADOOP-17954) org.apache.spark.SparkException: Task failed while writing rows S3
[ https://issues.apache.org/jira/browse/HADOOP-17954?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] sudarshan updated HADOOP-17954: --- Description: I am trying to run spark job (1.6.0) which reads rows from HBASE and does some transformation and finally writes to S3 . Some time i can notice error because of time out . Task is able to write to S3 but at last stage it fails Here is the error details Job aborted due to stage failure: Task 1074 in stage 1.0 failed 4 times, most recent failure: Lost task 1074.3 in stage 1.0 (TID 2162, abcd.ecom.bigdata.int.abcd.com, executor 18): org.apache.spark.SparkException: Task failed while writing rowsJob aborted due to stage failure: Task 1074 in stage 1.0 failed 4 times, most recent failure: Lost task 1074.3 in stage 1.0 (TID 2162, abcd.ecom.bigdata.int.abcd.com, executor 18): org.apache.spark.SparkException: Task failed while writing rows at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:417) at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:148) at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:148) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at org.apache.spark.scheduler.Task.run(Task.scala:89) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748)Caused by: org.apache.hadoop.fs.s3a.AWSS3IOException: saving output on 
common/hbaseHistory/metadataSept100621/_temporary/_attempt_202110060911_0001_m_001074_3/year=2021/month=09/submitDate=2021-09-08T04%3a58%3a47Z/part-r-01074-205c8b21-7840-4985-bb0e-65ed787c337d.snappy.parquet: com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed. (Service: Amazon S3; Status Code: 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS), S3 Extended Request ID: 4g08KHEDbFs5jueJqt9Snw7Xlmw5VeS1eCtJyAzp0fzHGinMhBntwIEhddJP7LLaS0teR3EAuOI=: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed. (Service: Amazon S3; Status Code: 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS) at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:143) at org.apache.hadoop.fs.s3a.S3AOutputStream.close(S3AOutputStream.java:123) at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72) at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106) at parquet.hadoop.ParquetFileWriter.end(ParquetFileWriter.java:470) at parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:112) at parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:112) at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetRelation.scala:101) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply$mcV$sp(WriterContainer.scala:387) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply(WriterContainer.scala:343) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply(WriterContainer.scala:343) at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1278) at 
org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:409) ... 8 more Suppressed: java.lang.NullPointerException at parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:152) at parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:111) at parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:112) at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetRelation.scala:101) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$5.apply$mcV$sp(WriterContainer.scala:411) at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1287) ... 9 moreCaused by: com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed. (Service: Amazon S3; Status Code: 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS), S3
[jira] [Updated] (HADOOP-17954) org.apache.spark.SparkException: Task failed while writing rows S3
[ https://issues.apache.org/jira/browse/HADOOP-17954?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] sudarshan updated HADOOP-17954: --- Summary: org.apache.spark.SparkException: Task failed while writing rows S3 (was: com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket connection to the server was not read from or written to within the timeout period) > org.apache.spark.SparkException: Task failed while writing rows S3 > -- > > Key: HADOOP-17954 > URL: https://issues.apache.org/jira/browse/HADOOP-17954 > Project: Hadoop Common > Issue Type: Bug > Components: hadoop-thirdparty >Affects Versions: 2.6.0 >Reporter: sudarshan >Priority: Major > > I am trying to run spark job (1.6.0) which reads rows from HBASE and does > some transformation and finally writes to S3 . > Some time i can notice error because of time out . > Here is the error details > > Job aborted due to stage failure: Task 1074 in stage 1.0 failed 4 times, most > recent failure: Lost task 1074.3 in stage 1.0 (TID 2162, > abcd.ecom.bigdata.int.abcd.com, executor 18): > org.apache.spark.SparkException: Task failed while writing rowsJob aborted > due to stage failure: Task 1074 in stage 1.0 failed 4 times, most recent > failure: Lost task 1074.3 in stage 1.0 (TID 2162, > abcd.ecom.bigdata.int.abcd.com, executor 18): > org.apache.spark.SparkException: Task failed while writing rows at > org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:417) > at > org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:148) > at > org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:148) > at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at > org.apache.spark.scheduler.Task.run(Task.scala:89) at > 
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242) at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) > at java.lang.Thread.run(Thread.java:748)Caused by: > org.apache.hadoop.fs.s3a.AWSS3IOException: saving output on > common/hbaseHistory/metadataSept100621/_temporary/_attempt_202110060911_0001_m_001074_3/year=2021/month=09/submitDate=2021-09-08T04%3a58%3a47Z/part-r-01074-205c8b21-7840-4985-bb0e-65ed787c337d.snappy.parquet: > com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket > connection to the server was not read from or written to within the timeout > period. Idle connections will be closed. (Service: Amazon S3; Status Code: > 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS), S3 Extended > Request ID: > 4g08KHEDbFs5jueJqt9Snw7Xlmw5VeS1eCtJyAzp0fzHGinMhBntwIEhddJP7LLaS0teR3EAuOI=: > Your socket connection to the server was not read from or written to within > the timeout period. Idle connections will be closed. 
(Service: Amazon S3; > Status Code: 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS) > at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:143) at > org.apache.hadoop.fs.s3a.S3AOutputStream.close(S3AOutputStream.java:123) at > org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72) > at > org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106) at > parquet.hadoop.ParquetFileWriter.end(ParquetFileWriter.java:470) at > parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:112) > at parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:112) at > org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetRelation.scala:101) > at > org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply$mcV$sp(WriterContainer.scala:387) > at > org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply(WriterContainer.scala:343) > at > org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply(WriterContainer.scala:343) > at > org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1278) > at > org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:409) > ... 8 more Suppressed: java.lang.NullPointerException at > parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:152) > at > parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:111)
[jira] [Created] (HADOOP-17954) com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket connection to the server was not read from or written to within the timeout period
sudarshan created HADOOP-17954: -- Summary: com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket connection to the server was not read from or written to within the timeout period Key: HADOOP-17954 URL: https://issues.apache.org/jira/browse/HADOOP-17954 Project: Hadoop Common Issue Type: Bug Components: hadoop-thirdparty Affects Versions: 2.6.0 Reporter: sudarshan I am trying to run a Spark job (1.6.0) that reads rows from HBase, does some transformation, and finally writes to S3. Sometimes I see an error caused by a timeout. Here are the error details: Job aborted due to stage failure: Task 1074 in stage 1.0 failed 4 times, most recent failure: Lost task 1074.3 in stage 1.0 (TID 2162, abcd.ecom.bigdata.int.abcd.com, executor 18): org.apache.spark.SparkException: Task failed while writing rowsJob aborted due to stage failure: Task 1074 in stage 1.0 failed 4 times, most recent failure: Lost task 1074.3 in stage 1.0 (TID 2162, abcd.ecom.bigdata.int.abcd.com, executor 18): org.apache.spark.SparkException: Task failed while writing rows at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:417) at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:148) at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:148) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at org.apache.spark.scheduler.Task.run(Task.scala:89) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748)Caused by: org.apache.hadoop.fs.s3a.AWSS3IOException: saving output on 
common/hbaseHistory/metadataSept100621/_temporary/_attempt_202110060911_0001_m_001074_3/year=2021/month=09/submitDate=2021-09-08T04%3a58%3a47Z/part-r-01074-205c8b21-7840-4985-bb0e-65ed787c337d.snappy.parquet: com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed. (Service: Amazon S3; Status Code: 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS), S3 Extended Request ID: 4g08KHEDbFs5jueJqt9Snw7Xlmw5VeS1eCtJyAzp0fzHGinMhBntwIEhddJP7LLaS0teR3EAuOI=: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed. (Service: Amazon S3; Status Code: 400; Error Code: RequestTimeout; Request ID: 5J85XRNF10W16ZJS) at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:143) at org.apache.hadoop.fs.s3a.S3AOutputStream.close(S3AOutputStream.java:123) at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72) at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106) at parquet.hadoop.ParquetFileWriter.end(ParquetFileWriter.java:470) at parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:112) at parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:112) at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetRelation.scala:101) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply$mcV$sp(WriterContainer.scala:387) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply(WriterContainer.scala:343) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$4.apply(WriterContainer.scala:343) at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1278) at 
org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:409) ... 8 more Suppressed: java.lang.NullPointerException at parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:152) at parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:111) at parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:112) at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetRelation.scala:101) at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer$$anonfun$writeRows$5.apply$mcV$sp(WriterContainer.scala:411) at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1287) ... 9 moreCaused by:
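The 400 RequestTimeout above typically means the upload stalled long enough that S3 closed the idle socket; note from the stack trace that the old S3AOutputStream uploads the whole Parquet file inside close(). For anyone hitting the same failure, these are the usual client-side knobs in core-site.xml. The values below are illustrative only (not from this report), and not every property exists in a build as old as the shaded CDH 2.6 client shown in the trace:

```xml
<!-- Illustrative S3A tuning, values are examples only.
     Property availability varies by Hadoop/CDH version. -->
<property>
  <name>fs.s3a.connection.timeout</name>
  <value>200000</value> <!-- socket timeout, in milliseconds -->
</property>
<property>
  <name>fs.s3a.attempts.maximum</name>
  <value>20</value> <!-- retry failed S3 requests more aggressively -->
</property>
<property>
  <name>fs.s3a.connection.maximum</name>
  <value>100</value> <!-- connection pool size; helps avoid idle-closed sockets under load -->
</property>
<property>
  <name>fs.s3a.fast.upload</name>
  <value>true</value> <!-- upload in blocks as data is written, instead of one big PUT on close() -->
</property>
```

Newer Hadoop releases also replace S3AOutputStream with an incremental block uploader, which avoids the single large PUT that this trace fails in.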
[jira] [Commented] (HADOOP-17254) Upgrade hbase to 1.4.13 on branch-2.10
[ https://issues.apache.org/jira/browse/HADOOP-17254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17424953#comment-17424953 ] Masatake Iwasaki commented on HADOOP-17254: --- I pushed the cherry-pick of YARN-8936 to branch-3.2 since I saw no issues with {{mvn test}} and {{mvn test -Dhbase.profile=2.0}} under hadoop-yarn-server-timelineservice-hbase. [~brahmareddy] Are you OK with pushing it to branch-3.2.3 too? > Upgrade hbase to 1.4.13 on branch-2.10 > -- > > Key: HADOOP-17254 > URL: https://issues.apache.org/jira/browse/HADOOP-17254 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Masatake Iwasaki >Assignee: Masatake Iwasaki >Priority: Major > Labels: pull-request-available > Fix For: 2.10.1 > > Time Spent: 50m > Remaining Estimate: 0h > > hbase.version must be updated to address CVE-2018-8025 on branch-2.10. -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17254) Upgrade hbase to 1.4.13 on branch-2.10
[ https://issues.apache.org/jira/browse/HADOOP-17254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17424950#comment-17424950 ] Masatake Iwasaki commented on HADOOP-17254: --- [~ananysin] [~brahmareddy] This issue was filed in the release process of 2.10.1 since YARN-8936 was not applicable to branch-2.10. I did not consider branch-3.2 at that time. I think we should cherry-pick YARN-8936 to branch-3.2 rather than this (HADOOP-17254) if applicable. > Upgrade hbase to 1.4.13 on branch-2.10 > -- > > Key: HADOOP-17254 > URL: https://issues.apache.org/jira/browse/HADOOP-17254 > Project: Hadoop Common > Issue Type: Improvement >Reporter: Masatake Iwasaki >Assignee: Masatake Iwasaki >Priority: Major > Labels: pull-request-available > Fix For: 2.10.1 > > Time Spent: 50m > Remaining Estimate: 0h > > hbase.version must be updated to address CVE-2018-8025 on branch-2.10.
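A dependency bump like HADOOP-17254 is essentially a one-line change to a Maven version property; a sketch of the kind of edit involved, assuming the property lives in hadoop-project/pom.xml as on trunk:

```xml
<!-- hadoop-project/pom.xml (location assumed): move the HBase dependency
     used by the timelineservice-hbase modules to a release containing
     the CVE-2018-8025 fix -->
<hbase.version>1.4.13</hbase.version>
```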
[GitHub] [hadoop] prasad-acit closed pull request #3491: HDFS-16239. XAttr#toString doesnt print the attribute value in readab…
prasad-acit closed pull request #3491: URL: https://github.com/apache/hadoop/pull/3491 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
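The PR closed above targets HDFS-16239, where XAttr#toString renders the attribute's byte[] value unreadably (the default array toString). As a purely hypothetical sketch of the idea, not the actual patch, one readable rendering is to hex-encode the value; the class and method names below are illustrative inventions:

```java
// Hypothetical sketch for HDFS-16239: print an xattr's byte[] value readably
// instead of the default Object/array toString(). Not the actual patch.
import java.nio.charset.StandardCharsets;

public class XAttrToStringDemo {

    /** Hex-encode a value so binary xattrs remain printable. */
    static String toHex(byte[] value) {
        StringBuilder sb = new StringBuilder("0x");
        for (byte b : value) {
            sb.append(String.format("%02x", b & 0xff)); // mask to get unsigned hex
        }
        return sb.toString();
    }

    /** Readable rendering in the spirit of XAttr#toString (illustrative format). */
    static String describe(String ns, String name, byte[] value) {
        return "XAttr [ns=" + ns + ", name=" + name
                + ", value=" + (value == null ? "null" : toHex(value)) + "]";
    }

    public static void main(String[] args) {
        System.out.println(describe("user", "a1", "abc".getBytes(StandardCharsets.UTF_8)));
    }
}
```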
[jira] [Work logged] (HADOOP-17952) Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
[ https://issues.apache.org/jira/browse/HADOOP-17952?focusedWorklogId=660845=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-660845 ] ASF GitHub Bot logged work on HADOOP-17952: --- Author: ASF GitHub Bot Created on: 06/Oct/21 10:58 Start Date: 06/Oct/21 10:58 Worklog Time Spent: 10m Work Description: hadoop-yetus commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-935970468 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 3s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 38s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 23m 51s | | trunk passed | | +1 :green_heart: | compile | 22m 58s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 19m 33s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | checkstyle | 1m 21s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 56s | | trunk passed | | +1 :green_heart: | javadoc | 3m 15s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 3m 35s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 5m 33s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 39s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 26s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 55s | | the patch passed | | +1 :green_heart: | compile | 24m 32s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javac | 24m 32s | | the patch passed | | +1 :green_heart: | compile | 19m 46s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | javac | 19m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 20s | | hadoop-common-project: The patch generated 0 new + 2501 unchanged - 3 fixed = 2501 total (was 2504) | | +1 :green_heart: | mvnsite | 3m 49s | | the patch passed | | +1 :green_heart: | xml | 0m 6s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 3m 11s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 3m 37s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 6m 32s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 26s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 35s | | hadoop-auth in the patch passed. | | +1 :green_heart: | unit | 17m 7s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 0m 50s | | hadoop-nfs in the patch passed. | | +1 :green_heart: | unit | 3m 37s | | hadoop-kms in the patch passed. | | +1 :green_heart: | unit | 1m 23s | | hadoop-registry in the patch passed. | | +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. 
| | | | 245m 3s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3503 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml spotbugs checkstyle | | uname | Linux c506492bbb11 4.15.0-147-generic #151-Ubuntu SMP Fri Jun 18 19:21:19 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / fafbd50133e9e43569e92a3c9e694dfd29527be2 | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | |
[GitHub] [hadoop] hadoop-yetus commented on pull request #3503: HADOOP-17952. Replace Guava VisibleForTesting by Hadoop's own annotation in hadoop-common-project modules
hadoop-yetus commented on pull request #3503: URL: https://github.com/apache/hadoop/pull/3503#issuecomment-935970468 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 1m 3s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 3s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +0 :ok: | mvndep | 15m 38s | | Maven dependency ordering for branch | | +1 :green_heart: | mvninstall | 23m 51s | | trunk passed | | +1 :green_heart: | compile | 22m 58s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 19m 33s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | checkstyle | 1m 21s | | trunk passed | | +1 :green_heart: | mvnsite | 3m 56s | | trunk passed | | +1 :green_heart: | javadoc | 3m 15s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 3m 35s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 5m 33s | | trunk passed | | +1 :green_heart: | shadedclient | 23m 39s | | branch has no errors when building and testing our client artifacts. 
| _ Patch Compile Tests _ | | +0 :ok: | mvndep | 0m 26s | | Maven dependency ordering for patch | | +1 :green_heart: | mvninstall | 2m 55s | | the patch passed | | +1 :green_heart: | compile | 24m 32s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javac | 24m 32s | | the patch passed | | +1 :green_heart: | compile | 19m 46s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | javac | 19m 46s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. | | +1 :green_heart: | checkstyle | 1m 20s | | hadoop-common-project: The patch generated 0 new + 2501 unchanged - 3 fixed = 2501 total (was 2504) | | +1 :green_heart: | mvnsite | 3m 49s | | the patch passed | | +1 :green_heart: | xml | 0m 6s | | The patch has no ill-formed XML file. | | +1 :green_heart: | javadoc | 3m 11s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | javadoc | 3m 37s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | spotbugs | 6m 32s | | the patch passed | | +1 :green_heart: | shadedclient | 22m 26s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 3m 35s | | hadoop-auth in the patch passed. | | +1 :green_heart: | unit | 17m 7s | | hadoop-common in the patch passed. | | +1 :green_heart: | unit | 0m 50s | | hadoop-nfs in the patch passed. | | +1 :green_heart: | unit | 3m 37s | | hadoop-kms in the patch passed. | | +1 :green_heart: | unit | 1m 23s | | hadoop-registry in the patch passed. | | +1 :green_heart: | asflicense | 1m 7s | | The patch does not generate ASF License warnings. 
| | | | 245m 3s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/8/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3503 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell xml spotbugs checkstyle | | uname | Linux c506492bbb11 4.15.0-147-generic #151-Ubuntu SMP Fri Jun 18 19:21:19 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / fafbd50133e9e43569e92a3c9e694dfd29527be2 | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3503/8/testReport/ | | Max. process+thread count | 1236 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-auth
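The HADOOP-17952 patch under review is a mechanical import swap: Guava's com.google.common.annotations.VisibleForTesting is replaced by Hadoop's own annotation (org.apache.hadoop.classification.VisibleForTesting, per this Jira series), so hadoop-common stops leaking a Guava type. A minimal sketch; the demo class and a local stand-in annotation are declared here only so the example compiles without Hadoop jars on the classpath:

```java
// Sketch of the HADOOP-17952 change. The real annotation is Hadoop's
// org.apache.hadoop.classification.VisibleForTesting (hadoop-annotations);
// a local stand-in is declared so this compiles without Hadoop jars.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

public class VisibleForTestingDemo {

    // Stand-in for the Hadoop annotation. Before the patch, code imported
    // com.google.common.annotations.VisibleForTesting from Guava; the patch
    // only swaps the import, the usage sites stay identical.
    @Retention(RetentionPolicy.CLASS)
    @Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD, ElementType.CONSTRUCTOR})
    @interface VisibleForTesting {}

    // Package-private helper widened only so tests can call it; the
    // annotation documents that production code must not.
    @VisibleForTesting
    static int parsePort(String hostPort) {
        return Integer.parseInt(hostPort.substring(hostPort.lastIndexOf(':') + 1));
    }

    public static void main(String[] args) {
        System.out.println(parsePort("namenode.example.com:8020"));
    }
}
```

Because both annotations are source-level markers with no runtime behavior, the swap cannot change semantics, which is why the Yetus run above shows only unchanged checkstyle/javac counts.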
[GitHub] [hadoop] hadoop-yetus commented on pull request #3523: HDFS-16251. Make hdfs_cat tool cross platform
hadoop-yetus commented on pull request #3523: URL: https://github.com/apache/hadoop/pull/3523#issuecomment-935934414 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Logfile | Comment | |::|--:|:|::|:---:| | +0 :ok: | reexec | 0m 48s | | Docker mode activated. | _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. | | +0 :ok: | codespell | 0m 0s | | codespell was not available. | | +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 4 new or modified test files. | _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 20m 26s | | trunk passed | | +1 :green_heart: | compile | 2m 49s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | compile | 2m 51s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | mvnsite | 0m 26s | | trunk passed | | +1 :green_heart: | shadedclient | 45m 23s | | branch has no errors when building and testing our client artifacts. | _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 19s | | the patch passed | | +1 :green_heart: | compile | 2m 39s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 | | +1 :green_heart: | cc | 2m 39s | | the patch passed | | +1 :green_heart: | golang | 2m 39s | | the patch passed | | +1 :green_heart: | javac | 2m 39s | | the patch passed | | +1 :green_heart: | compile | 2m 42s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | +1 :green_heart: | cc | 2m 42s | | the patch passed | | +1 :green_heart: | golang | 2m 42s | | the patch passed | | +1 :green_heart: | javac | 2m 42s | | the patch passed | | +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. 
| | +1 :green_heart: | mvnsite | 0m 18s | | the patch passed | | +1 :green_heart: | shadedclient | 18m 32s | | patch has no errors when building and testing our client artifacts. | _ Other Tests _ | | +1 :green_heart: | unit | 32m 2s | | hadoop-hdfs-native-client in the patch passed. | | +1 :green_heart: | asflicense | 0m 35s | | The patch does not generate ASF License warnings. | | | | 105m 41s | | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3523/3/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/3523 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit codespell golang | | uname | Linux 0b09ba10a7ce 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | dev-support/bin/hadoop.sh | | git revision | trunk / c57a63fbae71e00a84315643fdccf540f2baf026 | | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 | | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3523/3/testReport/ | | Max. process+thread count | 717 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3523/3/console | | versions | git=2.25.1 maven=3.6.3 | | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org | This message was automatically generated. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
[jira] [Commented] (HADOOP-17343) Upgrade aws-java-sdk to 1.11.901
[ https://issues.apache.org/jira/browse/HADOOP-17343?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17424892#comment-17424892 ] Ananya Singh commented on HADOOP-17343: --- Can we backport to branch-3.2? > Upgrade aws-java-sdk to 1.11.901 > > > Key: HADOOP-17343 > URL: https://issues.apache.org/jira/browse/HADOOP-17343 > Project: Hadoop Common > Issue Type: Sub-task > Components: build, fs/s3 >Affects Versions: 3.3.1, 3.4.0 >Reporter: Dongjoon Hyun >Assignee: Steve Loughran >Priority: Minor > Labels: pull-request-available > Fix For: 3.3.1 > > Time Spent: 4h > Remaining Estimate: 0h > > Upgrade AWS SDK to most recent version
[jira] [Commented] (HADOOP-17236) Bump up snakeyaml to 1.26 to mitigate CVE-2017-18640
[ https://issues.apache.org/jira/browse/HADOOP-17236?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17424888#comment-17424888 ] Ananya Singh commented on HADOOP-17236: --- Can we backport to branch-3.2? > Bump up snakeyaml to 1.26 to mitigate CVE-2017-18640 > > > Key: HADOOP-17236 > URL: https://issues.apache.org/jira/browse/HADOOP-17236 > Project: Hadoop Common > Issue Type: Bug >Reporter: Brahma Reddy Battula >Assignee: Brahma Reddy Battula >Priority: Major > Fix For: 3.3.1, 3.4.0 > > Attachments: HADOOP-17236-001-tempToRun.patch, HADOOP-17236-001.patch > > > Bump up snakeyaml to 1.26 to mitigate CVE-2017-18640
[jira] [Commented] (HADOOP-17371) Bump Jetty to the latest version 9.4.35
[ https://issues.apache.org/jira/browse/HADOOP-17371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17424884#comment-17424884 ] Ananya Singh commented on HADOOP-17371: --- Can we backport this to branch-3.2? > Bump Jetty to the latest version 9.4.35 > --- > > Key: HADOOP-17371 > URL: https://issues.apache.org/jira/browse/HADOOP-17371 > Project: Hadoop Common > Issue Type: Improvement >Affects Versions: 3.3.1, 3.4.0, 3.2.3 >Reporter: Wei-Chiu Chuang >Assignee: Wei-Chiu Chuang >Priority: Major > Labels: pull-request-available > Fix For: 3.3.1, 3.4.0 > > Time Spent: 5h 10m > Remaining Estimate: 0h > > The Hadoop 3 branches are on 9.4.20. We should update to the latest version: > 9.4.34
[jira] [Commented] (HADOOP-17834) Bump aliyun-sdk-oss to 3.13.0
[ https://issues.apache.org/jira/browse/HADOOP-17834?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17424882#comment-17424882 ] Ananya Singh commented on HADOOP-17834: --- Can we backport this to branch-3.2? > Bump aliyun-sdk-oss to 3.13.0 > - > > Key: HADOOP-17834 > URL: https://issues.apache.org/jira/browse/HADOOP-17834 > Project: Hadoop Common > Issue Type: Task >Reporter: Siyao Meng >Assignee: Siyao Meng >Priority: Major > Labels: pull-request-available > Fix For: 3.4.0, 3.3.2 > > Time Spent: 1h > Remaining Estimate: 0h > > Bump aliyun-sdk-oss to 3.13.0 in order to remove transitive dependency on > jdom 1.1. > Ref: > https://issues.apache.org/jira/browse/HADOOP-17820?focusedCommentId=17390206=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17390206.