[GitHub] [hadoop] hadoop-yetus commented on pull request #2051: HDFS-15385 Upgrade boost library
hadoop-yetus commented on pull request #2051: URL: https://github.com/apache/hadoop/pull/2051#issuecomment-638636989 (!) A patch to the testing environment has been detected. Re-executing against the patched versions to perform further tests. The console is at https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/2/console in case of problems. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] hadoop-yetus commented on pull request #2052: HDFS-15386 ReplicaNotFoundException keeps happening in DN after remov…
hadoop-yetus commented on pull request #2052: URL: https://github.com/apache/hadoop/pull/2052#issuecomment-638616022

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| +0 :ok: | reexec | 0m 30s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. |
||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 19m 1s | trunk passed |
| +1 :green_heart: | compile | 1m 9s | trunk passed |
| +1 :green_heart: | checkstyle | 0m 51s | trunk passed |
| +1 :green_heart: | mvnsite | 1m 19s | trunk passed |
| +1 :green_heart: | shadedclient | 15m 55s | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 46s | trunk passed |
| +0 :ok: | spotbugs | 3m 1s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 2m 58s | trunk passed |
||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 8s | the patch passed |
| +1 :green_heart: | compile | 1m 0s | the patch passed |
| +1 :green_heart: | javac | 1m 0s | the patch passed |
| -0 :warning: | checkstyle | 0m 39s | hadoop-hdfs-project/hadoop-hdfs: The patch generated 2 new + 124 unchanged - 1 fixed = 126 total (was 125) |
| +1 :green_heart: | mvnsite | 1m 10s | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 15m 4s | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 42s | the patch passed |
| +1 :green_heart: | findbugs | 3m 14s | the patch passed |
||| _ Other Tests _ |
| -1 :x: | unit | 96m 43s | hadoop-hdfs in the patch failed. |
| +1 :green_heart: | asflicense | 0m 42s | The patch does not generate ASF License warnings. |
| | | 164m 6s | |

| Reason | Tests |
|---:|:--|
| Failed junit tests | hadoop.hdfs.server.namenode.TestNameNodeRetryCacheMetrics |
| | hadoop.hdfs.TestReconstructStripedFileWithRandomECPolicy |
| | hadoop.hdfs.TestReconstructStripedFile |
| | hadoop.hdfs.tools.TestDFSAdminWithHA |
| | hadoop.hdfs.TestGetFileChecksum |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2052/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2052 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 0397bc78377d 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 704409d53bf |
| Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
| checkstyle | https://builds.apache.org/job/hadoop-multibranch/job/PR-2052/1/artifact/out/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt |
| unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-2052/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2052/1/testReport/ |
| Max. process+thread count | 4612 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2052/1/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.
[jira] [Commented] (HADOOP-16254) Add proxy address in IPC connection
[ https://issues.apache.org/jira/browse/HADOOP-16254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125543#comment-17125543 ] zhengchenyu commented on HADOOP-16254:

Additionally, InetSocketAddress is not the right type here. When the address is set to /10.10.10.10, the unit test testProxy fails. Constructing an InetAddress as below may be more appropriate.
{code}
proxyAddress = InetAddress.getByName(socks[0]);
{code}

> Add proxy address in IPC connection
> ---
>
> Key: HADOOP-16254
> URL: https://issues.apache.org/jira/browse/HADOOP-16254
> Project: Hadoop Common
> Issue Type: New Feature
> Components: ipc
> Reporter: Xiaoqiao He
> Assignee: Xiaoqiao He
> Priority: Major
> Attachments: HADOOP-16254.001.patch, HADOOP-16254.002.patch, HADOOP-16254.004.patch
>
> In order to support data locality of RBF, we need to add a new field for the client hostname in the RPC headers of Router protocol calls. clientHostname represents the hostname of the client and is forwarded by the Router to the NameNode to support data locality. See the [RBF Data Locality Design|https://issues.apache.org/jira/secure/attachment/12965092/RBF%20Data%20Locality%20Design.pdf] in HDFS-13248 and the [maillist vote|http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201904.mbox/%3CCAF3Ajax7hGxvowg4K_HVTZeDqC5H=3bfb7mv5sz5mgvadhv...@mail.gmail.com%3E].

-- This message was sent by Atlassian Jira (v8.3.4#803005)
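For context, the slash-prefixed form /10.10.10.10 is what InetSocketAddress (via InetAddress.toString()) produces for a literal address, and that string cannot be fed back into a resolver as-is. A minimal, self-contained sketch of the suggested direction (the helper name is hypothetical, not the patch's actual code):

```java
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.UnknownHostException;

// Hypothetical helper illustrating the comment above: the toString() form of
// an InetSocketAddress carries a leading '/' (e.g. "/10.10.10.10:8020"), so
// the slash must be stripped before handing the host to InetAddress.getByName().
public class ProxyAddressDemo {
    static InetAddress parseProxyAddress(String sock) {
        String host = sock.startsWith("/") ? sock.substring(1) : sock;
        try {
            // getByName accepts literal IPs without any DNS lookup
            return InetAddress.getByName(host);
        } catch (UnknownHostException e) {
            throw new IllegalArgumentException("cannot resolve " + sock, e);
        }
    }

    public static void main(String[] args) {
        InetSocketAddress isa = new InetSocketAddress("10.10.10.10", 8020);
        System.out.println(isa);  // note the slash-prefixed rendering
        System.out.println(parseProxyAddress("/10.10.10.10").getHostAddress()); // 10.10.10.10
    }
}
```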
[jira] [Commented] (HADOOP-16254) Add proxy address in IPC connection
[ https://issues.apache.org/jira/browse/HADOOP-16254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125541#comment-17125541 ] zhengchenyu commented on HADOOP-16254:

I debugged the code with HADOOP-16254.004.patch and found that 'result.setProxyAddress(proxyAddress);' in makeIpcConnectionContext is only called intermittently. Reviewing the related RPC code: one connection can carry more than one call to a given NameNode, because connections are cached in connections and reused via their ConnectionId. So if a client contacts nn1:port twice within a short interval, writeConnectionContext is only called the first time. That means only the first IpcConnectionContextProto is used, and the proxy address is ignored afterwards.

How is this problem triggered? It is probably not serious for a general HDFS client. But in the Router, the heartbeat service and proxied client requests share the same connection (they contact the same NameNode service). When the heartbeat service sets the IpcConnectionContextProto first, the proxy address is never passed to the NameNode.

> Add proxy address in IPC connection
> ---
>
> Key: HADOOP-16254
> URL: https://issues.apache.org/jira/browse/HADOOP-16254
> Project: Hadoop Common
> Issue Type: New Feature
> Components: ipc
> Reporter: Xiaoqiao He
> Assignee: Xiaoqiao He
> Priority: Major
> Attachments: HADOOP-16254.001.patch, HADOOP-16254.002.patch, HADOOP-16254.004.patch
>
> In order to support data locality of RBF, we need to add a new field for the client hostname in the RPC headers of Router protocol calls. clientHostname represents the hostname of the client and is forwarded by the Router to the NameNode to support data locality. See the [RBF Data Locality Design|https://issues.apache.org/jira/secure/attachment/12965092/RBF%20Data%20Locality%20Design.pdf] in HDFS-13248 and the [maillist vote|http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201904.mbox/%3CCAF3Ajax7hGxvowg4K_HVTZeDqC5H=3bfb7mv5sz5mgvadhv...@mail.gmail.com%3E].
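The caching behaviour described in the comment can be sketched in isolation. This is plain Java, not Hadoop's actual ipc.Client; all names are illustrative. The point is that a per-ConnectionId cache fixes the connection context (and thus the proxy address) at first use, so a later caller's proxy address is silently dropped:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the reported reuse problem: connections are cached by
// ConnectionId, and the connection context -- which carries the proxy
// address -- is written only when a connection is first established.
public class ConnectionCacheSketch {
    static class Connection {
        final String contextProxyAddress;  // fixed when the context is written
        Connection(String proxyAddress) { this.contextProxyAddress = proxyAddress; }
    }

    static final Map<String, Connection> connections = new HashMap<>();

    // The proxyAddress argument only takes effect if no cached connection exists.
    static Connection getConnection(String connectionId, String proxyAddress) {
        return connections.computeIfAbsent(connectionId, id -> new Connection(proxyAddress));
    }

    public static void main(String[] args) {
        getConnection("nn1:8020", "router-client-A");          // writes the context
        // A second caller (e.g. a proxied request after the heartbeat service)
        // reuses the cached connection; its proxy address is never sent.
        Connection reused = getConnection("nn1:8020", "router-client-B");
        System.out.println(reused.contextProxyAddress);        // router-client-A
    }
}
```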
[jira] [Commented] (HADOOP-13867) FilterFileSystem should override rename(.., options) to take effect of Rename options called via FilterFileSystem implementations
[ https://issues.apache.org/jira/browse/HADOOP-13867?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125509#comment-17125509 ] Kihwal Lee commented on HADOOP-13867:

I just found out this breaks {{LocalFileSystem}}. If the three-argument version of rename() is called against {{LocalFileSystem}}, it ends up calling {{FilterFileSystem}}'s method, which doesn't do the right thing (i.e. the crc files are not renamed). This is because {{ChecksumFileSystem}} does not override this method. When this Jira was done, all subclasses of {{FilterFileSystem}} should have been checked for this kind of side effect.

> FilterFileSystem should override rename(.., options) to take effect of Rename options called via FilterFileSystem implementations
> ---
>
> Key: HADOOP-13867
> URL:
> Project: Hadoop Common
> Issue Type: Bug
> Reporter: Vinayakumar B
> Assignee: Vinayakumar B
> Priority: Major
> Fix For: 2.9.0, 2.7.4, 3.0.0-alpha2, 2.8.2
>
> Attachments: HADOOP-13867-01.patch
>
> HDFS-8312 added the Rename.TO_TRASH option to add a security check before moving to trash. But for FilterFileSystem implementations, since rename(..options) is not overridden, the default FileSystem implementation is used, where the Rename.TO_TRASH option is not delegated to the NameNode.
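The delegation gap Kihwal describes can be reproduced in miniature without Hadoop on the classpath. Every class and method below is an illustrative analogue, not Hadoop's real API: a filter class forwards the option-taking rename() straight to the inner implementation, so a checksum-maintaining subclass that only overrides the two-argument rename() never gets a chance to move its .crc sidecar files:

```java
import java.util.ArrayList;
import java.util.List;

// Self-contained analogue of the HADOOP-13867 side effect on ChecksumFileSystem.
public class FilterRenameSketch {
    static class BaseFs {
        final List<String> log = new ArrayList<>();
        void rename(String src, String dst) { log.add("rename " + src + " -> " + dst); }
        // Option-taking variant; the default just ignores the options.
        void rename(String src, String dst, String options) { rename(src, dst); }
    }

    static class FilterFs extends BaseFs {
        final BaseFs inner;
        FilterFs(BaseFs inner) { this.inner = inner; }
        @Override void rename(String src, String dst) { inner.rename(src, dst); }
        // Analogue of the HADOOP-13867 fix: forward the option-taking variant too.
        @Override void rename(String src, String dst, String options) { inner.rename(src, dst, options); }
    }

    static class ChecksumFs extends FilterFs {
        ChecksumFs(BaseFs inner) { super(inner); }
        @Override void rename(String src, String dst) {
            super.rename(src, dst);
            super.rename(crc(src), crc(dst));  // keep the .crc sidecar in sync
        }
        // Bug illustrated: no override of rename(src, dst, options), so that
        // call bypasses this class entirely and the .crc file is left behind.
        static String crc(String p) { return "." + p + ".crc"; }
    }

    public static void main(String[] args) {
        ChecksumFs fs = new ChecksumFs(new BaseFs());
        fs.rename("a", "b");             // two renames: data file + crc file
        fs.rename("c", "d", "TO_TRASH"); // one rename only: crc file orphaned
        System.out.println(fs.inner.log);
    }
}
```

Running main shows three log entries: the option-taking call renames only the data file, which is exactly the "crc files are not renamed" symptom.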
[GitHub] [hadoop] brfrn169 opened a new pull request #2052: HDFS-15386 ReplicaNotFoundException keeps happening in DN after remov…
brfrn169 opened a new pull request #2052: URL: https://github.com/apache/hadoop/pull/2052 …ing multiple DN's data directories
[jira] [Commented] (HADOOP-17063) S3ABlockOutputStream.putObject looks stuck and never timeout
[ https://issues.apache.org/jira/browse/HADOOP-17063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125413#comment-17125413 ] Dyno commented on HADOOP-17063:

Log in the Spark executor:

Jun 3, 2020 @ 22:57:23.032  2020-06-03 22:57:23,032 INFO impl.MetricsSystemImpl: s3a-file-system metrics system shutdown complete.
Jun 3, 2020 @ 22:57:23.032  2020-06-03 22:57:23,032 INFO impl.MetricsSystemImpl: Stopping s3a-file-system metrics system...105
Jun 3, 2020 @ 22:57:23.032  2020-06-03 22:57:23,032 INFO impl.MetricsSystemImpl: s3a-file-system metrics system stopped.
Jun 3, 2020 @ 22:57:22.973  2020-06-03 22:57:22,973 INFO util.ShutdownHookManager: Deleting directory /var/data/spark-ff208630-1fc7-48bc-93a1-6bdf94921c64/spark-eb4613f4-a41a-4985-845b-34b58ae95c50
Jun 3, 2020 @ 22:57:22.972  2020-06-03 22:57:22,972 INFO util.ShutdownHookManager: Shutdown hook called
Jun 3, 2020 @ 22:57:22.964  2020-06-03 22:57:22,964 INFO storage.DiskBlockManager: Shutdown hook called
Jun 3, 2020 @ 22:57:22.960  2020-06-03 22:57:22,960 ERROR executor.CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM  <-- kill to make it move forward
Jun 3, 2020 @ 21:52:40.550  2020-06-03 21:52:40,549 INFO executor.Executor: Finished task 2927.0 in stage 12.0 (TID 31771). 4696 bytes result sent to driver
Jun 3, 2020 @ 21:52:40.541  2020-06-03 21:52:40,541 INFO output.FileOutputCommitter: Saved output of task 'attempt_20200603213232_0012_m_002927_31771' to s3a://com
Jun 3, 2020 @ 21:52:40.541  2020-06-03 21:52:40,541 INFO mapred.SparkHadoopMapRedUtil: attempt_20200603213232_0012_m_002927_31771: Committed
Jun 3, 2020 @ 21:52:34.972  2020-06-03 21:52:34,971 INFO executor.Executor: Finished task 2922.0 in stage 12.0 (TID 31766). 4696 bytes result sent to driver
Jun 3, 2020 @ 21:52:34.963  2020-06-03 21:52:34,962 INFO output.FileOutputCommitter: Saved out ...
> S3ABlockOutputStream.putObject looks stuck and never timeout
> ---
>
> Key: HADOOP-17063
> URL: https://issues.apache.org/jira/browse/HADOOP-17063
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 3.2.1
> Environment: hadoop 3.2.1
> spark 2.4.4
> Reporter: Dyno
> Priority: Major
>
> {code}
> sun.misc.Unsafe.park(Native Method)
> java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
> com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:523)
> com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:82)
> org.apache.hadoop.fs.s3a.S3ABlockOutputStream.putObject(S3ABlockOutputStream.java:446)
> org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:365)
> org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
> org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
> org.apache.parquet.hadoop.util.HadoopPositionOutputStream.close(HadoopPositionOutputStream.java:64)
> org.apache.parquet.hadoop.ParquetFileWriter.end(ParquetFileWriter.java:685)
> org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:122)
> org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
> org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
> org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
> org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242)
> org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:248)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
> org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> org.apache.spark.scheduler.Task.run(Task.scala:123)
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
> java.util.concurrent.ThreadPoolExecutor.runWorker(Threa
[jira] [Created] (HADOOP-17063) S3ABlockOutputStream.putObject looks stuck and never timeout
Dyno created HADOOP-17063:

Summary: S3ABlockOutputStream.putObject looks stuck and never timeout
Key: HADOOP-17063
URL: https://issues.apache.org/jira/browse/HADOOP-17063
Project: Hadoop Common
Issue Type: Bug
Affects Versions: 3.2.1
Environment: hadoop 3.2.1, spark 2.4.4
Reporter: Dyno

{code}
sun.misc.Unsafe.park(Native Method)
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:523)
com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:82)
org.apache.hadoop.fs.s3a.S3ABlockOutputStream.putObject(S3ABlockOutputStream.java:446)
org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:365)
org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
org.apache.parquet.hadoop.util.HadoopPositionOutputStream.close(HadoopPositionOutputStream.java:64)
org.apache.parquet.hadoop.ParquetFileWriter.end(ParquetFileWriter.java:685)
org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:122)
org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242)
org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:248)
org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:170)
org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
org.apache.spark.scheduler.Task.run(Task.scala:123)
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
{code}

We are using Spark 2.4.4 with Hadoop 3.2.1 on Kubernetes (spark-operator), and sometimes we see this hang with the stack trace above. It looks like putObject never returns; we have to kill the executor to make the job move forward.
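The top of the stack shows the thread parked inside an unbounded Future.get(). The following is an illustrative sketch only (not S3A's actual code or a proposed patch): bounding the wait with get(timeout, unit) turns an indefinite stall into a TimeoutException the caller can handle, instead of requiring the executor to be killed.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Sketch of replacing an unbounded future.get() with a bounded wait.
public class BoundedWaitSketch {
    // Simulates an upload taking uploadMillis; waits at most timeoutMillis.
    static String uploadWithTimeout(long uploadMillis, long timeoutMillis) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            Future<String> upload = pool.submit(() -> {
                Thread.sleep(uploadMillis);  // stand-in for the PUT request
                return "done";
            });
            try {
                // Bounded wait instead of upload.get() with no time limit.
                return upload.get(timeoutMillis, TimeUnit.MILLISECONDS);
            } catch (TimeoutException e) {
                upload.cancel(true);         // interrupt the stuck worker
                return "timed out";
            }
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(uploadWithTimeout(50, 1_000));   // fast upload: done
        System.out.println(uploadWithTimeout(60_000, 100)); // stalled upload: timed out
    }
}
```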
[GitHub] [hadoop] hadoop-yetus commented on pull request #2026: HADOOP-17046. Support downstreams' existing Hadoop-rpc implementations using non-shaded protobuf classes
hadoop-yetus commented on pull request #2026: URL: https://github.com/apache/hadoop/pull/2026#issuecomment-638512027

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| +0 :ok: | reexec | 0m 44s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | No case conflicting files found. |
| +0 :ok: | shelldocs | 0m 1s | Shelldocs was not available. |
| +0 :ok: | prototool | 0m 0s | prototool was not available. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 23 new or modified test files. |
||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 0m 34s | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 19s | trunk passed |
| +1 :green_heart: | compile | 17m 5s | trunk passed |
| +1 :green_heart: | checkstyle | 3m 19s | trunk passed |
| +1 :green_heart: | mvnsite | 17m 19s | trunk passed |
| +1 :green_heart: | shadedclient | 14m 1s | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 5m 39s | trunk passed |
| +0 :ok: | spotbugs | 29m 21s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| -1 :x: | findbugs | 29m 16s | root in trunk has 3 extant findbugs warnings. |
||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 32s | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 27m 40s | the patch passed |
| +1 :green_heart: | compile | 16m 41s | the patch passed |
| -1 :x: | cc | 16m 41s | root generated 31 new + 131 unchanged - 31 fixed = 162 total (was 162) |
| -1 :x: | javac | 16m 41s | root generated 1 new + 1857 unchanged - 0 fixed = 1858 total (was 1857) |
| -0 :warning: | checkstyle | 3m 26s | root: The patch generated 12 new + 1400 unchanged - 13 fixed = 1412 total (was 1413) |
| +1 :green_heart: | mvnsite | 17m 29s | the patch passed |
| +1 :green_heart: | shellcheck | 0m 1s | There were no new shellcheck issues. |
| -1 :x: | whitespace | 0m 1s | The patch has 103 line(s) that end in whitespace. Use git apply --whitespace=fix <>. Refer https://git-scm.com/docs/git-apply |
| +1 :green_heart: | xml | 0m 4s | The patch has no ill-formed XML file. |
| +1 :green_heart: | shadedclient | 14m 13s | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 5m 38s | the patch passed |
| +1 :green_heart: | findbugs | 48m 51s | the patch passed |
||| _ Other Tests _ |
| -1 :x: | unit | 604m 55s | root in the patch failed. |
| -1 :x: | asflicense | 1m 52s | The patch generated 1 ASF License warnings. |
| | | 872m 53s | |

| Reason | Tests |
|---:|:--|
| Failed junit tests | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
| | hadoop.hdfs.TestGetFileChecksum |
| | hadoop.hdfs.server.namenode.TestAddStripedBlockInFBR |
| | hadoop.hdfs.server.datanode.TestBPOfferService |
| | hadoop.hdfs.TestReconstructStripedFile |
| | hadoop.hdfs.tools.TestDFSAdminWithHA |
| | hadoop.hdfs.TestReconstructStripedFileWithRandomECPolicy |
| | hadoop.tools.TestDistCpSync |
| | hadoop.tools.TestDistCpSystem |
| | hadoop.yarn.applications.distributedshell.TestDistributedShell |
| | hadoop.yarn.server.resourcemanager.scheduler.fair.TestFairSchedulerPreemption |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2026/3/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2026 |
| Optional Tests | dupname asflicense shellcheck shelldocs xml compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle cc prototool |
| uname | Linux fe8dfdd17c03 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / e8cb2ae409b |
| Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
| findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-2026/3/artifact/out/branch-findbugs-root-warnings.html |
| cc | https://builds.apache.org/job/hadoop-multibranch/job/PR-2026/3/artifact/out/diff-compile-cc-root.txt |
| javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-2026/3/artifact/out/diff-compile-javac-root.txt |
| checkstyle | https://builds.apache.org/job/hadoop-multibranch/job/PR-2026/3/artifact/out/diff-checkstyle-root.txt |
[jira] [Commented] (HADOOP-17029) ViewFS does not return correct user/group and ACL
[ https://issues.apache.org/jira/browse/HADOOP-17029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125378#comment-17125378 ] Uma Maheswara Rao G commented on HADOOP-17029:

Thank you [~abhishekd] for the reply! No worries. Thanks

> ViewFS does not return correct user/group and ACL
> ---
>
> Key: HADOOP-17029
> URL: https://issues.apache.org/jira/browse/HADOOP-17029
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs, viewfs
> Reporter: Abhishek Das
> Assignee: Abhishek Das
> Priority: Major
>
> When doing ls on a mount point parent, the returned user/group and ACL are incorrect: it always shows the user and group as the current user, with an arbitrary ACL, which could mislead any application depending on this API. cc [~cliang] [~virajith]
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125343#comment-17125343 ] Hadoop QA commented on HADOOP-17056:

| (/) *{color:green}+1 overall{color}* |

|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 2m 52s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} dupname {color} | {color:green} 0m 0s{color} | {color:green} No case conflicting files found. {color} |
| {color:blue}0{color} | {color:blue} shelldocs {color} | {color:blue} 0m 0s{color} | {color:blue} Shelldocs was not available. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} |
|| || || || {color:brown} trunk Compile Tests {color} ||
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 17m 56s{color} | {color:green} branch has no errors when building and testing our client artifacts. {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} hadolint {color} | {color:green} 0m 6s{color} | {color:green} There were no new hadolint issues. {color} |
| {color:green}+1{color} | {color:green} shellcheck {color} | {color:green} 0m 0s{color} | {color:green} There were no new shellcheck issues. {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 18m 4s{color} | {color:green} patch has no errors when building and testing our client artifacts. {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 40s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 40m 29s{color} | {color:black} {color} |

|| Subsystem || Report/Notes ||
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/PreCommit-HADOOP-Build/16968/artifact/out/Dockerfile |
| JIRA Issue | HADOOP-17056 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/13004757/HADOOP-17056-addendum.01.patch |
| Optional Tests | dupname asflicense hadolint shellcheck shelldocs |
| uname | Linux 7e129cd81229 4.15.0-74-generic #84-Ubuntu SMP Thu Dec 19 08:06:28 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 40d63e02f04 |
| Max. process+thread count | 301 (vs. ulimit of 5500) |
| modules | C: . U: . |
| Console output | https://builds.apache.org/job/PreCommit-HADOOP-Build/16968/console |
| versions | git=2.17.1 maven=3.6.0 shellcheck=0.4.6 hadolint=1.11.1-0-g0e692dd |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.
> shelldoc fails in hadoop-common
> ---
>
> Key: HADOOP-17056
> URL:
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Major
> Fix For: 3.3.1, 3.4.0
>
> Attachments: 2040.02.patch, 2040.03.patch, 2040.patch, HADOOP-17056-addendum.01.patch, HADOOP-17056-test-01.patch, HADOOP-17056-test-02.patch, HADOOP-17056-test-03.patch, HADOOP-17056.01.patch
>
> {noformat}
> [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common ---
> > ERROR: yetus-dl: gpg unable to import
> > /home/jenkins/jenkins-slave/workspace/PreCommit-HADOOP-Build/sourcedir/patchprocess/KEYS_YETUS
> > [INFO]
> > [INFO] BUILD FAILURE
> > [INFO]
> > [INFO] Total time: 9.377 s
> > [INFO] Finished at: 2020-05-28T17:37:41Z
> > [INFO]
> > [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (shelldocs) on project hadoop-common: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
> > [ERROR]
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > [ERROR]
> > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> > [ERROR] [Help 1] http://
[jira] [Commented] (HADOOP-17062) Fix shelldocs path in Jenkinsfile
[ https://issues.apache.org/jira/browse/HADOOP-17062?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125336#comment-17125336 ] Hudson commented on HADOOP-17062:

SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #18325 (See [https://builds.apache.org/job/Hadoop-trunk-Commit/18325/]) HADOOP-17062. Fix shelldocs path in Jenkinsfile (#2049) (github: rev 704409d53bf7ebf717a3c2e988ede80f623bbad3) * (edit) Jenkinsfile

> Fix shelldocs path in Jenkinsfile
> ---
>
> Key: HADOOP-17062
> URL: https://issues.apache.org/jira/browse/HADOOP-17062
> Project: Hadoop Common
> Issue Type: Bug
> Components: build
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Major
> Fix For: 2.9.3, 3.2.2, 2.10.1, 3.3.1, 3.4.0, 3.1.5
>
> Shelldocs check is not enabled in the precommit jobs.
> |{color:#FF}0{color}|{color:#FF}shelldocs{color}|{color:#FF}0m 1s{color}|{color:#FF}Shelldocs was not available.{color}|
> Console log: https://builds.apache.org/job/hadoop-multibranch/job/PR-2045/1/console
> {noformat}
> WARNING: shellcheck needs UTF-8 locale support. Forcing C.UTF-8.
> executable '/testptch/hadoop/dev-support/bin/shelldocs' for 'shelldocs' does not exist.
> {noformat}
[GitHub] [hadoop] hadoop-yetus commented on pull request #2046: HADOOP-16202 Enhance S3A openFile()
hadoop-yetus commented on pull request #2046: URL: https://github.com/apache/hadoop/pull/2046#issuecomment-638468900

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| +0 :ok: | reexec | 0m 27s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. |
| +0 :ok: | markdownlint | 0m 0s | markdownlint was not available. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 3 new or modified test files. |
||| _ trunk Compile Tests _ |
| +0 :ok: | mvndep | 0m 23s | Maven dependency ordering for branch |
| +1 :green_heart: | mvninstall | 20m 41s | trunk passed |
| +1 :green_heart: | compile | 17m 31s | trunk passed |
| +1 :green_heart: | checkstyle | 2m 48s | trunk passed |
| +1 :green_heart: | mvnsite | 2m 5s | trunk passed |
| +1 :green_heart: | shadedclient | 20m 26s | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 27s | trunk passed |
| +0 :ok: | spotbugs | 1m 7s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 3m 9s | trunk passed |
||| _ Patch Compile Tests _ |
| +0 :ok: | mvndep | 0m 22s | Maven dependency ordering for patch |
| +1 :green_heart: | mvninstall | 1m 21s | the patch passed |
| +1 :green_heart: | compile | 16m 52s | the patch passed |
| +1 :green_heart: | javac | 16m 52s | the patch passed |
| +1 :green_heart: | checkstyle | 2m 49s | the patch passed |
| +1 :green_heart: | mvnsite | 2m 7s | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 15m 24s | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 29s | the patch passed |
| +1 :green_heart: | findbugs | 3m 28s | the patch passed |
||| _ Other Tests _ |
| +1 :green_heart: | unit | 9m 19s | hadoop-common in the patch passed. |
| +1 :green_heart: | unit | 1m 28s | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 44s | The patch does not generate ASF License warnings. |
| | | 123m 5s | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2046/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2046 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle markdownlint |
| uname | Linux 643ca18e831c 4.15.0-101-generic #102-Ubuntu SMP Mon May 11 10:07:26 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 40d63e02f04 |
| Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2046/2/testReport/ |
| Max. process+thread count | 1332 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: . |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2046/2/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.
[jira] [Updated] (HADOOP-17062) Fix shelldocs path in Jenkinsfile
[ https://issues.apache.org/jira/browse/HADOOP-17062?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akira Ajisaka updated HADOOP-17062: --- Fix Version/s: 3.1.5 3.4.0 3.3.1 2.10.1 3.2.2 2.9.3 Resolution: Fixed Status: Resolved (was: Patch Available) Committed to all the active branches.
[jira] [Updated] (HADOOP-17062) Fix shelldocs path in Jenkinsfile
[ https://issues.apache.org/jira/browse/HADOOP-17062?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akira Ajisaka updated HADOOP-17062: --- Summary: Fix shelldocs path in Jenkinsfile (was: Fix "shelldocs was not available" warning in the precommit job)
[GitHub] [hadoop] aajisaka merged pull request #2049: HADOOP-17062. Fix shelldocs path in Jenkinsfile
aajisaka merged pull request #2049: URL: https://github.com/apache/hadoop/pull/2049
[GitHub] [hadoop] aajisaka commented on pull request #2049: HADOOP-17062. Fix shelldocs path in Jenkinsfile
aajisaka commented on pull request #2049: URL: https://github.com/apache/hadoop/pull/2049#issuecomment-638460715 Thank you, @iwasakims and @ayushtkn.
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125320#comment-17125320 ] Ayush Saxena commented on HADOOP-17056: --- +1 for addendum v01 patch. Thanx for taking care > shelldoc fails in hadoop-common > --- > > Key: HADOOP-17056 > URL: https://issues.apache.org/jira/browse/HADOOP-17056 > Project: Hadoop Common > Issue Type: Bug > Components: build >Reporter: Akira Ajisaka >Assignee: Akira Ajisaka >Priority: Major > Fix For: 3.3.1, 3.4.0 > > Attachments: 2040.02.patch, 2040.03.patch, 2040.patch, > HADOOP-17056-addendum.01.patch, HADOOP-17056-test-01.patch, > HADOOP-17056-test-02.patch, HADOOP-17056-test-03.patch, HADOOP-17056.01.patch > > > {noformat} > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > > ERROR: yetus-dl: gpg unable to import > > /home/jenkins/jenkins-slave/workspace/PreCommit-HADOOP-Build/sourcedir/patchprocess/KEYS_YETUS > > [INFO] > > > > [INFO] BUILD FAILURE > > [INFO] > > > > [INFO] Total time: 9.377 s > > [INFO] Finished at: 2020-05-28T17:37:41Z > > [INFO] > > > > [ERROR] Failed to execute goal > > org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (shelldocs) on project > > hadoop-common: Command execution failed. Process exited with an error: 1 > > (Exit value: 1) -> [Help 1] > > [ERROR] > > [ERROR] To see the full stack trace of the errors, re-run Maven with the > > -e switch. > > [ERROR] Re-run Maven using the -X switch to enable full debug logging. 
> > [ERROR] > > [ERROR] For more information about the errors and possible solutions, > > please read the following articles: > > [ERROR] [Help 1] > > http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException > {noformat} > * > https://builds.apache.org/job/PreCommit-HADOOP-Build/16957/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt > * > https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/155/artifact/out/patch-mvnsite-root.txt > * > https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/157/artifact/out/patch-mvnsite-root.txt
[GitHub] [hadoop] aajisaka commented on pull request #2051: HDFS-15385 Upgrade boost library
aajisaka commented on pull request #2051: URL: https://github.com/apache/hadoop/pull/2051#issuecomment-638459689 > Could someone with access to the CI machines please install the latest version of boost by running the following command The CI job runs in a Docker container, and the Dockerfile is at `dev-support/docker/Dockerfile`. You can install the required libraries there.
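As a hedged illustration of the suggestion above, a build-from-source Boost install in `dev-support/docker/Dockerfile` might look like the fragment below. The Boost version, download URL, and install prefix are illustrative assumptions, not what the Hadoop Dockerfile actually ended up using; consult the merged HDFS-15385 change for the real commands.

```dockerfile
# Hypothetical fragment for dev-support/docker/Dockerfile.
# Version, URL, and prefix are assumptions for illustration only.
RUN curl -L \
      https://boostorg.jfrog.io/artifactory/main/release/1.72.0/source/boost_1_72_0.tar.bz2 \
      -o /tmp/boost_1_72_0.tar.bz2 \
    && tar -xjf /tmp/boost_1_72_0.tar.bz2 -C /tmp \
    && cd /tmp/boost_1_72_0 \
    && ./bootstrap.sh --prefix=/usr/local \
    && ./b2 install \
    && rm -rf /tmp/boost_1_72_0 /tmp/boost_1_72_0.tar.bz2
```

Because precommit builds run inside this image, adding the dependency here makes it available to every CI run without touching the Jenkins hosts themselves.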
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125311#comment-17125311 ] Akira Ajisaka commented on HADOOP-17056: I found a typo while backporting this patch. Attached an addendum patch to fix the typo.
[jira] [Updated] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akira Ajisaka updated HADOOP-17056: --- Attachment: HADOOP-17056-addendum.01.patch
[jira] [Updated] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akira Ajisaka updated HADOOP-17056: --- Fix Version/s: 3.4.0 3.3.1
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125316#comment-17125316 ] Hadoop QA commented on HADOOP-17056: (!) A patch to the testing environment has been detected. Re-executing against the patched versions to perform further tests. The console is at https://builds.apache.org/job/PreCommit-HADOOP-Build/16968/console in case of problems.
[jira] [Commented] (HADOOP-17059) ArrayIndexOfboundsException in ViewFileSystem#listStatus
[ https://issues.apache.org/jira/browse/HADOOP-17059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125254#comment-17125254 ] Hadoop QA commented on HADOOP-17059: (/) **+1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| 0 | reexec | 1m 24s | Docker mode activated. |
||| _ Prechecks _ |
| +1 | dupname | 0m 0s | No case conflicting files found. |
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. |
||| _ trunk Compile Tests _ |
| +1 | mvninstall | 21m 56s | trunk passed |
| +1 | compile | 18m 22s | trunk passed |
| +1 | checkstyle | 0m 48s | trunk passed |
| +1 | mvnsite | 1m 23s | trunk passed |
| +1 | shadedclient | 17m 51s | branch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 0m 55s | trunk passed |
| 0 | spotbugs | 2m 12s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 | findbugs | 2m 10s | trunk passed |
||| _ Patch Compile Tests _ |
| +1 | mvninstall | 0m 52s | the patch passed |
| +1 | compile | 18m 41s | the patch passed |
| +1 | javac | 18m 41s | the patch passed |
| +1 | checkstyle | 0m 49s | the patch passed |
| +1 | mvnsite | 1m 23s | the patch passed |
| +1 | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 | shadedclient | 16m 4s | patch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 1m 0s | the patch passed |
| +1 | findbugs | 2m 34s | the patch passed |
||| _ Other Tests _ |
| +1 | unit | 10m 1s | hadoop-common in the patch passed. |
| +1 | asflicense | 0m 47s | The patch does not generate ASF License warnings. |
| | | 116m 53s | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/PreCommit-HADOOP-Build/16967/artifact/out/Dockerfile |
| JIRA Issue | HADOOP-17059 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/13004751/HADOOP-17059.001.patch |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 8acbfd543fc9 4.15.0-74-generic #84-Ubuntu SMP Thu Dec 19 08:06:28 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 40d63e02f04 |
| Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
| Test Results | https://builds.apache.org/job/PreCommit-HADOOP-Build/16967/testReport/ |
| Max. process+thread count | 3031 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https:
[jira] [Updated] (HADOOP-17016) Adding Common Counters in ABFS
[ https://issues.apache.org/jira/browse/HADOOP-17016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-17016: Fix Version/s: (was: 3.4.0) 3.3.1 > Adding Common Counters in ABFS > -- > > Key: HADOOP-17016 > URL: https://issues.apache.org/jira/browse/HADOOP-17016 > Project: Hadoop Common > Issue Type: Improvement > Components: fs/azure >Affects Versions: 3.3.0 >Reporter: Mehakmeet Singh >Assignee: Mehakmeet Singh >Priority: Major > Fix For: 3.3.1 > > > Common Counters to be added to ABFS: > |OP_CREATE| > |OP_OPEN| > |OP_GET_FILE_STATUS| > |OP_APPEND| > |OP_CREATE_NON_RECURSIVE| > |OP_DELETE| > |OP_EXISTS| > |OP_GET_DELEGATION_TOKEN| > |OP_LIST_STATUS| > |OP_MKDIRS| > |OP_RENAME| > |DIRECTORIES_CREATED| > |DIRECTORIES_DELETED| > |FILES_CREATED| > |FILES_DELETED| > |ERROR_IGNORED| > propose: > * Have an enum class to define all the counters. > * Have an Instrumentation class for making a MetricRegistry and adding all > the counters. > * Incrementing the counters in AzureBlobFileSystem. > * Integration and Unit tests to validate the counters.
[GitHub] [hadoop] steveloughran closed pull request #2050: Backport HADOOP-17016.
steveloughran closed pull request #2050: URL: https://github.com/apache/hadoop/pull/2050
[GitHub] [hadoop] steveloughran commented on pull request #2050: Backport HADOOP-17016.
steveloughran commented on pull request #2050: URL: https://github.com/apache/hadoop/pull/2050#issuecomment-638399922 +1, cherry-picked the patch from trunk into 3.3 now that you've tested it. Thanks!
[GitHub] [hadoop] piotte13 commented on pull request #2048: HADOOP-17061. Fix broken links in AWS documentation.
piotte13 commented on pull request #2048: URL: https://github.com/apache/hadoop/pull/2048#issuecomment-638370429 Makes sense, although when I search for the AWS documentation, the .md files are the first results to appear, which is quite confusing. We can close this PR then.
[GitHub] [hadoop] hadoop-yetus commented on pull request #2051: HDFS-15385 Upgrade boost library
hadoop-yetus commented on pull request #2051: URL: https://github.com/apache/hadoop/pull/2051#issuecomment-638344405 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Comment | |::|--:|:|:| | +0 :ok: | reexec | 0m 37s | Docker mode activated. | ||| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 9s | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. | | -1 :x: | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. | ||| _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 19m 13s | trunk passed | | +1 :green_heart: | compile | 1m 51s | trunk passed | | +1 :green_heart: | mvnsite | 0m 26s | trunk passed | | +1 :green_heart: | shadedclient | 36m 6s | branch has no errors when building and testing our client artifacts. | | +1 :green_heart: | javadoc | 0m 23s | trunk passed | ||| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 16s | the patch passed | | -1 :x: | compile | 0m 21s | hadoop-hdfs-native-client in the patch failed. | | -1 :x: | cc | 0m 21s | hadoop-hdfs-native-client in the patch failed. | | -1 :x: | golang | 0m 21s | hadoop-hdfs-native-client in the patch failed. | | -1 :x: | javac | 0m 21s | hadoop-hdfs-native-client in the patch failed. | | +1 :green_heart: | mvnsite | 0m 16s | the patch passed | | +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. | | +1 :green_heart: | shadedclient | 13m 42s | patch has no errors when building and testing our client artifacts. | | +1 :green_heart: | javadoc | 0m 18s | the patch passed | ||| _ Other Tests _ | | -1 :x: | unit | 0m 26s | hadoop-hdfs-native-client in the patch failed. | | +1 :green_heart: | asflicense | 0m 33s | The patch does not generate ASF License warnings. 
| | | | 57m 2s | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2051 | | Optional Tests | dupname asflicense compile cc mvnsite javac unit golang javadoc mvninstall shadedclient | | uname | Linux 216787dedfe4 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | personality/hadoop.sh | | git revision | trunk / 40d63e02f04 | | Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 | | compile | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt | | cc | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt | | golang | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt | | javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/artifact/out/patch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt | | unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-native-client.txt | | Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/testReport/ | | Max. process+thread count | 413 (vs. ulimit of 5500) | | modules | C: hadoop-hdfs-project/hadoop-hdfs-native-client U: hadoop-hdfs-project/hadoop-hdfs-native-client | | Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2051/1/console | | versions | git=2.17.1 maven=3.6.0 | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated. This is an automated message from the Apache Git Service. 
[jira] [Updated] (HADOOP-17059) ArrayIndexOfboundsException in ViewFileSystem#listStatus
[ https://issues.apache.org/jira/browse/HADOOP-17059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] hemanthboyina updated HADOOP-17059: --- Attachment: HADOOP-17059.001.patch Status: Patch Available (was: Open) > ArrayIndexOfboundsException in ViewFileSystem#listStatus > > > Key: HADOOP-17059 > URL: https://issues.apache.org/jira/browse/HADOOP-17059 > Project: Hadoop Common > Issue Type: Bug >Reporter: hemanthboyina >Assignee: hemanthboyina >Priority: Major > Attachments: HADOOP-17059.001.patch > > > In ViewFileSystem#listStatus, we get the group names of the UGI; if the group list is empty, indexing it throws an ArrayIndexOutOfBoundsException: > {code:java} > else { > result[i++] = new FileStatus(0, true, 0, 0, > creationTime, creationTime, PERMISSION_555, > ugi.getShortUserName(), ugi.getGroupNames()[0], > new Path(inode.fullPath).makeQualified( > myUri, null)); > } {code}
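The bug described above comes from indexing `ugi.getGroupNames()[0]` without checking whether the UGI has any groups. A minimal sketch of the defensive pattern is shown below; it is not the actual HADOOP-17059.001.patch, the `primaryGroup` helper is hypothetical, and the fallback-to-user-name convention is an assumption for illustration.

```java
// Illustrative guard against an empty group list (hypothetical helper,
// not the real HADOOP-17059 patch). When the UGI reports no groups,
// fall back to the short user name instead of indexing into an empty array.
public class GroupNameGuard {

    /**
     * Returns the primary group, or the user name when the group list is
     * null or empty (the case that triggered the ArrayIndexOutOfBoundsException).
     */
    static String primaryGroup(String user, String[] groupNames) {
        if (groupNames == null || groupNames.length == 0) {
            return user; // avoids AIOBE from groupNames[0]
        }
        return groupNames[0];
    }

    public static void main(String[] args) {
        // Normal case: the first group entry is used.
        System.out.println(primaryGroup("alice", new String[] {"staff", "admin"}));
        // Empty-group case: previously groupNames[0] would throw AIOBE.
        System.out.println(primaryGroup("bob", new String[0]));
    }
}
```

The same check can be applied inline at the `FileStatus` construction site quoted above, replacing the bare `ugi.getGroupNames()[0]` expression.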
[GitHub] [hadoop] GauthamBanasandra opened a new pull request #2051: HDFS-15385 Upgrade boost library
GauthamBanasandra opened a new pull request #2051: URL: https://github.com/apache/hadoop/pull/2051 ## NOTICE Please create an issue in ASF JIRA before opening a pull request, and you need to set the title of the pull request which starts with the corresponding JIRA issue number. (e.g. HADOOP-X. Fix a typo in YYY.) For more details, please see https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
[jira] [Commented] (HADOOP-16568) S3A FullCredentialsTokenBinding fails if local credentials are unset
[ https://issues.apache.org/jira/browse/HADOOP-16568?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17125100#comment-17125100 ] Hudson commented on HADOOP-16568: - SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #18324 (See [https://builds.apache.org/job/Hadoop-trunk-Commit/18324/]) HADOOP-16568. S3A FullCredentialsTokenBinding fails if local credentials (github: rev 40d63e02f04fb7477e25dd8ef4533da27a4229e3) * (edit) hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/auth/delegation/FullCredentialsTokenBinding.java > S3A FullCredentialsTokenBinding fails if local credentials are unset > > > Key: HADOOP-16568 > URL: https://issues.apache.org/jira/browse/HADOOP-16568 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Fix For: 3.3.1 > > > Not sure how this slipped by the automated tests, but it is happening on my > CLI. > # FullCredentialsTokenBinding fails on startup if there are no AWS keys in > the auth chain > # because it tries to load them in serviceStart, not deployUnbonded
[jira] [Resolved] (HADOOP-16568) S3A FullCredentialsTokenBinding fails if local credentials are unset
[ https://issues.apache.org/jira/browse/HADOOP-16568?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran resolved HADOOP-16568. - Fix Version/s: 3.3.1 Resolution: Fixed > S3A FullCredentialsTokenBinding fails if local credentials are unset > > > Key: HADOOP-16568 > URL: https://issues.apache.org/jira/browse/HADOOP-16568 > Project: Hadoop Common > Issue Type: Sub-task > Components: fs/s3 >Affects Versions: 3.3.0 >Reporter: Steve Loughran >Assignee: Steve Loughran >Priority: Major > Fix For: 3.3.1 > > > Not sure how this slipped by the automated tests, but it is happening on my > CLI. > # FullCredentialsTokenBinding fails on startup if there are no AWS keys in > the auth chain > # because it tries to load them in serviceStart, not deployUnbonded
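The root cause described in this issue is a lifecycle-ordering problem: credentials were loaded eagerly in `serviceStart`, so startup failed even when a delegation token would supply credentials later. A hedged sketch of the lazy-load idea in plain Java; class and method names are illustrative, not the Hadoop delegation-token API:

```java
import java.util.function.Supplier;

// Illustrative only: defer the credential lookup until first use, so that
// starting the service does not fail when the local auth chain is empty.
public class LazyCredentialSource {
    private final Supplier<String> loader;
    private String cached;

    public LazyCredentialSource(Supplier<String> loader) {
        this.loader = loader;
    }

    // The buggy pattern would call loader.get() here and fail at startup.
    public void start() {
        // intentionally does not touch credentials
    }

    // Credentials are resolved only when actually needed.
    public String credentials() {
        if (cached == null) {
            cached = loader.get();
        }
        return cached;
    }
}
```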
[GitHub] [hadoop] steveloughran commented on pull request #1441: HADOOP-16568. S3A FullCredentialsTokenBinding fails if local credentials are unset
steveloughran commented on pull request #1441: URL: https://github.com/apache/hadoop/pull/1441#issuecomment-638296386 thx. PS: larry - when are you going to get an interesting icon? I can give you a choice of urban street art if you so desire
[GitHub] [hadoop] steveloughran merged pull request #1441: HADOOP-16568. S3A FullCredentialsTokenBinding fails if local credentials are unset
steveloughran merged pull request #1441: URL: https://github.com/apache/hadoop/pull/1441
[GitHub] [hadoop] hadoop-yetus commented on pull request #2050: Backport HADOOP-17016.
hadoop-yetus commented on pull request #2050: URL: https://github.com/apache/hadoop/pull/2050#issuecomment-638250989

:confetti_ball: **+1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| +0 :ok: | reexec | 25m 37s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 1s | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 3 new or modified test files. |
||| _ branch-3.3 Compile Tests _ |
| +1 :green_heart: | mvninstall | 19m 44s | branch-3.3 passed |
| +1 :green_heart: | compile | 0m 32s | branch-3.3 passed |
| +1 :green_heart: | checkstyle | 0m 26s | branch-3.3 passed |
| +1 :green_heart: | mvnsite | 0m 35s | branch-3.3 passed |
| +1 :green_heart: | shadedclient | 14m 53s | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 27s | branch-3.3 passed |
| +0 :ok: | spotbugs | 0m 53s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 0m 50s | branch-3.3 passed |
||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 27s | the patch passed |
| +1 :green_heart: | compile | 0m 24s | the patch passed |
| +1 :green_heart: | javac | 0m 24s | the patch passed |
| -0 :warning: | checkstyle | 0m 15s | hadoop-tools/hadoop-azure: The patch generated 2 new + 3 unchanged - 0 fixed = 5 total (was 3) |
| +1 :green_heart: | mvnsite | 0m 27s | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 13m 52s | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 23s | the patch passed |
| +1 :green_heart: | findbugs | 0m 54s | the patch passed |
||| _ Other Tests _ |
| +1 :green_heart: | unit | 1m 15s | hadoop-azure in the patch passed. |
| +1 :green_heart: | asflicense | 0m 31s | The patch does not generate ASF License warnings. |
| | | | 83m 11s | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2050/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2050 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 4dd33a58902f 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | branch-3.3 / cf84bec |
| Default Java | Private Build-1.8.0_252-8u252-b09-1~16.04-b09 |
| checkstyle | https://builds.apache.org/job/hadoop-multibranch/job/PR-2050/1/artifact/out/diff-checkstyle-hadoop-tools_hadoop-azure.txt |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2050/1/testReport/ |
| Max. process+thread count | 422 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-azure U: hadoop-tools/hadoop-azure |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2050/1/console |
| versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] steveloughran commented on pull request #2046: HADOOP-16202 Enhance S3A openFile()
steveloughran commented on pull request #2046: URL: https://github.com/apache/hadoop/pull/2046#issuecomment-638223870 tested s3 london, unguarded
[GitHub] [hadoop] mehakmeet commented on pull request #2050: Backport HADOOP-17016.
mehakmeet commented on pull request #2050: URL: https://github.com/apache/hadoop/pull/2050#issuecomment-638185680 CC: @steveloughran
[GitHub] [hadoop] mehakmeet opened a new pull request #2050: Backport HADOOP-17016.
mehakmeet opened a new pull request #2050: URL: https://github.com/apache/hadoop/pull/2050

Contributed by: Mehakmeet Singh.

Test run: mvn -T 1C -Dparallel-tests=abfs clean verify
Region: East US, West US

Tests in patch:
```
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.272 s - in org.apache.hadoop.fs.azurebfs.TestAbfsStatistics
[INFO] Running org.apache.hadoop.fs.azurebfs.ITestAbfsStatistics
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.836 s - in org.apache.hadoop.fs.azurebfs.ITestAbfsStatistics
```

All Test run results:
```
[INFO] Results:
[INFO]
[INFO] Tests run: 60, Failures: 0, Errors: 0, Skipped: 0
```
```
[INFO] Results:
[INFO]
[WARNING] Tests run: 423, Failures: 0, Errors: 0, Skipped: 62
```
```
[INFO] Results:
[INFO]
[WARNING] Tests run: 206, Failures: 0, Errors: 0, Skipped: 29
```
[GitHub] [hadoop] hadoop-yetus commented on pull request #2019: HADOOP-17029. Return correct permission and owner for listing on internal directories in ViewFs
hadoop-yetus commented on pull request #2019: URL: https://github.com/apache/hadoop/pull/2019#issuecomment-638124172

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| +0 :ok: | reexec | 0m 27s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. |
||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 21m 23s | trunk passed |
| +1 :green_heart: | compile | 22m 12s | trunk passed |
| +1 :green_heart: | checkstyle | 0m 57s | trunk passed |
| +1 :green_heart: | mvnsite | 1m 49s | trunk passed |
| +1 :green_heart: | shadedclient | 21m 7s | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 1s | trunk passed |
| +0 :ok: | spotbugs | 2m 37s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 2m 33s | trunk passed |
||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 0s | the patch passed |
| +1 :green_heart: | compile | 21m 50s | the patch passed |
| -1 :x: | javac | 21m 50s | root generated 2 new + 1857 unchanged - 2 fixed = 1859 total (was 1859) |
| +1 :green_heart: | checkstyle | 0m 53s | the patch passed |
| +1 :green_heart: | mvnsite | 1m 38s | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 17m 22s | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 0s | the patch passed |
| +1 :green_heart: | findbugs | 2m 32s | the patch passed |
||| _ Other Tests _ |
| +1 :green_heart: | unit | 10m 40s | hadoop-common in the patch passed. |
| +1 :green_heart: | asflicense | 0m 52s | The patch does not generate ASF License warnings. |
| | | | 129m 44s | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/6/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2019 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux e99a4c65b0a2 4.15.0-101-generic #102-Ubuntu SMP Mon May 11 10:07:26 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / e8cb2ae409b |
| Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
| javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/6/artifact/out/diff-compile-javac-root.txt |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/6/testReport/ |
| Max. process+thread count | 2466 (vs. ulimit of 5500) |
| modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/6/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.
[jira] [Commented] (HADOOP-14566) Add seek support for SFTP FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-14566?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17124854#comment-17124854 ] Hudson commented on HADOOP-14566: - SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #18323 (See [https://builds.apache.org/job/Hadoop-trunk-Commit/18323/]) HADOOP-14566. Add seek support for SFTP FileSystem. (#1999) (github: rev 97c98ce531ccb27581cbb10260d7307b0ccd199c) * (edit) hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/contract/AbstractFSContractTestBase.java * (add) hadoop-common-project/hadoop-common/src/test/resources/contract/sftp.xml * (edit) hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/sftp/SFTPFileSystem.java * (add) hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/contract/sftp/SFTPContract.java * (edit) hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/contract/AbstractFSContract.java * (edit) hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/sftp/SFTPInputStream.java * (add) hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/contract/sftp/TestSFTPContractSeek.java > Add seek support for SFTP FileSystem > > > Key: HADOOP-14566 > URL: https://issues.apache.org/jira/browse/HADOOP-14566 > Project: Hadoop Common > Issue Type: Improvement > Components: fs >Reporter: Azhagu Selvan SP >Assignee: Mikhail Pryakhin >Priority: Minor > Fix For: 3.3.1 > > Attachments: HADOOP-14566.001.patch, HADOOP-14566.patch > > > This patch adds seek() method implementation for SFTP FileSystem and a unit > test for the same
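Conceptually, adding seek() to a protocol that only reads forward means repositioning the stream: skip ahead for forward seeks, reopen for backward ones. The following self-contained sketch illustrates that strategy over an in-memory stream; it is an illustration of the general technique, not the actual SFTPInputStream patch:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch of seek() over a forward-only stream: forward seeks skip bytes,
// backward seeks "reopen" the source (here, a new ByteArrayInputStream
// stands in for reopening the remote SFTP file) and skip from offset 0.
public class SeekableOverStream {
    private final byte[] data; // stands in for the remote file
    private InputStream in;
    private long pos;

    public SeekableOverStream(byte[] data) {
        this.data = data;
        this.in = new ByteArrayInputStream(data);
        this.pos = 0;
    }

    public void seek(long target) throws IOException {
        if (target < pos) {           // backward seek: reopen from the start
            in.close();
            in = new ByteArrayInputStream(data);
            pos = 0;
        }
        long toSkip = target - pos;   // forward seek: skip remaining bytes
        while (toSkip > 0) {
            long skipped = in.skip(toSkip);
            if (skipped <= 0) {
                throw new IOException("seek past end of stream");
            }
            toSkip -= skipped;
            pos += skipped;
        }
    }

    public int read() throws IOException {
        int b = in.read();
        if (b >= 0) {
            pos++;
        }
        return b;
    }

    public long getPos() {
        return pos;
    }
}
```

A real remote implementation would weigh the cost of reopening a connection against buffering, which is exactly what a contract test like TestSFTPContractSeek exercises.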
[GitHub] [hadoop] steveloughran commented on pull request #1679: HDFS-13934. Multipart uploaders to be created through FileSystem/FileContext.
steveloughran commented on pull request #1679: URL: https://github.com/apache/hadoop/pull/1679#issuecomment-638116319 need to rebase
[jira] [Updated] (HADOOP-14566) Add seek support for SFTP FileSystem
[ https://issues.apache.org/jira/browse/HADOOP-14566?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Steve Loughran updated HADOOP-14566: Fix Version/s: 3.3.1 Resolution: Fixed Status: Resolved (was: Patch Available) > Add seek support for SFTP FileSystem > > > Key: HADOOP-14566 > URL: https://issues.apache.org/jira/browse/HADOOP-14566 > Project: Hadoop Common > Issue Type: Improvement > Components: fs >Reporter: Azhagu Selvan SP >Assignee: Mikhail Pryakhin >Priority: Minor > Fix For: 3.3.1 > > Attachments: HADOOP-14566.001.patch, HADOOP-14566.patch > > > This patch adds seek() method implementation for SFTP FileSystem and a unit > test for the same
[GitHub] [hadoop] steveloughran commented on pull request #1999: HADOOP-14566. Add seek support for SFTP FileSystem.
steveloughran commented on pull request #1999: URL: https://github.com/apache/hadoop/pull/1999#issuecomment-638112276 +1, merged to trunk and branch-3.3 thanks!
[GitHub] [hadoop] steveloughran merged pull request #1999: HADOOP-14566. Add seek support for SFTP FileSystem.
steveloughran merged pull request #1999: URL: https://github.com/apache/hadoop/pull/1999
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #1041: HADOOP-15844. Tag S3GuardTool entry points as LimitedPrivate/Evolving
hadoop-yetus removed a comment on pull request #1041: URL: https://github.com/apache/hadoop/pull/1041#issuecomment-527284108

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| 0 | reexec | 251 | Docker mode activated. |
||| _ Prechecks _ |
| +1 | dupname | 0 | No case conflicting files found. |
| +1 | @author | 0 | The patch does not contain any @author tags. |
| -1 | test4tests | 0 | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
||| _ trunk Compile Tests _ |
| +1 | mvninstall | 1766 | trunk passed |
| +1 | compile | 47 | trunk passed |
| +1 | checkstyle | 38 | trunk passed |
| +1 | mvnsite | 51 | trunk passed |
| +1 | shadedclient | 978 | branch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 39 | trunk passed |
| 0 | spotbugs | 84 | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 | findbugs | 77 | trunk passed |
||| _ Patch Compile Tests _ |
| +1 | mvninstall | 43 | the patch passed |
| +1 | compile | 35 | the patch passed |
| +1 | javac | 35 | the patch passed |
| +1 | checkstyle | 22 | the patch passed |
| +1 | mvnsite | 44 | the patch passed |
| +1 | whitespace | 0 | The patch has no whitespace issues. |
| +1 | shadedclient | 1008 | patch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 31 | the patch passed |
| +1 | findbugs | 77 | the patch passed |
||| _ Other Tests _ |
| +1 | unit | 103 | hadoop-aws in the patch passed. |
| +1 | asflicense | 38 | The patch does not generate ASF License warnings. |
| | | | 4752 | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1041/11/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/1041 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 052be4ed554d 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 915cbc9 |
| Default Java | 1.8.0_222 |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-1041/11/testReport/ |
| Max. process+thread count | 412 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-1041/11/console |
| versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #1441: HADOOP-16568. S3A FullCredentialsTokenBinding fails if local credentials are unset
hadoop-yetus removed a comment on pull request #1441: URL: https://github.com/apache/hadoop/pull/1441#issuecomment-531357805

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| 0 | reexec | 1178 | Docker mode activated. |
||| _ Prechecks _ |
| +1 | dupname | 0 | No case conflicting files found. |
| +1 | @author | 0 | The patch does not contain any @author tags. |
| -1 | test4tests | 0 | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
||| _ trunk Compile Tests _ |
| +1 | mvninstall | 1196 | trunk passed |
| +1 | compile | 31 | trunk passed |
| +1 | checkstyle | 23 | trunk passed |
| +1 | mvnsite | 36 | trunk passed |
| +1 | shadedclient | 794 | branch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 25 | trunk passed |
| 0 | spotbugs | 69 | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 | findbugs | 67 | trunk passed |
||| _ Patch Compile Tests _ |
| +1 | mvninstall | 34 | the patch passed |
| +1 | compile | 33 | the patch passed |
| +1 | javac | 33 | the patch passed |
| +1 | checkstyle | 21 | the patch passed |
| +1 | mvnsite | 40 | the patch passed |
| +1 | whitespace | 0 | The patch has no whitespace issues. |
| +1 | shadedclient | 858 | patch has no errors when building and testing our client artifacts. |
| +1 | javadoc | 26 | the patch passed |
| +1 | findbugs | 71 | the patch passed |
||| _ Other Tests _ |
| +1 | unit | 84 | hadoop-aws in the patch passed. |
| +1 | asflicense | 28 | The patch does not generate ASF License warnings. |
| | | | 4644 | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | Client=19.03.1 Server=19.03.1 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1441/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/1441 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 83aa939dc532 4.15.0-60-generic #67-Ubuntu SMP Thu Aug 22 16:55:30 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 1843c46 |
| Default Java | 1.8.0_222 |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-1441/1/testReport/ |
| Max. process+thread count | 340 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-1441/1/console |
| versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.10.0 http://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] hadoop-yetus removed a comment on pull request #1441: HADOOP-16568. S3A FullCredentialsTokenBinding fails if local credentials are unset
hadoop-yetus removed a comment on pull request #1441: URL: https://github.com/apache/hadoop/pull/1441#issuecomment-596667837

:broken_heart: **-1 overall**

| Vote | Subsystem | Runtime | Comment |
|::|--:|:|:|
| +0 :ok: | reexec | 26m 1s | Docker mode activated. |
||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 21m 19s | trunk passed |
| +1 :green_heart: | compile | 0m 34s | trunk passed |
| +1 :green_heart: | checkstyle | 0m 27s | trunk passed |
| +1 :green_heart: | mvnsite | 0m 40s | trunk passed |
| +1 :green_heart: | shadedclient | 14m 58s | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 29s | trunk passed |
| +0 :ok: | spotbugs | 1m 2s | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 1m 0s | trunk passed |
||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 34s | the patch passed |
| +1 :green_heart: | compile | 0m 28s | the patch passed |
| +1 :green_heart: | javac | 0m 28s | the patch passed |
| +1 :green_heart: | checkstyle | 0m 18s | the patch passed |
| +1 :green_heart: | mvnsite | 0m 31s | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 14m 6s | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 0m 25s | the patch passed |
| +1 :green_heart: | findbugs | 1m 3s | the patch passed |
||| _ Other Tests _ |
| +1 :green_heart: | unit | 1m 12s | hadoop-aws in the patch passed. |
| +1 :green_heart: | asflicense | 0m 30s | The patch does not generate ASF License warnings. |
| | | | 85m 56s | |

| Subsystem | Report/Notes |
|--:|:-|
| Docker | Client=19.03.7 Server=19.03.7 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-1441/2/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/1441 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux 902b8f5a1e53 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | personality/hadoop.sh |
| git revision | trunk / 44afe11 |
| Default Java | 1.8.0_242 |
| Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-1441/2/testReport/ |
| Max. process+thread count | 459 (vs. ulimit of 5500) |
| modules | C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws |
| Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-1441/2/console |
| versions | git=2.7.4 maven=3.3.9 findbugs=3.1.0-RC1 |
| Powered by | Apache Yetus 0.11.1 https://yetus.apache.org |

This message was automatically generated.
[GitHub] [hadoop] Hexiaoqiao commented on a change in pull request #2047: HDFS-15383 Add support for router delegation token without watch
Hexiaoqiao commented on a change in pull request #2047: URL: https://github.com/apache/hadoop/pull/2047#discussion_r434421860 ## File path: hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/security/token/ZKDelegationTokenSecretManagerImpl.java ## @@ -19,38 +19,211 @@ package org.apache.hadoop.hdfs.server.federation.router.security.token; import org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenIdentifier; +import org.apache.hadoop.security.token.Token; import org.apache.hadoop.security.token.delegation.AbstractDelegationTokenIdentifier; import org.apache.hadoop.security.token.delegation.ZKDelegationTokenSecretManager; +import org.apache.hadoop.util.Time; +import org.apache.zookeeper.CreateMode; +import org.apache.zookeeper.KeeperException; +import org.apache.zookeeper.ZooKeeper; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.apache.hadoop.conf.Configuration; +import java.io.ByteArrayInputStream; +import java.io.DataInputStream; import java.io.IOException; +import java.util.HashSet; +import java.util.List; +import java.util.Set; +import java.util.concurrent.Executors; +import java.util.concurrent.ScheduledExecutorService; +import java.util.concurrent.TimeUnit; /** * Zookeeper based router delegation token store implementation. 
*/ public class ZKDelegationTokenSecretManagerImpl extends ZKDelegationTokenSecretManager { + public static final String ZK_DTSM_ROUTER_TOKEN_SYNC_INTERVAL = + "zk-dt-secret-manager.router.token.sync.interval"; + public static final int ZK_DTSM_ROUTER_TOKEN_SYNC_INTERVAL_DEFAULT = 5; + private static final Logger LOG = LoggerFactory.getLogger(ZKDelegationTokenSecretManagerImpl.class); - private Configuration conf = null; + private Configuration conf; + + private final ScheduledExecutorService scheduler = + Executors.newSingleThreadScheduledExecutor(); + + // Local cache of delegation tokens, used for depracating tokens from Review comment: depracating -> deprecating? ## File path: hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/security/token/ZKDelegationTokenSecretManagerImpl.java ## @@ -19,38 +19,211 @@ package org.apache.hadoop.hdfs.server.federation.router.security.token; import org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenIdentifier; +import org.apache.hadoop.security.token.Token; import org.apache.hadoop.security.token.delegation.AbstractDelegationTokenIdentifier; import org.apache.hadoop.security.token.delegation.ZKDelegationTokenSecretManager; +import org.apache.hadoop.util.Time; +import org.apache.zookeeper.CreateMode; +import org.apache.zookeeper.KeeperException; +import org.apache.zookeeper.ZooKeeper; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.apache.hadoop.conf.Configuration; +import java.io.ByteArrayInputStream; +import java.io.DataInputStream; import java.io.IOException; +import java.util.HashSet; +import java.util.List; +import java.util.Set; +import java.util.concurrent.Executors; +import java.util.concurrent.ScheduledExecutorService; +import java.util.concurrent.TimeUnit; /** * Zookeeper based router delegation token store implementation. 
*/ public class ZKDelegationTokenSecretManagerImpl extends ZKDelegationTokenSecretManager { + public static final String ZK_DTSM_ROUTER_TOKEN_SYNC_INTERVAL = + "zk-dt-secret-manager.router.token.sync.interval"; + public static final int ZK_DTSM_ROUTER_TOKEN_SYNC_INTERVAL_DEFAULT = 5; + private static final Logger LOG = LoggerFactory.getLogger(ZKDelegationTokenSecretManagerImpl.class); - private Configuration conf = null; + private Configuration conf; + + private final ScheduledExecutorService scheduler = + Executors.newSingleThreadScheduledExecutor(); + + // Local cache of delegation tokens, used for depracating tokens from + // currentTokenMap + private final Set localTokenCache = + new HashSet<>(); + // Native zk client for getting all tokens + private ZooKeeper zookeeper; + private final String TOKEN_PATH = "/" + zkClient.getNamespace() + + ZK_DTSM_TOKENS_ROOT; + // The flag used to issue an extra check before deletion + // Since cancel token and token remover thread use the same + // API here and one router could have a token that is renewed + // by another router, thus token remover should always check ZK + // to confirm whether it has been renewed or not + private ThreadLocal checkAgainstZkBeforeDeletion = + new ThreadLocal() { +@Override +protected Boolean initialValue() { + return true; +} + }; public ZKDelegationTokenSecretManagerImpl(Configuration conf) { super(conf); this.conf = conf; try { - super.startThreads(); + startThreads(); } catch (IOException e) { LOG.error("Error starting
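The per-thread `checkAgainstZkBeforeDeletion` flag quoted in the hunk above uses the pre-Java-8 anonymous-subclass idiom for `ThreadLocal`. A minimal, self-contained sketch of the same pattern using the Java 8 `ThreadLocal.withInitial` factory (class and method names here are illustrative, not the actual Hadoop code):

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch of the per-thread "check ZK before deletion" flag discussed in the
// review above, written with ThreadLocal.withInitial instead of an anonymous
// subclass. Names are illustrative, not Hadoop's real classes.
class DeletionFlagSketch {
    // Every thread starts out conservative: confirm against ZooKeeper
    // before deleting a token it believes has been cancelled or expired.
    static final ThreadLocal<Boolean> CHECK_BEFORE_DELETION =
        ThreadLocal.withInitial(() -> Boolean.TRUE);

    static boolean shouldCheckZk() {
        return CHECK_BEFORE_DELETION.get();
    }

    static void setCheckZk(boolean value) {
        CHECK_BEFORE_DELETION.set(value);
    }

    public static void main(String[] args) throws InterruptedException {
        setCheckZk(false);            // override on the current thread only
        AtomicBoolean seenByOther = new AtomicBoolean();
        Thread other = new Thread(() -> seenByOther.set(shouldCheckZk()));
        other.start();
        other.join();
        // The other thread still sees the initial value (true); the
        // override is confined to the thread that made it.
        System.out.println(shouldCheckZk() + " " + seenByOther.get()); // false true
    }
}
```

This matches the rationale in the comment: the token-remover thread and the cancel-token path share one deletion API, so the flag lets each thread independently decide whether an extra ZooKeeper check is required before deleting.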
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17124790#comment-17124790 ] Hudson commented on HADOOP-17056: - SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #18322 (See [https://builds.apache.org/job/Hadoop-trunk-Commit/18322/]) HADOOP-17056. shelldoc fails in hadoop-common. (#2045) (github: rev 9c290c08db4361de29f392b0569312c2623b8321) * (edit) dev-support/docker/Dockerfile_aarch64 * (edit) dev-support/bin/yetus-wrapper * (edit) dev-support/docker/Dockerfile > shelldoc fails in hadoop-common > --- > > Key: HADOOP-17056 > URL: https://issues.apache.org/jira/browse/HADOOP-17056 > Project: Hadoop Common > Issue Type: Bug > Components: build >Reporter: Akira Ajisaka >Assignee: Akira Ajisaka >Priority: Major > Attachments: 2040.02.patch, 2040.03.patch, 2040.patch, > HADOOP-17056-test-01.patch, HADOOP-17056-test-02.patch, > HADOOP-17056-test-03.patch, HADOOP-17056.01.patch > > > {noformat} > [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common --- > > ERROR: yetus-dl: gpg unable to import > > /home/jenkins/jenkins-slave/workspace/PreCommit-HADOOP-Build/sourcedir/patchprocess/KEYS_YETUS > > [INFO] > > > > [INFO] BUILD FAILURE > > [INFO] > > > > [INFO] Total time: 9.377 s > > [INFO] Finished at: 2020-05-28T17:37:41Z > > [INFO] > > > > [ERROR] Failed to execute goal > > org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (shelldocs) on project > > hadoop-common: Command execution failed. Process exited with an error: 1 > > (Exit value: 1) -> [Help 1] > > [ERROR] > > [ERROR] To see the full stack trace of the errors, re-run Maven with the > > -e switch. > > [ERROR] Re-run Maven using the -X switch to enable full debug logging. 
> > [ERROR] > > [ERROR] For more information about the errors and possible solutions, > > please read the following articles: > > [ERROR] [Help 1] > > http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException > {noformat} > * > https://builds.apache.org/job/PreCommit-HADOOP-Build/16957/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt > * > https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/155/artifact/out/patch-mvnsite-root.txt > * > https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/157/artifact/out/patch-mvnsite-root.txt -- This message was sent by Atlassian Jira (v8.3.4#803005) - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] aajisaka commented on pull request #2045: HADOOP-17056. shelldoc fails in hadoop-common.
aajisaka commented on pull request #2045: URL: https://github.com/apache/hadoop/pull/2045#issuecomment-638066397 I found a typo after committing this. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[GitHub] [hadoop] aajisaka merged pull request #2045: HADOOP-17056. shelldoc fails in hadoop-common.
aajisaka merged pull request #2045: URL: https://github.com/apache/hadoop/pull/2045
[GitHub] [hadoop] aajisaka commented on pull request #2045: HADOOP-17056. shelldoc fails in hadoop-common.
aajisaka commented on pull request #2045: URL: https://github.com/apache/hadoop/pull/2045#issuecomment-638064220 Thank you @iwasakims and @ayushtkn for your review.
[GitHub] [hadoop] vinayakumarb commented on pull request #2026: HADOOP-17046. Support downstreams' existing Hadoop-rpc implementations using non-shaded protobuf classes
vinayakumarb commented on pull request #2026: URL: https://github.com/apache/hadoop/pull/2026#issuecomment-638060134 Updated the PR with the review comment fixes, plus the changes below. `ProtobufRpcEngineProto.java` is generated using protobuf 2.5.0, which is not available for ARM. So I have added the generated `ProtobufRpcEngineProto.java` in a separate source directory, `hadoop-common\src\main\arm-java`, and added it to the sources in the ARM case, while still regenerating it with protobuf-maven-plugin on x86 platforms as before.
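The arrangement described above (a checked-in, pre-generated source directory activated only on ARM) can be wired up with a Maven profile. The following is a hypothetical sketch using build-helper-maven-plugin; the profile id, activation condition, and path are illustrative, not the exact POM from the PR:

```xml
<!-- Hypothetical sketch: on aarch64, where a protoc 2.5.0 binary is not
     available, add the checked-in generated sources instead of running
     protobuf-maven-plugin. Details are illustrative, not the PR's POM. -->
<profile>
  <id>aarch64</id>
  <activation>
    <os>
      <arch>aarch64</arch>
    </os>
  </activation>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>add-arm-java-sources</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>add-source</goal>
            </goals>
            <configuration>
              <sources>
                <source>${basedir}/src/main/arm-java</source>
              </sources>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```

Activating by `os.arch` keeps the default x86 build path untouched: only builds on ARM hosts pick up the pre-generated file.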
[GitHub] [hadoop] vinayakumarb commented on a change in pull request #2026: HADOOP-17046. Support downstreams' existing Hadoop-rpc implementations using non-shaded protobuf classes
vinayakumarb commented on a change in pull request #2026: URL: https://github.com/apache/hadoop/pull/2026#discussion_r434410613 ## File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/ProtobufRpcEngine.java ## @@ -433,7 +438,15 @@ public Server(Class protocolClass, Object protocolImpl, registerProtocolAndImpl(RPC.RpcKind.RPC_PROTOCOL_BUFFER, protocolClass, protocolImpl); } - + +@Override +protected RpcInvoker getServerRpcInvoker(RpcKind rpcKind) { + if (rpcKind == RpcKind.RPC_PROTOCOL_BUFFER) { +return RPC_INVOKER; + } + return super.getServerRpcInvoker(rpcKind); +} + Review comment: Done
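The override quoted in the hunk above is a simple short-circuit-then-delegate dispatch: return a dedicated invoker for one RPC kind, fall back to the parent's lookup for everything else. A minimal, self-contained sketch of the pattern (names are illustrative, not Hadoop's actual API):

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch of the dispatch pattern from the quoted hunk: a subclass
// short-circuits one RPC kind to its own invoker and delegates the rest
// to the parent's registry. Names are illustrative, not Hadoop's classes.
class RpcDispatchSketch {
    enum RpcKind { RPC_BUILTIN, RPC_PROTOCOL_BUFFER }

    interface RpcInvoker {
        String call(String request);
    }

    static class Server {
        private final Map<RpcKind, RpcInvoker> invokers =
            new EnumMap<>(RpcKind.class);

        Server() {
            // Parent knows only about the builtin kind.
            invokers.put(RpcKind.RPC_BUILTIN, req -> "builtin:" + req);
        }

        protected RpcInvoker getServerRpcInvoker(RpcKind kind) {
            return invokers.get(kind);
        }
    }

    static class ProtobufServer extends Server {
        private static final RpcInvoker RPC_INVOKER = req -> "protobuf:" + req;

        @Override
        protected RpcInvoker getServerRpcInvoker(RpcKind kind) {
            if (kind == RpcKind.RPC_PROTOCOL_BUFFER) {
                return RPC_INVOKER;               // short-circuit for protobuf
            }
            return super.getServerRpcInvoker(kind); // delegate everything else
        }
    }

    public static void main(String[] args) {
        ProtobufServer server = new ProtobufServer();
        System.out.println(
            server.getServerRpcInvoker(RpcKind.RPC_PROTOCOL_BUFFER).call("x"));
        System.out.println(
            server.getServerRpcInvoker(RpcKind.RPC_BUILTIN).call("x"));
    }
}
```

The design choice mirrors the diff: the subclass avoids mutating the parent's registry and instead intercepts lookup, so the protobuf invoker cannot be displaced by later registrations.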
[GitHub] [hadoop] hadoop-yetus commented on pull request #2019: HADOOP-17029. Return correct permission and owner for listing on internal directories in ViewFs
hadoop-yetus commented on pull request #2019: URL: https://github.com/apache/hadoop/pull/2019#issuecomment-638049727 :broken_heart: **-1 overall** | Vote | Subsystem | Runtime | Comment | |::|--:|:|:| | +0 :ok: | reexec | 0m 32s | Docker mode activated. | ||| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. | | +1 :green_heart: | test4tests | 0m 0s | The patch appears to include 1 new or modified test files. | ||| _ trunk Compile Tests _ | | +1 :green_heart: | mvninstall | 18m 40s | trunk passed | | +1 :green_heart: | compile | 16m 57s | trunk passed | | +1 :green_heart: | checkstyle | 0m 54s | trunk passed | | +1 :green_heart: | mvnsite | 1m 29s | trunk passed | | +1 :green_heart: | shadedclient | 16m 35s | branch has no errors when building and testing our client artifacts. | | +1 :green_heart: | javadoc | 1m 2s | trunk passed | | +0 :ok: | spotbugs | 2m 11s | Used deprecated FindBugs config; considering switching to SpotBugs. | | +1 :green_heart: | findbugs | 2m 9s | trunk passed | ||| _ Patch Compile Tests _ | | +1 :green_heart: | mvninstall | 0m 49s | the patch passed | | +1 :green_heart: | compile | 16m 17s | the patch passed | | -1 :x: | javac | 16m 17s | root generated 2 new + 1857 unchanged - 2 fixed = 1859 total (was 1859) | | -0 :warning: | checkstyle | 0m 54s | hadoop-common-project/hadoop-common: The patch generated 1 new + 93 unchanged - 0 fixed = 94 total (was 93) | | +1 :green_heart: | mvnsite | 1m 27s | the patch passed | | +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. | | +1 :green_heart: | shadedclient | 14m 16s | patch has no errors when building and testing our client artifacts. | | +1 :green_heart: | javadoc | 1m 2s | the patch passed | | +1 :green_heart: | findbugs | 2m 15s | the patch passed | ||| _ Other Tests _ | | +1 :green_heart: | unit | 9m 24s | hadoop-common in the patch passed. 
| | +1 :green_heart: | asflicense | 0m 52s | The patch does not generate ASF License warnings. | | | | 106m 41s | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/5/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2019 | | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle | | uname | Linux c93feccbbf4e 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | personality/hadoop.sh | | git revision | trunk / 6288e15118f | | Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 | | javac | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/5/artifact/out/diff-compile-javac-root.txt | | checkstyle | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/5/artifact/out/diff-checkstyle-hadoop-common-project_hadoop-common.txt | | Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/5/testReport/ | | Max. process+thread count | 1421 (vs. ulimit of 5500) | | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common | | Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2019/5/console | | versions | git=2.17.1 maven=3.6.0 findbugs=3.1.0-RC1 | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org - To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org For additional commands, e-mail: common-issues-h...@hadoop.apache.org
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17124739#comment-17124739 ] Ayush Saxena commented on HADOOP-17056: --- Thanx [~aajisaka] for fixing this. +1 for the latest patch
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17124727#comment-17124727 ] Hadoop QA commented on HADOOP-17056: | (/) *{color:green}+1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || | {color:blue}0{color} | {color:blue} reexec {color} | {color:blue} 1m 26s{color} | {color:blue} Docker mode activated. {color} | || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} dupname {color} | {color:green} 0m 0s{color} | {color:green} No case conflicting files found. {color} | | {color:blue}0{color} | {color:blue} shelldocs {color} | {color:blue} 0m 0s{color} | {color:blue} Shelldocs was not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} trunk Compile Tests {color} || | {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 20m 35s{color} | {color:green} branch has no errors when building and testing our client artifacts. {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:green}+1{color} | {color:green} hadolint {color} | {color:green} 0m 5s{color} | {color:green} There were no new hadolint issues. {color} | | {color:green}+1{color} | {color:green} shellcheck {color} | {color:green} 0m 1s{color} | {color:green} There were no new shellcheck issues. {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 15m 20s{color} | {color:green} patch has no errors when building and testing our client artifacts. 
{color} | || || || || {color:brown} Other Tests {color} || | {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 32s{color} | {color:green} The patch does not generate ASF License warnings. {color} | | {color:black}{color} | {color:black} {color} | {color:black} 38m 42s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/PreCommit-HADOOP-Build/16966/artifact/out/Dockerfile | | JIRA Issue | HADOOP-17056 | | JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/13004685/HADOOP-17056.01.patch | | Optional Tests | dupname asflicense shellcheck shelldocs hadolint | | uname | Linux 2d6473f8c3e4 4.15.0-91-generic #92-Ubuntu SMP Fri Feb 28 11:09:48 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | personality/hadoop.sh | | git revision | trunk / e8cb2ae409b | | Max. process+thread count | 320 (vs. ulimit of 5500) | | modules | C: . U: . | | Console output | https://builds.apache.org/job/PreCommit-HADOOP-Build/16966/console | | versions | git=2.17.1 maven=3.6.0 shellcheck=0.4.6 hadolint=1.11.1-0-g0e692dd | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated. 
[GitHub] [hadoop] hadoop-yetus commented on pull request #2049: HADOOP-17062. Fix shelldocs path in Jenkinsfile
hadoop-yetus commented on pull request #2049: URL: https://github.com/apache/hadoop/pull/2049#issuecomment-638010208 :confetti_ball: **+1 overall** | Vote | Subsystem | Runtime | Comment | |::|--:|:|:| | +0 :ok: | reexec | 1m 30s | Docker mode activated. | ||| _ Prechecks _ | | +1 :green_heart: | dupname | 0m 0s | No case conflicting files found. | | +1 :green_heart: | @author | 0m 0s | The patch does not contain any @author tags. | ||| _ trunk Compile Tests _ | | +1 :green_heart: | shadedclient | 16m 21s | branch has no errors when building and testing our client artifacts. | ||| _ Patch Compile Tests _ | | +1 :green_heart: | shellcheck | 0m 0s | There were no new shellcheck issues. | | +1 :green_heart: | shelldocs | 0m 15s | There were no new shelldocs issues. | | +1 :green_heart: | whitespace | 0m 0s | The patch has no whitespace issues. | | +1 :green_heart: | shadedclient | 16m 2s | patch has no errors when building and testing our client artifacts. | ||| _ Other Tests _ | | +1 :green_heart: | asflicense | 0m 31s | The patch does not generate ASF License warnings. | | | | 36m 21s | | | Subsystem | Report/Notes | |--:|:-| | Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2049/1/artifact/out/Dockerfile | | GITHUB PR | https://github.com/apache/hadoop/pull/2049 | | Optional Tests | dupname asflicense shellcheck shelldocs | | uname | Linux c7de8b8e5e7c 4.15.0-101-generic #102-Ubuntu SMP Mon May 11 10:07:26 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux | | Build tool | maven | | Personality | personality/hadoop.sh | | git revision | trunk / 6288e15118f | | Max. process+thread count | 308 (vs. ulimit of 5500) | | modules | C: . U: . | | Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2049/1/console | | versions | git=2.17.1 maven=3.6.0 shellcheck=0.4.6 | | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org | This message was automatically generated. 
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17124686#comment-17124686 ] Hadoop QA commented on HADOOP-17056: (!) A patch to the testing environment has been detected. Re-executing against the patched versions to perform further tests. The console is at https://builds.apache.org/job/PreCommit-HADOOP-Build/16966/console in case of problems.
[jira] [Updated] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akira Ajisaka updated HADOOP-17056: --- Attachment: HADOOP-17056.01.patch
[jira] [Commented] (HADOOP-17056) shelldoc fails in hadoop-common
[ https://issues.apache.org/jira/browse/HADOOP-17056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17124679#comment-17124679 ] Akira Ajisaka commented on HADOOP-17056: Removed the empty line change in hadoop-functions.sh in [https://github.com/apache/hadoop/pull/2045]. I'll upload the final patch here to help track this issue.
[jira] [Updated] (HADOOP-16761) KMSClientProvider does not work with client using ticket logged in externally
[ https://issues.apache.org/jira/browse/HADOOP-16761?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Jitendra Nath Pandey updated HADOOP-16761: -- Priority: Blocker (was: Major) > KMSClientProvider does not work with client using ticket logged in externally > -- > > Key: HADOOP-16761 > URL: https://issues.apache.org/jira/browse/HADOOP-16761 > Project: Hadoop Common > Issue Type: Bug >Reporter: Xiaoyu Yao >Assignee: Xiaoyu Yao >Priority: Blocker > > This is a regression from HDFS-13682, which checks not only the kerberos credential but also enforces that the login is non-external. This breaks client applications that need to access HDFS encrypted files using a kerberos ticket logged in externally in the ticket cache. >
[GitHub] [hadoop] aajisaka commented on pull request #2048: HADOOP-17061. Fix broken links in AWS documentation.
aajisaka commented on pull request #2048: URL: https://github.com/apache/hadoop/pull/2048#issuecomment-637999066 -1 The links are correct in the generated HTML file: https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html The HTML documents are generated from the markdown files, and users/developers are not expected to read the markdown files directly.