[
https://issues.apache.org/jira/browse/HDFS-15613?focusedWorklogId=495380&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-495380
]
ASF GitHub Bot logged work on HDFS-15613:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 05/Oct/20 15:06
Start Date: 05/Oct/20 15:06
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #2360:
URL: https://github.com/apache/hadoop/pull/2360#issuecomment-703693036
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 3m 30s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +1 :green_heart: | @author | 0m 1s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| -1 :x: | mvninstall | 28m 25s | [/branch-mvninstall-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2360/1/artifact/out/branch-mvninstall-root.txt) | root in trunk failed. |
| +1 :green_heart: | compile | 3m 5s | | trunk passed with JDK Ubuntu-11.0.8+10-post-Ubuntu-0ubuntu118.04.1 |
| +1 :green_heart: | compile | 2m 35s | | trunk passed with JDK Private Build-1.8.0_265-8u265-b01-0ubuntu2~18.04-b01 |
| +1 :green_heart: | checkstyle | 1m 33s | | trunk passed |
| +1 :green_heart: | mvnsite | 2m 33s | | trunk passed |
| +1 :green_heart: | shadedclient | 26m 26s | | branch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 15s | | trunk passed with JDK Ubuntu-11.0.8+10-post-Ubuntu-0ubuntu118.04.1 |
| +1 :green_heart: | javadoc | 1m 53s | | trunk passed with JDK Private Build-1.8.0_265-8u265-b01-0ubuntu2~18.04-b01 |
| +0 :ok: | spotbugs | 4m 36s | | Used deprecated FindBugs config; considering switching to SpotBugs. |
| +1 :green_heart: | findbugs | 4m 31s | | trunk passed |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 50s | | the patch passed |
| +1 :green_heart: | compile | 1m 38s | | the patch passed with JDK Ubuntu-11.0.8+10-post-Ubuntu-0ubuntu118.04.1 |
| +1 :green_heart: | javac | 1m 38s | | the patch passed |
| +1 :green_heart: | compile | 1m 28s | | the patch passed with JDK Private Build-1.8.0_265-8u265-b01-0ubuntu2~18.04-b01 |
| +1 :green_heart: | javac | 1m 28s | | the patch passed |
| -0 :warning: | checkstyle | 0m 52s | [/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2360/1/artifact/out/diff-checkstyle-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs-project/hadoop-hdfs: The patch generated 1 new + 21 unchanged - 0 fixed = 22 total (was 21) |
| +1 :green_heart: | mvnsite | 1m 36s | | the patch passed |
| +1 :green_heart: | whitespace | 0m 0s | | The patch has no whitespace issues. |
| +1 :green_heart: | shadedclient | 18m 45s | | patch has no errors when building and testing our client artifacts. |
| +1 :green_heart: | javadoc | 1m 12s | | the patch passed with JDK Ubuntu-11.0.8+10-post-Ubuntu-0ubuntu118.04.1 |
| +1 :green_heart: | javadoc | 2m 11s | | the patch passed with JDK Private Build-1.8.0_265-8u265-b01-0ubuntu2~18.04-b01 |
| +1 :green_heart: | findbugs | 5m 5s | | the patch passed |
|||| _ Other Tests _ |
| -1 :x: | unit | 167m 12s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2360/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch failed. |
| +1 :green_heart: | asflicense | 1m 20s | | The patch does not generate ASF License warnings. |
| | | 281m 3s | | |
| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.TestFileChecksumCompositeCrc |
| | hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFSStriped |
| | hadoop.hdfs.TestFileChecksum |
| | hadoop.hdfs.server.sps.TestExternalStoragePolicySatisfier |
| | hadoop.hdfs.server.namenode.TestSnapshotPathINodes |
| | hadoop.hdfs.TestBlocksScheduledCounter |
| | hadoop.hdfs.server.namenode.ha.TestHAAppend |
| | hadoop.hdfs.TestMultipleNNPortQOP |
| | hadoop.hdfs.server.namenode.ha.TestRetryCacheWithHA |
| | hadoop.hdfs.TestDFSShell |
| | hadoop.hdfs.server.namenode.TestNamenodeStorageDirectives |
| | hadoop.hdfs.server.namenode.ha.TestBootstrapStandby |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2360/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2360 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
| uname | Linux cec23378c81c 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 7d4bcb312bd |
| Default Java | Private Build-1.8.0_265-8u265-b01-0ubuntu2~18.04-b01 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.8+10-post-Ubuntu-0ubuntu118.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_265-8u265-b01-0ubuntu2~18.04-b01 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2360/1/testReport/ |
| Max. process+thread count | 2777 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2360/1/console |
| versions | git=2.17.1 maven=3.6.0 findbugs=4.0.6 |
| Powered by | Apache Yetus 0.13.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
Issue Time Tracking
-------------------
Worklog Id: (was: 495380)
Time Spent: 20m (was: 10m)
> RBF: Router FSCK fails after HDFS-14442
> ---------------------------------------
>
> Key: HDFS-15613
> URL: https://issues.apache.org/jira/browse/HDFS-15613
> Project: Hadoop HDFS
> Issue Type: Sub-task
> Components: rbf
> Affects Versions: 3.3.0
> Environment: HA is enabled
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Major
> Labels: pull-request-available
> Time Spent: 20m
> Remaining Estimate: 0h
>
> After HDFS-14442, fsck uses the getHAServiceState RPC to detect the active
> NameNode; however, DFSRouter does not support that operation.
> {noformat}
> 20/10/05 16:41:30 DEBUG hdfs.HAUtil: Error while connecting to namenode
> org.apache.hadoop.ipc.RemoteException(java.lang.UnsupportedOperationException): Operation "getHAServiceState" is not supported
>     at org.apache.hadoop.hdfs.server.federation.router.RouterRpcServer.checkOperation(RouterRpcServer.java:488)
>     at org.apache.hadoop.hdfs.server.federation.router.RouterClientProtocol.getHAServiceState(RouterClientProtocol.java:1773)
>     at org.apache.hadoop.hdfs.server.federation.router.RouterRpcServer.getHAServiceState(RouterRpcServer.java:1333)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getHAServiceState(ClientNamenodeProtocolServerSideTranslatorPB.java:2011)
>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:532)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1070)
>     at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1020)
>     at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:948)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1845)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2952)
>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1562)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1508)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1405)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:234)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:119)
>     at com.sun.proxy.$Proxy12.getHAServiceState(Unknown Source)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getHAServiceState(ClientNamenodeProtocolTranslatorPB.java:2055)
>     at org.apache.hadoop.hdfs.HAUtil.getAddressOfActive(HAUtil.java:281)
>     at org.apache.hadoop.hdfs.tools.DFSck.getCurrentNamenodeAddress(DFSck.java:271)
>     at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:339)
>     at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:75)
>     at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:164)
>     at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:161)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1845)
>     at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:160)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>     at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:409)
> {noformat}
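The failure mode in the stack trace above can be modeled in a few lines of plain Java. This is a self-contained sketch, not the real Hadoop code: the class names echo RouterRpcServer and DFSck, but their bodies are illustrative stand-ins, and the client-side fallback shown is a hypothetical mitigation (where the actual fix lands, router side or client side, is determined by the PR).

```java
// Sketch of why "hdfs fsck" fails through DFSRouter after HDFS-14442:
// the fsck client calls getHAServiceState() to find the active NameNode,
// but the router rejects the unsupported operation with an exception.

public class RouterFsckSketch {
    enum HAServiceState { ACTIVE, STANDBY }

    // Stand-in for the router's RPC server: getHAServiceState is not
    // among the operations it implements, so the call throws (mirroring
    // RouterRpcServer.checkOperation at RouterRpcServer.java:488).
    static class RouterRpcServer {
        HAServiceState getHAServiceState() {
            throw new UnsupportedOperationException(
                "Operation \"getHAServiceState\" is not supported");
        }
    }

    // Stand-in for DFSck's active-NameNode lookup. Before any fix, the
    // exception propagated and fsck aborted; catching it and falling back
    // to a default address is one hypothetical way to tolerate routers.
    static String resolveNamenode(RouterRpcServer router, String defaultAddr) {
        try {
            HAServiceState state = router.getHAServiceState();
            return state == HAServiceState.ACTIVE ? "active-nn" : defaultAddr;
        } catch (UnsupportedOperationException e) {
            return defaultAddr;
        }
    }

    public static void main(String[] args) {
        // The router throws, so the lookup falls back to the given address.
        System.out.println(resolveNamenode(new RouterRpcServer(), "router:8888"));
    }
}
```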
--
This message was sent by Atlassian Jira
(v8.3.4#803005)