[
https://issues.apache.org/jira/browse/HDFS-16154?focusedWorklogId=634660&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-634660
]
ASF GitHub Bot logged work on HDFS-16154:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 05/Aug/21 16:32
Start Date: 05/Aug/21 16:32
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #3270:
URL: https://github.com/apache/hadoop/pull/3270#issuecomment-893598141
:confetti_ball: **+1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 45s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 1 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 30m 52s | | trunk passed |
| +1 :green_heart: | compile | 1m 23s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | compile | 1m 17s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | checkstyle | 1m 0s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 24s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 58s | | trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 1m 30s | | trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 3m 13s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 19s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 11s | | the patch passed |
| +1 :green_heart: | compile | 1m 15s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javac | 1m 15s | | the patch passed |
| +1 :green_heart: | compile | 1m 10s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | javac | 1m 10s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 51s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 14s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 46s | | the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 |
| +1 :green_heart: | javadoc | 1m 25s | | the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| +1 :green_heart: | spotbugs | 3m 9s | | the patch passed |
| +1 :green_heart: | shadedclient | 16m 13s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 231m 29s | | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 43s | | The patch does not generate ASF License warnings. |
| | | 315m 58s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3270/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/3270 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux a3264e739081 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 193b21b7e2376b68f6fa151a2c6078f1dbde8e9a |
| Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3270/1/testReport/ |
| Max. process+thread count | 3339 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3270/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
Issue Time Tracking
-------------------
Worklog Id: (was: 634660)
Time Spent: 20m (was: 10m)
> TestMiniJournalCluster failing intermittently because of not resetting
> UserGroupInformation completely
> -----------------------------------------------------------------------------------------------------
>
> Key: HDFS-16154
> URL: https://issues.apache.org/jira/browse/HDFS-16154
> Project: Hadoop HDFS
> Issue Type: Improvement
> Reporter: wangzhaohui
> Assignee: wangzhaohui
> Priority: Minor
> Labels: patch-available, pull-request-available
> Attachments: HDFS-16154-001.patch
>
> Time Spent: 20m
> Remaining Estimate: 0h
>
> When I run the unit tests under org.apache.hadoop.hdfs.qjournal together in
> IDEA, many of them fail. I found that they all fail for the same reason:
> {code:java}
> java.io.IOException: Running in secure mode, but config doesn't have a keytab
>   at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:308)
>   at org.apache.hadoop.hdfs.qjournal.server.JournalNode.start(JournalNode.java:230)
>   at org.apache.hadoop.hdfs.qjournal.MiniJournalCluster.<init>(MiniJournalCluster.java:121)
>   at org.apache.hadoop.hdfs.qjournal.MiniJournalCluster.<init>(MiniJournalCluster.java:48)
>   at org.apache.hadoop.hdfs.qjournal.MiniJournalCluster$Builder.build(MiniJournalCluster.java:80)
>   at org.apache.hadoop.hdfs.qjournal.TestMiniJournalCluster.testStartStop(TestMiniJournalCluster.java:38)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>   at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>   at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>   at org.junit.runner.JUnitCore.run(JUnitCore.java:115)
>   at org.junit.vintage.engine.execution.RunnerExecutor.execute(RunnerExecutor.java:40)
>   at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
>   at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
>   at java.util.Iterator.forEachRemaining(Iterator.java:116)
>   at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
>   at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
>   at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
>   at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
>   at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
>   at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
>   at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
>   at org.junit.vintage.engine.VintageTestEngine.executeAllChildren(VintageTestEngine.java:80)
>   at org.junit.vintage.engine.VintageTestEngine.execute(VintageTestEngine.java:71)
>   at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:229)
>   at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$6(DefaultLauncher.java:197)
>   at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:211)
>   at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:191)
>   at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:128)
>   at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:71)
>   at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
>   at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:220)
>   at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:53)
> {code}
> TestSecureNNWithQJM does not reset the UGI, so the other unit tests that run
> after it in the same JVM throw this {{IOException}}.
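For context, the failure mode comes from UserGroupInformation keeping static, JVM-wide security state: once a Kerberos-enabled test calls UserGroupInformation.setConfiguration with KERBEROS settings, every later test in the same JVM is treated as running in secure mode and fails when no keytab is configured. The sketch below shows one way a secure test could restore SIMPLE authentication in a JUnit 4 teardown; the class name and the placement in an @After hook are illustrative assumptions, not necessarily what the actual HDFS-16154 patch does.
{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.SecurityUtil;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.UserGroupInformation.AuthenticationMethod;
import org.junit.After;

// Hypothetical helper for illustration only; the real fix may live directly
// in TestSecureNNWithQJM's teardown.
public class ResetUgiAfterSecureTest {

  @After
  public void resetUgi() {
    // Drop the cached login user and the static Kerberos settings that the
    // secure test installed in this JVM.
    UserGroupInformation.reset();

    // Push a fresh configuration with SIMPLE authentication so that later
    // tests (e.g. TestMiniJournalCluster starting a MiniJournalCluster) no
    // longer require a keytab.
    Configuration conf = new Configuration();
    SecurityUtil.setAuthenticationMethod(AuthenticationMethod.SIMPLE, conf);
    UserGroupInformation.setConfiguration(conf);
  }
}
{code}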