[ https://issues.apache.org/jira/browse/HDFS-16004?focusedWorklogId=591409&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-591409 ]
ASF GitHub Bot logged work on HDFS-16004:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 30/Apr/21 09:01
Start Date: 30/Apr/21 09:01
Worklog Time Spent: 10m
Work Description: hadoop-yetus commented on pull request #2966:
URL: https://github.com/apache/hadoop/pull/2966#issuecomment-829952357
:broken_heart: **-1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 32s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| -1 :x: | test4tests | 0m 0s | | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 34m 38s | | trunk passed |
| +1 :green_heart: | compile | 1m 21s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | compile | 1m 16s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | checkstyle | 1m 1s | | trunk passed |
| +1 :green_heart: | mvnsite | 1m 24s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 55s | | trunk passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 26s | | trunk passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 3m 2s | | trunk passed |
| +1 :green_heart: | shadedclient | 16m 1s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 1m 13s | | the patch passed |
| +1 :green_heart: | compile | 1m 11s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javac | 1m 11s | | the patch passed |
| +1 :green_heart: | compile | 1m 4s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | javac | 1m 4s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| +1 :green_heart: | checkstyle | 0m 52s | | the patch passed |
| +1 :green_heart: | mvnsite | 1m 11s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 45s | | the patch passed with JDK Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 |
| +1 :green_heart: | javadoc | 1m 19s | | the patch passed with JDK Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| +1 :green_heart: | spotbugs | 3m 7s | | the patch passed |
| +1 :green_heart: | shadedclient | 16m 4s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| -1 :x: | unit | 239m 55s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2966/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) | hadoop-hdfs in the patch passed. |
| +1 :green_heart: | asflicense | 0m 39s | | The patch does not generate ASF License warnings. |
| | | 326m 50s | | |
| Reason | Tests |
|-------:|:------|
| Failed junit tests | hadoop.hdfs.server.datanode.TestDirectoryScanner |
| | hadoop.hdfs.TestReconstructStripedFileWithValidator |
| | hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks |
| | hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys |
| | hadoop.hdfs.TestMultipleNNPortQOP |
| | hadoop.hdfs.tools.offlineEditsViewer.TestOfflineEditsViewer |
| | hadoop.hdfs.server.namenode.snapshot.TestNestedSnapshots |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2966/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/2966 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell |
| uname | Linux 7df22da5b480 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 17a8bee6b558d7fdbef03307279d92f0e0df7fa5 |
| Default Java | Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.10+9-Ubuntu-0ubuntu1.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_282-8u282-b08-0ubuntu1~20.04-b08 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2966/1/testReport/ |
| Max. process+thread count | 3012 (vs. ulimit of 5500) |
| modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-2966/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
This message was automatically generated.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 591409)
Time Spent: 20m (was: 10m)
> startLogSegment and journal in BackupNode lack Permission check.
> ----------------------------------------------------------------
>
> Key: HDFS-16004
> URL: https://issues.apache.org/jira/browse/HDFS-16004
> Project: Hadoop HDFS
> Issue Type: Bug
> Reporter: lujie
> Priority: Critical
> Labels: pull-request-available
> Time Spent: 20m
> Remaining Estimate: 0h
>
> I have some doubts about configuring secure HDFS. I know we have Service
> Level Authorization for protocols like NamenodeProtocol, DatanodeProtocol,
> and so on.
> But I cannot find such Authorization for JournalProtocol after reading the
> code in HDFSPolicyProvider. And if we do have it, how can I configure such
> Authorization?
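> For context, the pattern I expected to find is roughly the one sketched
> below. This is only an illustration: the class name ExamplePolicyProvider
> and the key "security.journal.protocol.acl" are hypothetical, showing what
> a JournalProtocol entry might look like, while
> "security.namenode.protocol.acl" is the existing key for NamenodeProtocol.
>
> import org.apache.hadoop.hdfs.server.protocol.JournalProtocol;
> import org.apache.hadoop.hdfs.server.protocol.NamenodeProtocol;
> import org.apache.hadoop.security.authorize.PolicyProvider;
> import org.apache.hadoop.security.authorize.Service;
>
> // Sketch of how a PolicyProvider maps protocols to service-level ACL keys,
> // which can then be restricted in hadoop-policy.xml once
> // hadoop.security.authorization=true is set in core-site.xml.
> public class ExamplePolicyProvider extends PolicyProvider {
>   private static final Service[] SERVICES = new Service[] {
>     new Service("security.namenode.protocol.acl", NamenodeProtocol.class),
>     // Hypothetical entry, shown only for illustration.
>     new Service("security.journal.protocol.acl", JournalProtocol.class)
>   };
>
>   @Override
>   public Service[] getServices() {
>     return SERVICES;
>   }
> }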
>
> Besides, even though NamenodeProtocol has Service Level Authorization, its
> methods still perform a permission check. Take startCheckpoint in
> NameNodeRpcServer, which implements NamenodeProtocol, for example:
>
> public NamenodeCommand startCheckpoint(NamenodeRegistration registration)
>     throws IOException {
>   String operationName = "startCheckpoint";
>   checkNNStartup();
>   namesystem.checkSuperuserPrivilege(operationName);
>   ......
>
> I found that the methods in BackupNodeRpcServer, which implements
> JournalProtocol, lack such a permission check. See below:
>
>
> @Override
> public void startLogSegment(JournalInfo journalInfo, long epoch,
>     long txid) throws IOException {
>   namesystem.checkOperation(OperationCategory.JOURNAL);
>   verifyJournalRequest(journalInfo);
>   getBNImage().namenodeStartedLogSegment(txid);
> }
>
> @Override
> public void journal(JournalInfo journalInfo, long epoch, long firstTxId,
>     int numTxns, byte[] records) throws IOException {
>   namesystem.checkOperation(OperationCategory.JOURNAL);
>   verifyJournalRequest(journalInfo);
>   getBNImage().journal(firstTxId, numTxns, records);
> }
>
> Do we need to add a permission check for them?
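>
> A minimal sketch of what such a check might look like for startLogSegment,
> mirroring the startCheckpoint pattern quoted above (journal would get the
> same line). This is only an illustration of the idea, not an actual patch,
> and it assumes BackupNodeRpcServer can call
> namesystem.checkSuperuserPrivilege the same way NameNodeRpcServer does:
>
> @Override
> public void startLogSegment(JournalInfo journalInfo, long epoch,
>     long txid) throws IOException {
>   String operationName = "startLogSegment";
>   // Assumed addition: reject callers that are not HDFS superusers,
>   // as NameNodeRpcServer#startCheckpoint does.
>   namesystem.checkSuperuserPrivilege(operationName);
>   namesystem.checkOperation(OperationCategory.JOURNAL);
>   verifyJournalRequest(journalInfo);
>   getBNImage().namenodeStartedLogSegment(txid);
> }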
>
> Please point out my mistakes if I am wrong or have missed something.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]