[ https://issues.apache.org/jira/browse/HADOOP-10310?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13887744#comment-13887744 ]
Hudson commented on HADOOP-10310:
---------------------------------
SUCCESS: Integrated in Hadoop-Hdfs-trunk #1659 (See [https://builds.apache.org/job/Hadoop-Hdfs-trunk/1659/])
HADOOP-10310. SaslRpcServer should be initialized even when no secret manager present. Contributed by Aaron T. Myers. (atm: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1562863)
* /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
* /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
> SaslRpcServer should be initialized even when no secret manager present
> -----------------------------------------------------------------------
>
> Key: HADOOP-10310
> URL: https://issues.apache.org/jira/browse/HADOOP-10310
> Project: Hadoop Common
> Issue Type: Bug
> Components: security
> Affects Versions: 2.3.0
> Reporter: Aaron T. Myers
> Assignee: Aaron T. Myers
> Priority: Blocker
> Fix For: 2.3.0
>
> Attachments: HADOOP-10310.patch
>
>
> HADOOP-8783 made a change that caused the SaslRpcServer not to be
> initialized when no secret manager is present. This works fine for most
> Hadoop daemons, since they need a secret manager to do their work, but
> JournalNodes do not. As a result, JournalNodes are broken and will not
> handle RPCs in a Kerberos-enabled environment, because the SaslRpcServer
> is never initialized.
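To illustrate the failure mode described above, here is a minimal, self-contained Java sketch of the conditional-initialization pattern. It is not the actual Server.java code or the committed patch; the class, field, and method names are hypothetical stand-ins for Hadoop internals such as SaslRpcServer.init() and UserGroupInformation.isSecurityEnabled().

  // Illustrative sketch only -- not the HADOOP-10310 patch itself.
  // It mimics a SASL setup that is skipped when no secret manager is
  // configured, versus one that also runs whenever security is enabled.
  public class RpcServerSketch {

    interface SecretManager { }          // stand-in for a Hadoop secret manager
    static boolean securityEnabled;      // stand-in for "Kerberos security is on"
    static boolean saslInitialized;

    static void initSasl() {             // stand-in for SASL server initialization
      saslInitialized = true;
    }

    // Buggy behavior: SASL is only initialized when a secret manager exists,
    // so a secure daemon without one (e.g. a JournalNode) never sets up SASL.
    static void startBuggy(SecretManager secretManager) {
      if (secretManager != null) {
        initSasl();
      }
    }

    // Fixed behavior: also initialize SASL whenever security is enabled,
    // even if the daemon has no secret manager.
    static void startFixed(SecretManager secretManager) {
      if (secretManager != null || securityEnabled) {
        initSasl();
      }
    }

    public static void main(String[] args) {
      securityEnabled = true;            // simulate a Kerberos-enabled cluster

      startBuggy(null);                  // JournalNode-like daemon: no secret manager
      System.out.println("buggy path, SASL initialized: " + saslInitialized);   // false

      saslInitialized = false;
      startFixed(null);
      System.out.println("fixed path, SASL initialized: " + saslInitialized);   // true
    }
  }

Under these assumptions, the buggy path leaves SASL uninitialized for a daemon that has no secret manager, so its RPCs fail in a secure cluster, while the fixed path initializes SASL whenever security is enabled.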