[ https://issues.apache.org/jira/browse/HADOOP-2239?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12569029#action_12569029 ]
Doug Cutting commented on HADOOP-2239:
--------------------------------------
This can be triggered when an "hsftp:" uri is used.
The following would be required to implement this (sketched roughly below):
# Extend the namenode and datanode http servers to respond to https connections, adding new configuration properties to name the https ports, and configuring Jetty accordingly (adding a Jetty SslListener in our StatusHttpServer, specifying certs, etc.).
# Change HftpFileSystem to use https connections when the "hsftp" scheme is used.
# Change FileDataServlet to redirect to https when https is used to access it.
Does that sound right?
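To make that concrete, a very rough sketch of the three pieces (not a patch; the Jetty 5 SslListener setters are from memory, and the method names, "/streamFile" path and "dfs.https.port" property below are made-up placeholders):

{code}
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URI;
import java.net.URL;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.mortbay.http.HttpServer;
import org.mortbay.http.SslListener;

public class HsftpSketch {

  // 1. StatusHttpServer: add an https listener next to the existing http one.
  public static void addSslListener(HttpServer webServer, int httpsPort,
                                    String keystore, String password)
      throws Exception {
    SslListener ssl = new SslListener();
    ssl.setPort(httpsPort);        // from a new property, e.g. "dfs.https.port" (name made up)
    ssl.setKeystore(keystore);     // server certificate keystore
    ssl.setPassword(password);     // keystore password
    ssl.setKeyPassword(password);  // key password (often the same)
    webServer.addListener(ssl);
  }

  // 2. HftpFileSystem: pick http or https from the filesystem scheme.
  public static HttpURLConnection openConnection(URI fsUri, String path)
      throws IOException {
    String scheme = "hsftp".equals(fsUri.getScheme()) ? "https" : "http";
    URL url = new URL(scheme, fsUri.getHost(), fsUri.getPort(), path);
    return (HttpURLConnection) url.openConnection();
  }

  // 3. FileDataServlet: keep the scheme when redirecting to a datanode.
  public static void redirect(HttpServletRequest request,
                              HttpServletResponse response,
                              String datanodeHost, int httpPort, int httpsPort,
                              String filePath) throws IOException {
    String target = request.isSecure()
        ? "https://" + datanodeHost + ":" + httpsPort + "/streamFile" + filePath
        : "http://" + datanodeHost + ":" + httpPort + "/streamFile" + filePath;
    response.sendRedirect(target);
  }
}
{code}

Either way the hsftp client keeps talking to the same servlets as hftp; only the transport and the ports change.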
> Security: Need to be able to encrypt Hadoop socket connections
> ---------------------------------------------------------------
>
> Key: HADOOP-2239
> URL: https://issues.apache.org/jira/browse/HADOOP-2239
> Project: Hadoop Core
> Issue Type: Bug
> Components: dfs
> Reporter: Allen Wittenauer
>
> We need to be able to use hadoop over hostile networks, both internally and
> externally to the enterprise. While authentication prevents unauthorized
> access, encryption should be used to prevent such things as packet snooping
> across the wire. This means that hadoop client connections, distcp, etc.,
> would use something such as SSL to protect the TCP/IP packets.
> Post-Kerberos, it would be useful to use something similar to NFS's krb5p
> option.