[
https://issues.apache.org/jira/browse/HADOOP-13836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15705614#comment-15705614
]
kartheek muthyala commented on HADOOP-13836:
--------------------------------------------
[~drankye], very good questions. Here are some of my responses; correct me if I am wrong. I will soon post a design doc with all these details.
What are the scenarios, requirements and use cases you have in mind for this support (other than Kerberos)?
- Avoiding man-in-the-middle attacks through a proper SSL connection handshake before any data is exchanged.
- Stronger encryption over the wire.
- Ability to get trusted third-party validation through VeriSign, GoDaddy, etc., which improves industrial adoption for sensitive data exchange.
- We can extend the same cipher suite to encrypt data in flight and at rest.
What interfaces will be taken care of by this: RPC/commands, REST, web, JDBC, etc.?
- For now we are supporting interfaces that derive from the hadoop.ipc.Server and hadoop.ipc.Client classes, so primarily RPC.
How authentication will be considered? Still simple or some mechanisms over
SSL/TLS?
- Today we enable the client to authenticate with the server on connection, so configuring a keystore is a must on the server. With a configured KeyManager, we can decide which authentication credentials should be sent to the remote host during the SSL handshake.
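To make the KeyManager/handshake flow concrete, here is a minimal standalone JSSE sketch. The class and method names are mine, not from the patch, and it uses empty in-memory keystores so it runs by itself; in Hadoop the stores would be loaded from the paths configured in ssl-server.xml.

```java
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.security.KeyStore;

public class SslContextSketch {

    // Hypothetical helper: builds an SSLContext from keystore/truststore material.
    static SSLContext buildContext() throws Exception {
        // Empty in-memory store stands in for the server keystore.
        KeyStore keyStore = KeyStore.getInstance(KeyStore.getDefaultType());
        keyStore.load(null, null);

        // The KeyManager decides which credentials (certificate/key) are
        // presented to the remote host during the handshake.
        KeyManagerFactory kmf =
            KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(keyStore, new char[0]);

        // Empty in-memory store stands in for the truststore; the TrustManager
        // validates the peer's certificate chain.
        KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
        trustStore.load(null, null);
        TrustManagerFactory tmf =
            TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(trustStore);

        SSLContext ctx = SSLContext.getInstance("TLSv1.2");
        ctx.init(kmf.getKeyManagers(), tmf.getTrustManagers(), null);
        return ctx;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(buildContext().getProtocol());
    }
}
```

Sockets created from such a context perform the full certificate exchange before any RPC payload flows, which is what gives the man-in-the-middle protection mentioned above.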
How would you manage credentials (X.509 certificates) for Hadoop services and
maybe clients?
- The current work requires both server and client to be installed with a keystore and truststore, configured through ssl-client.xml and ssl-server.xml.
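For illustration, a client-side configuration could look like the fragment below. The paths and passwords are placeholders, and the exact property names are an assumption based on Hadoop's existing ssl-client.xml conventions, not taken from the patch.

```xml
<configuration>
  <property>
    <name>ssl.client.keystore.location</name>
    <value>/etc/hadoop/conf/client.jks</value>
  </property>
  <property>
    <name>ssl.client.keystore.password</name>
    <value>changeit</value>
  </property>
  <property>
    <name>ssl.client.truststore.location</name>
    <value>/etc/hadoop/conf/truststore.jks</value>
  </property>
  <property>
    <name>ssl.client.truststore.password</name>
    <value>changeit</value>
  </property>
</configuration>
```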
What are the exact SSL/TLS versions to support, and how are they configured along with the cipher suite options?
- Currently we support TLSv1.2 as the default. Because the cipher suite hasn't changed between TLSv1 and TLSv1.2, TLSv1 should also be supported. Given that the SSLContext varies across SSL/TLS versions, we can provide an interface for deriving the SSLContext from the configured TLS/SSL version.
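The version-selection interface could be as thin as the sketch below: map a configured protocol string onto the matching SSLContext. The class name and the idea of reading the version from configuration are my assumptions; JSSE itself accepts "TLSv1", "TLSv1.1" and "TLSv1.2" as protocol names.

```java
import javax.net.ssl.SSLContext;
import java.security.NoSuchAlgorithmException;

public class ProtocolSelector {

    // Hypothetical helper: derive an (uninitialized) SSLContext for the
    // SSL/TLS version named in configuration, e.g. "TLSv1" or "TLSv1.2".
    static SSLContext contextFor(String configuredProtocol)
            throws NoSuchAlgorithmException {
        return SSLContext.getInstance(configuredProtocol);
    }

    public static void main(String[] args) throws Exception {
        // Each supported version resolves to its own context.
        System.out.println(contextFor("TLSv1.2").getProtocol());
        System.out.println(contextFor("TLSv1").getProtocol());
    }
}
```

The returned context still needs init() with the KeyManagers and TrustManagers from the keystore/truststore configuration before sockets can be created from it.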
> Securing Hadoop RPC using SSL
> -----------------------------
>
> Key: HADOOP-13836
> URL: https://issues.apache.org/jira/browse/HADOOP-13836
> Project: Hadoop Common
> Issue Type: New Feature
> Components: ipc
> Reporter: kartheek muthyala
> Assignee: kartheek muthyala
> Attachments: HADOOP-13836.patch
>
>
> Today, RPC connections in Hadoop are encrypted using the Simple Authentication
> and Security Layer (SASL), with Kerberos ticket-based authentication or
> DIGEST-MD5 checksum-based authentication. This proposal is about enhancing
> this cipher suite with SSL/TLS-based encryption and authentication.
> SSL/TLS is an Internet Engineering Task Force (IETF) standard that provides
> data security and integrity between two endpoints in a network. The protocol
> has made its way into a number of applications such as web browsing, email,
> internet faxing, messaging, VoIP, etc. Supporting this cipher suite at the
> core of Hadoop would give good synergy with the applications on top and also
> bolster industry adoption of Hadoop.
> The Server and Client code in Hadoop IPC should support the following modes
> of communication
> 1. Plain
> 2. SASL encryption with an underlying authentication
> 3. SSL based encryption and authentication (x509 certificate)