Giving the forgotten reference: [1] https://github.com/apache/directory-kerby
-----Original Message-----
From: Zheng, Kai [mailto:kai.zh...@intel.com]
Sent: Friday, January 29, 2016 9:10 AM
To: hdfs-dev@hadoop.apache.org
Subject: RE: Hadoop encryption module as Apache Chimera incubator project

Sounds good to have further discussions. Mind if I ask some questions? Thanks.

@Haifeng: Thanks Uma & Haifeng for your answers about how to scope and position Chimera. It sounds good to me. So I guess we would prefer a generic project name like Chimera so that the project is not tightly coupled with the AES encryption work? Would this new project also consider more general efforts that are not yet in the big-data scope? I mean, within the ASF there are various security-related projects, and many projects that rely on or heavily use security facilities. Since Chimera can focus on providing high-performance security libraries and facilities, it would be good if those projects could benefit from Chimera as well, not just Hadoop and Spark.

If Chimera resides in Hadoop, I personally wish it could be independent of the main codebase in both source layout and dependency relationships. That is, if another security project wants to use Chimera, it should not have to depend on Hadoop modules such as hadoop-common. Otherwise things will get messy, because at some point in the future Hadoop may itself leverage these security projects to enhance its security. For example, Apache Kerby [1], which I'd like to mention, provides almost all the Kerberos encryption types compatible with MIT KDC, but its underlying encryption ciphers come mainly from the JRE and could be optimized using Chimera.

I understand Hadoop-specific security issues should go to security@hadoop. What about the general ones? I know there is an ASF-wide mailing list for security matters.

This project may support other platforms like Windows. Will Chimera bundle native libraries like OpenSSL in the JAR?
As I went through the discussions in HADOOP-11127, guided by Chris, it looks like one challenge will be how to build and bundle the various native libraries for the supported platforms with versioning in mind, wherever the project is hosted. Maybe another option: have Chimera as a separate project, like Yetus? It could still be managed by the committee. :)

Thanks for your answers.

Regards,
Kai

-----Original Message-----
From: Gangumalla, Uma [mailto:uma.ganguma...@intel.com]
Sent: Thursday, January 28, 2016 4:08 AM
To: hdfs-dev@hadoop.apache.org
Subject: Re: Hadoop encryption module as Apache Chimera incubator project

Thanks for the inputs, Owen.

On 1/27/16, 11:31 AM, "Owen O'Malley" <omal...@apache.org> wrote:

>On Wed, Jan 27, 2016 at 9:59 AM, Gangumalla, Uma
><uma.ganguma...@intel.com> wrote:
>
>> I think Chimera's goal is to enhance even other use cases.
>
>Naturally.
>
>> For Hadoop, CTR mode should be enough today,
>
>This isn't true. Hadoop should use better encryption for RPC and
>shuffle, both of which should not use CTR.

|| Yes, and I said later that Hadoop could use other options too.

>> I think a separate module and
>> independent release is a good idea, but I am not so strong on the point
>> of keeping it under Hadoop.
>
>I believe encryption is becoming a core part of Hadoop. I think that
>moving core components out of Hadoop is bad from a project management
>perspective. To put it another way, a bug in the encryption routines will
>likely become a security problem that security@hadoop needs to hear
>about. I don't think adding a separate project in the middle of that
>communication chain is a good idea. The same applies to data corruption
>problems, and so on...

|| I agree that security-related discussion deserves a separate thread.
|| Thanks for this point.

>> It may be good to keep it at a generalized place (as in the discussion,
>> we thought that place could be Apache Commons).
>Apache Commons is a collection of *Java* projects, so Chimera as a
>JNI-based library isn't a natural fit. Furthermore, Apache Commons
>doesn't have its own security list, so problems will go to the generic
>secur...@apache.org.

|| I see some projects including native stuff too, for example Commons Daemon.
|| But, yeah, I notice now that Apache Commons proper is intended for
|| reusable Java sources.

>Why do you think that Apache Commons is a better home than Hadoop?
>
>.. Owen

@ATM, Andrew, Chris, Yi: do you want to comment on this proposal?

Regards,
Uma
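[Editor's note on the CTR discussion above: AES-CTR suits HDFS data at rest because the keystream position is computable from the byte offset, which allows random reads, but CTR provides no integrity protection, which is one reason authenticated modes are preferable for RPC and shuffle. A minimal sketch of AES/CTR through the standard JCE API, the baseline that Chimera-style native libraries accelerate; the class and variable names are illustrative only, not Chimera's actual API.]

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class CtrDemo {
    // Runs AES in CTR mode; the same routine encrypts and decrypts
    // because CTR just XORs data with the AES-generated keystream.
    public static byte[] ctr(int mode, byte[] key, byte[] iv, byte[] data)
            throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
        cipher.init(mode, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
        return cipher.doFinal(data);
    }

    public static void main(String[] args) throws Exception {
        byte[] key = new byte[16];          // AES-128 key
        byte[] iv = new byte[16];           // counter block / IV
        new SecureRandom().nextBytes(key);
        new SecureRandom().nextBytes(iv);

        byte[] plain = "hdfs block data".getBytes(StandardCharsets.UTF_8);
        byte[] enc = ctr(Cipher.ENCRYPT_MODE, key, iv, plain);
        byte[] dec = ctr(Cipher.DECRYPT_MODE, key, iv, enc);

        // Round trip succeeds, but note: nothing here detects tampering
        // with enc, which is why CTR alone is a poor fit for RPC/shuffle.
        System.out.println(Arrays.equals(plain, dec));
    }
}
```

Note the trade-off the code makes visible: decryption always "succeeds" in CTR, even on corrupted ciphertext, so integrity must come from elsewhere (e.g. an authenticated mode or a separate MAC).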
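[Editor's note on Kai's question about bundling native libraries such as OpenSSL in the JAR: a common pattern in JNI-based projects (not necessarily what Chimera settled on) is to ship the platform-specific shared library as a JAR resource, extract it to a temporary file at runtime, and load it with System.load(), falling back to a pure-Java path if the library is missing. A hedged sketch; the resource path and class name below are hypothetical.]

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class NativeLoader {
    // Hypothetical resource path; the real layout (per-OS/arch directories,
    // versioned names) is exactly the packaging question raised in the thread.
    private static final String RESOURCE = "/native/linux-x86_64/libchimera.so";

    // Extracts the bundled shared library to a temp file and loads it.
    // Throws IOException if the JAR does not carry a library for this platform.
    public static void load() throws IOException {
        try (InputStream in = NativeLoader.class.getResourceAsStream(RESOURCE)) {
            if (in == null) {
                throw new IOException("Native library not bundled: " + RESOURCE);
            }
            Path tmp = Files.createTempFile("libchimera", ".so");
            tmp.toFile().deleteOnExit();
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            System.load(tmp.toAbsolutePath().toString());
        }
    }

    public static void main(String[] args) {
        try {
            load();
            System.out.println("native crypto loaded");
        } catch (IOException e) {
            // Typical fallback: use the (slower) JCE implementation instead.
            System.out.println("falling back to pure-Java crypto: " + e.getMessage());
        }
    }
}
```

The versioning concern Kai raises shows up here directly: each supported platform needs its own resource path and a compatible OpenSSL, which is why bundling strategy was a central question for wherever Chimera ends up hosted.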