[
https://issues.apache.org/jira/browse/HDFS-5143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13821631#comment-13821631
]
Hadoop QA commented on HDFS-5143:
---------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest
attachment
http://issues.apache.org/jira/secure/attachment/12613629/CryptographicFileSystem.patch
against trunk revision .
{color:green}+1 @author{color}. The patch does not contain any @author
tags.
{color:green}+1 tests included{color}. The patch appears to include 1 new
or modified test files.
{color:green}+1 javac{color}. The applied patch does not increase the
total number of javac compiler warnings.
{color:green}+1 javadoc{color}. The javadoc tool did not generate any
warning messages.
{color:green}+1 eclipse:eclipse{color}. The patch built with
eclipse:eclipse.
{color:green}+1 findbugs{color}. The patch does not introduce any new
Findbugs (version 1.3.9) warnings.
{color:green}+1 release audit{color}. The applied patch does not increase
the total number of release audit warnings.
{color:red}-1 core tests{color}. The patch failed these unit tests in
hadoop-common-project/hadoop-common hadoop-common-project/hadoop-crypto
hadoop-dist hadoop-hdfs-project/hadoop-hdfs:
org.apache.hadoop.conf.TestConfiguration
{color:green}+1 contrib tests{color}. The patch passed contrib unit tests.
Test results:
https://builds.apache.org/job/PreCommit-HDFS-Build/5422//testReport/
Console output: https://builds.apache.org/job/PreCommit-HDFS-Build/5422//console
This message is automatically generated.
> Hadoop cryptographic file system
> --------------------------------
>
> Key: HDFS-5143
> URL: https://issues.apache.org/jira/browse/HDFS-5143
> Project: Hadoop HDFS
> Issue Type: New Feature
> Components: security
> Affects Versions: 3.0.0
> Reporter: Yi Liu
> Assignee: Owen O'Malley
> Labels: rhino
> Fix For: 3.0.0
>
> Attachments: CryptographicFileSystem.patch, HADOOP cryptographic file
> system.pdf
>
>
> There is an increasing need to secure data when Hadoop customers use
> various upper-layer applications such as MapReduce, Hive, Pig, and HBase.
> HADOOP CFS (HADOOP Cryptographic File System) secures data by building on
> HADOOP's “FilterFileSystem” to decorate DFS or other file systems, and it is
> transparent to upper-layer applications. It is configurable, scalable, and fast.
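The decoration idea above, where applications see only plain streams while a wrapper transparently encrypts and decrypts, can be sketched with the JDK's own stream decorators. This is a toy stand-in for the FilterFileSystem approach, not the patch's actual code; the class and method names (`DecoratorDemo`, `roundTrip`) and the AES/CTR choice with a fixed key and IV are illustrative assumptions only.

```java
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.CipherOutputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class DecoratorDemo {
    // Demo-only fixed key and IV; a real system would use a key management framework.
    static final SecretKeySpec KEY =
            new SecretKeySpec("0123456789abcdef".getBytes(StandardCharsets.US_ASCII), "AES");
    static final IvParameterSpec IV = new IvParameterSpec(new byte[16]);

    // Round-trip a message through an encrypting write path and a decrypting
    // read path; the caller only ever touches plain OutputStream/InputStream.
    static String roundTrip(String msg) throws Exception {
        ByteArrayOutputStream store = new ByteArrayOutputStream(); // stands in for the wrapped FS

        Cipher enc = Cipher.getInstance("AES/CTR/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, KEY, IV);
        try (OutputStream out = new CipherOutputStream(store, enc)) {
            out.write(msg.getBytes(StandardCharsets.US_ASCII)); // ciphertext lands in store
        }

        Cipher dec = Cipher.getInstance("AES/CTR/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, KEY, IV);
        try (InputStream in = new CipherInputStream(
                new ByteArrayInputStream(store.toByteArray()), dec)) {
            return new String(in.readAllBytes(), StandardCharsets.US_ASCII);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("hello, transparent encryption")); // prints the original text
    }
}
```

The point of the sketch is that neither the writer nor the reader changes its code when encryption is switched on; only the stream it is handed differs, which is the same property CFS aims for at the FileSystem level.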
> High level requirements:
> 1. Transparent to upper-layer applications; no modification to them is
> required.
> 2. “Seek” and “PositionedReadable” are supported on the CFS input stream
> if the wrapped file system supports them.
> 3. Very high encryption and decryption performance, so that they do not
> become a bottleneck.
> 4. Can decorate HDFS and all other file systems in Hadoop without
> modifying their existing structure (for example, the namenode and datanode
> structure when the wrapped file system is HDFS).
> 5. Admins can configure encryption policies, such as which directories
> are encrypted.
> 6. A robust key management framework.
> 7. Support pread and append operations if the wrapped file system
> supports them.
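Requirement 2 (seek and positioned reads on an encrypted stream) is the reason a counter-based cipher mode such as AES/CTR is a natural fit: the counter block for any byte offset can be computed directly, so decryption can start mid-file without reading from the beginning. A minimal sketch under that assumption follows; `CtrSeekDemo`, `ivForOffset`, and `decryptFrom` are illustrative names, not taken from the attached patch.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class CtrSeekDemo {
    static final int BLOCK = 16; // AES block size in bytes

    // Counter block for the AES block containing byte `offset`:
    // initial IV plus (offset / BLOCK). No wrap-around handling; demo only.
    static byte[] ivForOffset(byte[] iv, long offset) {
        BigInteger ctr = new BigInteger(1, iv).add(BigInteger.valueOf(offset / BLOCK));
        byte[] raw = ctr.toByteArray();
        byte[] out = new byte[BLOCK];
        int copy = Math.min(raw.length, BLOCK);
        System.arraycopy(raw, raw.length - copy, out, BLOCK - copy, copy);
        return out;
    }

    // "Seek" to byte `pos` of the ciphertext: re-init the cipher at the
    // containing block, discard the intra-block keystream prefix, then decrypt.
    static byte[] decryptFrom(byte[] key, byte[] iv, byte[] cipherText, int pos) throws Exception {
        Cipher dec = Cipher.getInstance("AES/CTR/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"),
                 new IvParameterSpec(ivForOffset(iv, pos)));
        dec.update(new byte[pos % BLOCK]); // burn the partial-block keystream
        return dec.doFinal(Arrays.copyOfRange(cipherText, pos, cipherText.length));
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "0123456789abcdef".getBytes(StandardCharsets.US_ASCII);
        byte[] iv = new byte[BLOCK]; // demo-only all-zero IV

        Cipher enc = Cipher.getInstance("AES/CTR/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
        byte[] cipherText =
                enc.doFinal("transparent encryption with seek support".getBytes(StandardCharsets.US_ASCII));

        // Decrypt starting mid-file, without touching the first 28 bytes.
        System.out.println(new String(decryptFrom(key, iv, cipherText, 28),
                StandardCharsets.US_ASCII)); // prints "seek support"
    }
}
```

The same offset arithmetic would serve pread (requirement 7): each positioned read independently derives its counter block from the requested position, so no shared stream state is needed.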
--
This message was sent by Atlassian JIRA
(v6.1#6144)