[
https://issues.apache.org/jira/browse/HADOOP-16259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16824161#comment-16824161
]
Kai Xie commented on HADOOP-16259:
----------------------------------
class FSProtos is generated from the protobuf definition
[FSProtos.proto|https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/proto/FSProtos.proto].
When you run `mvn install`, the Maven plugin compiles the [protobuf
file|https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/pom.xml#L411].
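For reference, the generated sources can be rebuilt without a full install by invoking the code-generation phase directly. This is a sketch, assuming a checkout of the Hadoop source tree and a `protoc` binary matching the version the pom expects:

```shell
# From the root of a Hadoop source checkout (module path per the pom linked above).
cd hadoop-common-project/hadoop-common

# The hadoop-maven-plugins protoc goal is bound to the generate-sources phase;
# it compiles src/main/proto/FSProtos.proto into the generated FSProtos class.
mvn generate-sources
```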
> Distcp to set S3 Storage Class
> ------------------------------
>
> Key: HADOOP-16259
> URL: https://issues.apache.org/jira/browse/HADOOP-16259
> Project: Hadoop Common
> Issue Type: New Feature
> Components: hadoop-aws, tools/distcp
> Affects Versions: 2.8.4
> Reporter: Prakash Gopalsamy
> Priority: Minor
> Attachments: ENHANCE_HADOOP_DISTCP_FOR_CUSTOM_S3_STORAGE_CLASS.docx
>
> Original Estimate: 168h
> Remaining Estimate: 168h
>
> The Hadoop distcp implementation has no property to override the storage
> class when transferring data to Amazon S3, and it does not set any storage
> class itself. As a result, every object moved from a cluster to S3 with
> Hadoop distcp is stored in the default storage class, "STANDARD". A new
> feature to override the default S3 storage class through configuration
> properties would make it possible to upload objects in other storage
> classes. I have described a design for this feature in a design document
> and uploaded it to the JIRA. Kindly review it and share your suggestions.
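A sketch of how the proposed feature might look from the command line once implemented; the property name `fs.s3a.storage.class` is an assumption chosen for illustration, not a name taken from the attached design document:

```shell
# Hypothetical invocation: copy from HDFS to S3, asking the s3a connector to
# upload objects with a non-default storage class (property name illustrative).
hadoop distcp \
  -D fs.s3a.storage.class=STANDARD_IA \
  hdfs://namenode:8020/data/logs \
  s3a://my-bucket/logs
```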
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)