[
https://issues.apache.org/jira/browse/HADOOP-11485?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14284395#comment-14284395
]
Allen Wittenauer commented on HADOOP-11485:
-------------------------------------------
Let's say I'm an enterprise user who wants to use three other commercial
products with my commercial Hadoop distribution: an input format, an encryption
library, and an auth handler. Today, doing this requires the Hadoop admin to
modify HADOOP_USER_CLASSPATH or shove the contents into the Hadoop binary
distribution directories in order to guarantee the daemons always have these jars
available.
HADOOP_USER_CLASSPATH is extra bad because it means the end users of the
system must also take care not to drop these extra classpath entries when they
need custom jars for their own jobs.
By making this pluggable, 3rd parties can inject their bits in a MUCH safer way,
and the end-user controls are left open for users to do with as they please.
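To illustrate the idea, here is a minimal sketch of how a vendor drop-in could register extra jars through a hook function instead of editing HADOOP_USER_CLASSPATH by hand. The function names (a `hadoop_add_classpath` helper, a `_vendor_hadoop_classpath` hook) and the jar paths are illustrative assumptions, not the exact API from the attached patches:

```shell
#!/usr/bin/env sh
# Illustrative sketch only: names and paths are hypothetical, not the
# actual hadoop-functions.sh API from the HADOOP-11485 patches.

HADOOP_CLASSPATH=""

# Stand-in for a framework helper: append an entry unless already present.
hadoop_add_classpath() {
  case ":${HADOOP_CLASSPATH}:" in
    *":$1:"*) ;;   # already on the classpath; skip the duplicate
    *) HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+${HADOOP_CLASSPATH}:}$1" ;;
  esac
}

# A vendor's drop-in file would define a hook like this, which the shell
# framework discovers and calls at daemon startup:
_vendor_hadoop_classpath() {
  hadoop_add_classpath "/opt/vendor/inputformat.jar"
  hadoop_add_classpath "/opt/vendor/crypto.jar"
}

# The framework sources every drop-in and invokes its hooks, so the jars
# are present for daemons without any end-user action:
_vendor_hadoop_classpath
echo "${HADOOP_CLASSPATH}"
```

Because the vendor's additions live in their own hook, the end user's own classpath settings stay untouched, which is the safety property the comment argues for.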
> Pluggable shell integration
> ---------------------------
>
> Key: HADOOP-11485
> URL: https://issues.apache.org/jira/browse/HADOOP-11485
> Project: Hadoop Common
> Issue Type: New Feature
> Components: scripts
> Affects Versions: 3.0.0
> Reporter: Allen Wittenauer
> Assignee: Allen Wittenauer
> Labels: scripts, shell
> Attachments: HADOOP-11485-00.patch, HADOOP-11485-01.patch,
> HADOOP-11485-02.patch
>
>
> It would be useful to provide a way for core and non-core Hadoop components
> to plug into the shell infrastructure. This would allow us to pull the HDFS,
> MapReduce, and YARN shell functions out of hadoop-functions.sh.
> Additionally, it should let 3rd parties such as HBase influence things like
> classpaths at runtime.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)