[
https://issues.apache.org/jira/browse/HADOOP-1822?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Christophe Taton updated HADOOP-1822:
-------------------------------------
Attachment: 1822_2007-09-22_5.patch
New patch:
- Different socket factories can be specified for different contexts (e.g. for
different {{VersionedProtocol}}s like {{JobSubmissionProtocol}} or
{{ClientProtocol}}).
- The DFS interface ({{ClientProtocol}} RPCs and the sockets used to transfer
data blocks with DataNodes) uses the {{SocketFactory}} configured with
{{hadoop.rpc.socket.factory.class.ClientProtocol}}.
- The JobTracker interface ({{JobSubmissionProtocol}} RPCs) socket factory is
configured with {{hadoop.rpc.socket.factory.class.JobSubmissionProtocol}}.
- All other sockets are created using the default socket factory, as
specified by {{hadoop.rpc.socket.factory.class.default}}.
The configuration property that specifies a Socket factory follows the syntax:
"{{package.FactoryClassName:factory-parameter}}".
The patch provides a {{ProxySocketFactory}} initialized with a factory
parameter which can be:
- {{direct}}
- {{socks://host:port}}
- {{http://host:port}}
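Putting the pieces together, a configuration that routes {{ClientProtocol}} traffic through a SOCKS proxy while all other connections stay direct might look like the following sketch (the package of {{ProxySocketFactory}} and the proxy host/port are illustrative assumptions, not taken from the patch):

```xml
<!-- hadoop-site.xml fragment; values are illustrative -->
<property>
  <name>hadoop.rpc.socket.factory.class.ClientProtocol</name>
  <!-- package name of ProxySocketFactory is assumed here -->
  <value>org.apache.hadoop.net.ProxySocketFactory:socks://proxy.example.com:1080</value>
</property>
<property>
  <name>hadoop.rpc.socket.factory.class.default</name>
  <value>org.apache.hadoop.net.ProxySocketFactory:direct</value>
</property>
```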
When implementing a new factory, the new factory class should provide a
{{public static SocketFactory initialize(String factoryParameter)}} method.
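As an illustration of that contract, a minimal custom factory might look like the sketch below. The class name and the exact parameter grammar are assumptions; only {{direct}} and {{socks://host:port}} are handled, since {{java.net.Socket}}'s {{Proxy}} constructor does not tunnel plain HTTP by itself (an HTTP variant would need a CONNECT handshake).

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.Socket;
import java.net.URI;
import javax.net.SocketFactory;

// Hypothetical factory illustrating the initialize(String) contract.
public class ExampleProxySocketFactory extends SocketFactory {

    private final Proxy proxy;

    private ExampleProxySocketFactory(Proxy proxy) {
        this.proxy = proxy;
    }

    // Entry point the RPC layer would invoke with the part of the
    // configuration value after the ':', e.g. "direct" or "socks://host:port".
    public static SocketFactory initialize(String factoryParameter) {
        if ("direct".equals(factoryParameter)) {
            return new ExampleProxySocketFactory(Proxy.NO_PROXY);
        }
        URI uri = URI.create(factoryParameter);
        if (!"socks".equals(uri.getScheme())) {
            // The real patch also accepts http://host:port; omitted here.
            throw new IllegalArgumentException("unsupported: " + factoryParameter);
        }
        InetSocketAddress addr =
            InetSocketAddress.createUnresolved(uri.getHost(), uri.getPort());
        return new ExampleProxySocketFactory(new Proxy(Proxy.Type.SOCKS, addr));
    }

    @Override
    public Socket createSocket() {
        // Every socket this factory produces tunnels through the proxy
        // (or is a plain socket when the parameter was "direct").
        return new Socket(proxy);
    }

    @Override
    public Socket createSocket(String host, int port) throws IOException {
        Socket s = createSocket();
        s.connect(new InetSocketAddress(host, port));
        return s;
    }

    @Override
    public Socket createSocket(String host, int port,
                               InetAddress localHost, int localPort)
            throws IOException {
        Socket s = createSocket();
        s.bind(new InetSocketAddress(localHost, localPort));
        s.connect(new InetSocketAddress(host, port));
        return s;
    }

    @Override
    public Socket createSocket(InetAddress host, int port) throws IOException {
        Socket s = createSocket();
        s.connect(new InetSocketAddress(host, port));
        return s;
    }

    @Override
    public Socket createSocket(InetAddress address, int port,
                               InetAddress localAddress, int localPort)
            throws IOException {
        Socket s = createSocket();
        s.bind(new InetSocketAddress(localAddress, localPort));
        s.connect(new InetSocketAddress(address, port));
        return s;
    }
}
```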
RPC client connections are cached based on the {{SocketFactory}} used to
create them.
JUnit tests ok.
> Allow SOCKS proxy configuration to remotely access the DFS and submit Jobs
> --------------------------------------------------------------------------
>
> Key: HADOOP-1822
> URL: https://issues.apache.org/jira/browse/HADOOP-1822
> Project: Hadoop
> Issue Type: New Feature
> Components: dfs, ipc
> Reporter: Christophe Taton
> Assignee: Christophe Taton
> Priority: Minor
> Fix For: 0.15.0
>
> Attachments: 1822-2007-09-07a.patch, 1822-2007-09-11a.patch,
> 1822_2007-09-22_5.patch
>
>
> The purpose of this issue is to introduce a new configuration entry to setup
> SOCKS proxy for DFS and JobTracker clients.
> This enables users to remotely access the DFS and submit jobs as if they were
> directly connected to the cluster Hadoop runs on.
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.