[ https://issues.apache.org/jira/browse/HADOOP-3249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12589235#action_12589235 ]

Raghu Angadi commented on HADOOP-3249:
--------------------------------------

bq. It's acceptable to have people enter a proxy server into their browser 
settings to view the DFS report page, but I can't ask them to use ssh to create 
their own proxy 'server' - one of the reasons for that is that ssh-ing to the 
gateway requires a user account which is not an option (security reasons, not 
something I can do anything about).

The proxy needs to be set up only once (by one user). SSH is just one way of 
setting up a SOCKS proxy; you could run some other proxy instead. If you do 
use 'ssh -D', make sure the SOCKS listener accepts connections from external 
hosts (by default it binds only to localhost).

> Can I run a proxy server on the gateway somehow?
Yes. 'ssh -D' works very well.
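
A minimal sketch of that setup (assuming OpenSSH on the gateway and that the 
gateway can reach and resolve the internal node hostnames; port 1080 and the 
'user@node_internal_IP' login are placeholders):

    # Run on the gateway. Opens a SOCKS listener on port 1080; connections
    # made through it originate from node_internal_IP inside the cluster.
    # The 0.0.0.0 bind address makes the listener reachable from external
    # hosts (without it, 'ssh -D' binds only to localhost).
    ssh -N -D 0.0.0.0:1080 user@node_internal_IP

Users then point their browser's SOCKS proxy setting at 
gateway_external_IP:1080 (with remote DNS enabled, so slave node hostnames 
resolve inside the cluster), and the links on the 50070 page work as-is.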

> Browsing DFS behind gateway
> ---------------------------
>
>                 Key: HADOOP-3249
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3249
>             Project: Hadoop Core
>          Issue Type: Wish
>          Components: dfs
>    Affects Versions: 0.16.0, 0.16.1, 0.16.2
>         Environment: Red-Hat cluster
>            Reporter: Martin Boeker
>   Original Estimate: 5h
>  Remaining Estimate: 5h
>
> Dear Hadoop guys,
> I'm urgently trying to find a way for users to see the contents of a Hadoop 
> DFS that is behind a gateway. I'm using port forwarding on the gateway 
> itself to point to the DFS web interface, something like this:
> [gateway_external_IP]:50070 >> [node_internal_IP]:50070
> This works fine: if I go to http://gateway_external_ip:50070/ I can view the 
> DFS cluster HTML page from the outside world. The problem is that if I click 
> on any of the slave node links, it forwards to http://node_hostname/.., which 
> obviously doesn't work. I really need to get this going; a couple of projects 
> require it.
> I'm willing to do this any way possible. I don't really need to use the 50070 
> web interface; even a simple directory structure would do, but I'm not sure 
> how to implement that either, because I don't know of a way to make an httpd 
> or ftpd use "bin/hadoop dfs -lsr /" as the root directory. I'd also be 
> willing to make people use a proxy server if that would fix my issue somehow.
> If anyone can help, I would greatly appreciate it; like I said, it's kind of 
> urgent and I'm running out of ideas to try.
> Thanks a lot in advance,
> -Martin

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
