[ 
https://issues.apache.org/jira/browse/HADOOP-3249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12589227#action_12589227
 ] 

Martin Boeker commented on HADOOP-3249:
---------------------------------------

Christophe,

Thanks for the suggestion. It's acceptable to have people enter a proxy server 
in their browser settings to view the DFS report page, but I can't ask them 
to use ssh to create their own proxy 'server' -- mainly because ssh-ing to the 
gateway requires a user account, which is not an option (security reasons, 
not something I can do anything about).

Can I run a proxy server on the gateway somehow? Another option would be to 
tell dfshealth.jsp that the URL for each node is something like 
http://gateway_external_IP:10001/, http://gateway_external_IP:10002/, etc. 
instead of http://xenhost-123.../. Of course, I would want this change to 
show up only in the HTML interface; the rest of the Hadoop setup should 
keep the hostnames and IP addresses unchanged.
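A per-port relay along these lines could be sketched in Python and run on the gateway itself, with no user accounts involved. This is only a minimal sketch: the port numbers, internal hostnames, and the datanode web port are placeholders, not values from an actual setup.

```python
# Minimal per-port TCP relay sketch for the gateway: each external
# port (e.g. 10001, 10002, ...) is forwarded to one internal node's
# web port. All hostnames/ports below are hypothetical placeholders.
import socket
import threading

def relay(src, dst):
    """Copy bytes from src to dst until src reaches EOF."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)  # signal EOF downstream
        except OSError:
            pass

def forward_port(listen_port, target_host, target_port):
    """Listen on listen_port and relay every connection to target."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", listen_port))
    server.listen(5)

    def accept_loop():
        while True:
            client, _ = server.accept()
            upstream = socket.create_connection((target_host, target_port))
            threading.Thread(target=relay, args=(client, upstream),
                             daemon=True).start()
            threading.Thread(target=relay, args=(upstream, client),
                             daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return server

# Hypothetical mapping, one external port per internal node:
# forward_port(10001, "xenhost-123", 50075)
# forward_port(10002, "xenhost-124", 50075)
```

This only moves the bytes; the node links on the page would still have to be rewritten to the gateway_external_IP:1000x form somewhere for browsers to follow them.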

One thing I've noticed is that the only dfshealth.jsp file I can find is 
under /hadoop/src/webapps/dfs, but making changes to that file doesn't affect 
the port 50070 interface. /hadoop/webapps/dfs/index.html is the page that 
forwards on port 50070, yet I can't figure out where the dfshealth.jsp that 
is actually served through the web interface lives.

Sorry if that's kind of confusing -- I'm in the middle of a meeting while 
writing this! Thank you again for your feedback.

-Martin

> Browsing DFS behind gateway
> ---------------------------
>
>                 Key: HADOOP-3249
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3249
>             Project: Hadoop Core
>          Issue Type: Wish
>          Components: dfs
>    Affects Versions: 0.16.0, 0.16.1, 0.16.2
>         Environment: Red-Hat cluster
>            Reporter: Martin Boeker
>   Original Estimate: 5h
>  Remaining Estimate: 5h
>
> Dear Hadoop guys,
> I'm urgently trying to make a way for users to be able to see the contents of 
> a Hadoop DFS that is behind a gateway. I'm using port forwarding on the 
> gateway itself to point to the DFS web interface, something like this:
> [gateway_external_IP]:50070 >> [node_internal_IP]:50070
> This works fine: if I go to http://gateway_external_ip:50070/ I can view the 
> DFS cluster HTML page from the outside world. The problem is that if I click 
> on any of the slave node links, it forwards to http://node_hostname/.., which 
> obviously doesn't work. I really need to get this going; a couple of projects 
> require this to be implemented.
> I'm willing to do this any way possible. I don't really need to use the 50070 
> web interface; even a simple directory structure would do, but I'm not sure 
> how to implement that either, because I don't know of a way to make an httpd 
> or ftpd use "bin/hadoop dfs -lsr /" as the root directory. I'd also be 
> willing to make people use a proxy server if that would fix my issue somehow.
> If anyone can help, I would greatly appreciate it; like I said, it's kind of 
> urgent and I'm running out of ideas to try.
> Thanks a lot in advance,
> -Martin
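The quoted idea of exposing "bin/hadoop dfs -lsr /" through a simple httpd could be sketched as a tiny read-only listing server. This is a sketch under assumptions: the hadoop launcher path, the -lsr subcommand invocation, and the column layout of its output (path in the last field) are all taken on faith, not verified against a running cluster.

```python
# Sketch of a read-only HTTP listing backed by "bin/hadoop dfs -lsr".
# HADOOP path and the -lsr output format are assumptions.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

HADOOP = "bin/hadoop"  # hypothetical path to the hadoop launcher script

def list_dfs(path="/"):
    """Return the raw recursive listing for a DFS path."""
    out = subprocess.run([HADOOP, "dfs", "-lsr", path],
                         capture_output=True, text=True, check=True)
    return out.stdout

def extract_paths(listing):
    """Pull the path column (assumed to be the last field) from each line."""
    paths = []
    for line in listing.splitlines():
        fields = line.split()
        if fields:
            paths.append(fields[-1])
    return paths

class ListingHandler(BaseHTTPRequestHandler):
    """Serve the DFS listing for the requested path as plain text."""
    def do_GET(self):
        body = "\n".join(extract_paths(list_dfs(self.path))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run on the gateway (port is a placeholder):
# HTTPServer(("", 8080), ListingHandler).serve_forever()
```

Running this on the gateway would sidestep the per-node link problem entirely, since every request is answered by the one host the outside world can already reach.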

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
