[ https://issues.apache.org/jira/browse/HDFS-330?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jakob Homan updated HDFS-330:
-----------------------------
Summary: Datanode Web UIs should provide robots.txt  (was: Web UIs should provide robots.txt)
Assignee: Allen Wittenauer
Fix Version/s: 0.22.0
Affects Version/s: 0.22.0  (was: 0.20.2)
Component/s: data-node
> Datanode Web UIs should provide robots.txt
> ------------------------------------------
>
> Key: HDFS-330
> URL: https://issues.apache.org/jira/browse/HDFS-330
> Project: Hadoop HDFS
> Issue Type: Improvement
> Components: data-node
> Affects Versions: 0.22.0
> Reporter: Allen Wittenauer
> Assignee: Allen Wittenauer
> Priority: Trivial
> Fix For: 0.22.0
>
> Attachments: HDFS-330.txt
>
>
> There is a potential issue that someone might have an internal corporate
> crawler that accidentally crawls the HDFS browser interface. It might be a
> good idea to provide a default robots.txt that disables crawling. [No, this
> didn't happen to us. :) ]
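> For illustration, a default robots.txt that tells all compliant crawlers
> to stay away is two lines of the standard Robots Exclusion Protocol,
> nothing Hadoop-specific:
>
>   User-agent: *
>   Disallow: /
>
> As a rough sketch of how the datanode's embedded web server might serve
> this, here is a plain Java servlet. The class name and wiring are
> illustrative assumptions, not the actual Hadoop HttpServer API or the
> attached patch:
>
>   import java.io.IOException;
>   import javax.servlet.http.HttpServlet;
>   import javax.servlet.http.HttpServletRequest;
>   import javax.servlet.http.HttpServletResponse;
>
>   // Illustrative only: answers GET /robots.txt with a policy that
>   // disallows all crawling of the web UI.
>   public class RobotsServlet extends HttpServlet {
>     @Override
>     protected void doGet(HttpServletRequest req, HttpServletResponse resp)
>         throws IOException {
>       resp.setContentType("text/plain");
>       resp.getWriter().print("User-agent: *\nDisallow: /\n");
>     }
>   }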
--
This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.