[
https://issues.apache.org/jira/browse/HADOOP-12758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15134441#comment-15134441
]
Chris Nauroth commented on HADOOP-12758:
----------------------------------------
[~lmccay], the plan sounds good.
bq. Do we need to allow for crawling of webpages without CSRF protection?
I expect it's not an issue, given that our use case covers internal back-end
web services, not public Internet sites that would benefit from allowing
crawlers. If anyone really is crawling Hadoop web services, then I expect they
control that client and can either code it to send the required header or
reconfigure the user-agent regex to exempt it.
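For illustration, a non-browser client could also simply opt in and send the
header itself. A minimal sketch in Java, assuming the default header name
X-XSRF-HEADER and that the filter checks only for the header's presence; the
WebHDFS endpoint below is a placeholder:
{code:java}
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class CsrfAwareClient {
  public static void main(String[] args) throws IOException {
    // Placeholder endpoint; substitute a real WebHDFS URL.
    URL url = new URL(
        "http://namenode.example.com:50070/webhdfs/v1/tmp/d?op=MKDIRS");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("PUT");
    // Any value works if the filter only tests for the header's presence.
    conn.setRequestProperty("X-XSRF-HEADER", "client");
    System.out.println("HTTP " + conn.getResponseCode());
    conn.disconnect();
  }
}
{code}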
> Extend CSRF Filter with UserAgent Checks
> ----------------------------------------
>
> Key: HADOOP-12758
> URL: https://issues.apache.org/jira/browse/HADOOP-12758
> Project: Hadoop Common
> Issue Type: Bug
> Components: security
> Reporter: Larry McCay
> Assignee: Larry McCay
> Attachments: HADOOP-12758-001.patch, HADOOP-12758-002.patch
>
>
> To protect against CSRF attacks, HADOOP-12691 introduces a CSRF filter that
> will require a specific HTTP header to be sent with every REST API call. This
> will affect all API consumers, from web apps to CLIs and curl.
> Since CSRF is primarily a browser-based attack, we can try to minimize the
> impact on non-browser clients.
> This enhancement will provide additional configuration for identifying
> non-browser user agents and skipping enforcement of the header requirement
> for anything identified as a non-browser. When configured appropriately, this
> will largely limit the impact to browser-based PUT and POST calls.