[
https://issues.apache.org/jira/browse/HADOOP-12758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15129413#comment-15129413
]
Larry McCay commented on HADOOP-12758:
--------------------------------------
Hi [~anu] - I believe the behavior differences you describe are certainly
possible, but the impact of breaking existing clients that are neither
vulnerable to nor a source of the attack we are protecting against is worse.
The error returned via:
{quote}
((HttpServletResponse)response).sendError(
HttpServletResponse.SC_BAD_REQUEST,
"Missing Required Header for Vulnerability Protection");
{quote}
should provide some level of diagnostic clarity.
If we find that we need more, we can do more there.
If enforcement is so strict that scripting with curl, groovy, and
java/python/perl clients all break, the likely result is that the filter will
not be used at all rather than clients being changed.
Does this make sense?
> Extend CSRF Filter with UserAgent Checks
> ----------------------------------------
>
> Key: HADOOP-12758
> URL: https://issues.apache.org/jira/browse/HADOOP-12758
> Project: Hadoop Common
> Issue Type: Bug
> Components: security
> Reporter: Larry McCay
> Assignee: Larry McCay
> Fix For: 2.8.0
>
> Attachments: HADOOP-12758-001.patch
>
>
> To protect against CSRF attacks, HADOOP-12691 introduces a CSRF filter that
> will require a specific HTTP header to be sent with every REST API call. This
> will affect all API consumers from web apps to CLIs and curl.
> Since CSRF is primarily a browser-based attack, we can try to minimize the
> impact on non-browser clients.
> This enhancement will provide additional configuration for identifying
> non-browser useragents and skipping the enforcement of the header requirement
> for anything identified as a non-browser. This will largely limit the impact
> to browser based PUT and POST calls when configured appropriately.
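The exemption logic described above could be sketched roughly as follows. This is only an illustration of the idea, not the attached patch: the class and method names (`CsrfUserAgentCheck`, `isBrowser`, `isAllowed`) and the default browser prefixes are hypothetical, and the real filter would apply this check inside a servlet `doFilter` chain before calling `sendError`.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of user-agent based CSRF exemption (not the actual patch).
public class CsrfUserAgentCheck {

    // Configurable prefixes identifying browser user agents; anything that
    // does not match is treated as a non-browser client and exempted.
    private final List<String> browserUserAgents =
            Arrays.asList("Mozilla", "Opera");

    // True when the User-Agent header identifies a browser.
    public boolean isBrowser(String userAgent) {
        if (userAgent == null) {
            return false;
        }
        for (String prefix : browserUserAgents) {
            if (userAgent.startsWith(prefix)) {
                return true;
            }
        }
        return false;
    }

    // A request may proceed when it comes from a non-browser client
    // (e.g. curl, java, python) or when it carries the required CSRF header.
    public boolean isAllowed(String userAgent, String csrfHeaderValue) {
        return !isBrowser(userAgent) || csrfHeaderValue != null;
    }
}
```

With this in place, only browser-originated requests missing the header would receive the SC_BAD_REQUEST response, leaving existing CLI and scripting clients unaffected.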
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)