Hi,

SetEnvIf user-agent "(?i:TurnitinBot)" SOMENAME1
SetEnvIf Request_URI "^linuxsecurity_features\.*$" SOMENAME2

And then require that every request meets all of the following:

<RequireAll>
   Require all granted
   Require not env SOMENAME1
   Require env SOMENAME2
</RequireAll>

This had the effect of returning 403s for every page on the site (and the 
elements on it, I think), with "AH01630: client denied by server configuration" 
entries in the error log.

Is it possible there's something else going on here?

SetEnvIf Request_URI "^linuxsecurity_features\.*$" rssfeeds
You can't anchor to ^ like that. Unlike RewriteRule in an .htaccess,
SetEnvIf always compares against the full requested URI (which starts
with a slash), not the per-directory remainder of the URL.
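(Assuming the feed actually lives under /linuxsecurity_features — I'm guessing at the path — a version that matches the full URI would look more like:

SetEnvIf Request_URI "^/linuxsecurity_features" rssfeeds

with the leading slash, and without the stray \.* which only matches zero or more literal dots.)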

There's something more going on than just an errant caret.

Once the "Require env SOMENAME2" is included, as above, it immediately starts to 403 every page on the site. It's as if each Require were being considered independently, or as if the last one somehow superseded the previous ones.

With the last Require commented out, it works as expected (blocking all bots listed in the SetEnvIf), with the exception that it also restricts libwww access to the RSS feeds.
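For what it's worth, that 403-everything behavior looks like RequireAll working as documented: every Require inside a <RequireAll> has to pass, so "Require env SOMENAME2" denies any request where that variable isn't set — which is every non-feed page. If the intent is "block the listed bots everywhere except the RSS feeds", a sketch along these lines (untested, reusing your variable names) should express it:

<RequireAny>
   Require env SOMENAME2
   <RequireAll>
      Require all granted
      Require not env SOMENAME1
   </RequireAll>
</RequireAny>

The nesting is needed because a negated "Require not" isn't permitted directly inside <RequireAny>.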

dave
