On 3/18/19 7:43 AM, Stephen John Smoogen wrote:
> So while looking at a problem with greenwave getting hit by various
> web bots, I found that very few of our configured robots.txt files
> actually work, because we normally ProxyPass everything to the
> backend. That means each backend has to serve robots.txt from its own
> codebase (if it is a Flask container, for example), and it can be slow
> for other reasons. However, this change affects robots.conf for all
> websites, so I didn't want to file an FBR without discussion.
>
> diff --git a/roles/httpd/website/templates/robots.conf
> b/roles/httpd/website/templates/robots.conf
> index f442128..a59eb68 100644
> --- a/roles/httpd/website/templates/robots.conf
> +++ b/roles/httpd/website/templates/robots.conf
> @@ -1 +1,7 @@
> +## Make sure that we don't skip this because we proxy pass it to a
> +## slow backend
> +<Location "/robots.txt">
> + ProxyPass !
> +</Location>
> +
>  Alias /robots.txt /srv/web/{{site_name}}-robots.txt

+1 here.

kevin
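For anyone following along, here is a sketch of what the resulting vhost fragment would look like with the patch applied. The Location block and Alias are from the diff above; the trailing ProxyPass lines and the backend address are illustrative assumptions, not part of the actual templates:

```apache
## Make sure we don't skip robots.txt because we proxy pass it to a
## slow backend: "ProxyPass !" excludes this location from proxying.
<Location "/robots.txt">
    ProxyPass !
</Location>

## Serve a static, per-site robots.txt straight from the proxy host.
Alias /robots.txt /srv/web/{{site_name}}-robots.txt

## Illustrative only: the rest of the vhost keeps proxying to the
## backend as before (address is a made-up example).
ProxyPass / http://localhost:8080/
ProxyPassReverse / http://localhost:8080/
```

Note that the `ProxyPass !` exclusion has to be in place for the Alias to take effect, since mod_proxy would otherwise claim the request before mod_alias sees it. A quick `curl -s https://<site>/robots.txt` against a test host should then return the static file without touching the backend.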
_______________________________________________
infrastructure mailing list -- [email protected]
To unsubscribe send an email to [email protected]
Fedora Code of Conduct: https://getfedora.org/code-of-conduct.html
List Guidelines: https://fedoraproject.org/wiki/Mailing_list_guidelines
List Archives: https://lists.fedoraproject.org/archives/list/[email protected]
