From: Christian Hesse <m...@eworm.de>

A lot of AI bots are crawling the web these days... Try to limit
traffic and load on the server by disallowing crawling of commit and
diff pages.
---
 robots.txt | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/robots.txt b/robots.txt
index 1b33266..69dca6f 100644
--- a/robots.txt
+++ b/robots.txt
@@ -1,4 +1,6 @@
 User-agent: *
-Disallow: /*/snapshot/*
 Disallow: /*/blame/*
+Disallow: /*/commit/*
+Disallow: /*/diff/*
+Disallow: /*/snapshot/*
 Allow: /
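
For reference, the full robots.txt after applying this patch should read
as follows (reconstructed from the hunk above; note the existing snapshot
rule is only moved so the Disallow lines stay in alphabetical order):

  User-agent: *
  Disallow: /*/blame/*
  Disallow: /*/commit/*
  Disallow: /*/diff/*
  Disallow: /*/snapshot/*
  Allow: /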