That's the correct configuration, and it should disallow all bots from all
files in that tree. Robot exclusion files are only advisory, though, not
compulsory; if that particular user-agent becomes a real problem, it may be
better to block it at your HTTP server instead.
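
As a rough sketch of that approach, assuming Galaxy is proxied behind nginx
(the directive names below are standard nginx; if you use Apache instead,
mod_setenvif or mod_rewrite can do the same job), something like this inside
the server/location block that fronts Galaxy would reject the crawler:

  # Return 403 to any request whose User-Agent header contains "googlebot"
  # (case-insensitive match)
  if ($http_user_agent ~* "googlebot") {
      return 403;
  }

That denies the crawler outright at the proxy, so the requests never reach
paster and stop filling the log.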

Jim Redmond
Department of Developmental Biology
Washington University in St. Louis
jredm...@wustl.edu

-----Original Message-----
From: galaxy-dev-boun...@lists.bx.psu.edu 
[mailto:galaxy-dev-boun...@lists.bx.psu.edu] On Behalf Of SHAUN WEBB
Sent: Monday, December 12, 2011 8:40 AM
To: galaxy dev
Subject: [galaxy-dev] googlebot


Hi,

My paster.log is full of errors relating to Googlebot trying to access a
library file.

I noticed a robots.txt file in the static folder, configured as follows:

User-agent: *
Disallow: /

I thought this would stop all bots from accessing any files. Should it be
configured differently?

Thanks
Shaun

--
The University of Edinburgh is a charitable body, registered in Scotland, with 
registration number SC005336.


___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/
