Thanks. Do I need to have Galaxy running via Apache for this to take effect?
Quoting "Redmond, James" <[email protected]> on Mon, 12 Dec 2011
18:30:02 +0000:
That's the correct configuration, and it should disallow all bots
from all files in that tree. Robot exclusion files are just a
request, though, and not compulsory; if that particular user-agent
becomes a real issue, it may be better to block it at the HTTP
server itself, for example along the lines of the sketch below.
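For what it's worth, here is a minimal sketch of that approach for
Apache 2.2 (assuming mod_setenvif is loaded, that the crawler sends a
User-Agent containing "Googlebot", and that /library is the path being
hit; adjust both to match your setup):

    # Tag any request whose User-Agent mentions Googlebot
    SetEnvIfNoCase User-Agent "Googlebot" bad_bot

    # Refuse tagged requests for the library tree
    <Location "/library">
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot
    </Location>

Denied requests get a 403 from Apache and never reach Galaxy, so the
tracebacks should disappear from paster.log.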
Jim Redmond
Department of Developmental Biology
Washington University in St. Louis
[email protected]
-----Original Message-----
From: [email protected]
[mailto:[email protected]] On Behalf Of SHAUN WEBB
Sent: Monday, December 12, 2011 8:40 AM
To: galaxy dev
Subject: [galaxy-dev] googlebot
Hi,
My paster.log is full of errors relating to Googlebot trying to
access a library file.
I noticed a robots.txt file in the static folder, configured as follows:
User-agent: *
Disallow: /
I thought this would stop all bots from accessing any files. Should it
be configured differently?
Thanks
Shaun
--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.
___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client. To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
http://lists.bx.psu.edu/
___________________________________________________________