In <[EMAIL PROTECTED]>, Andrew Daviel 
<[EMAIL PROTECTED]> writes:
> Some web servers use a file called /robots.txt to help search engines
> and other indexing tools visit their web pages more frequently and more
> efficiently. By connecting to the server and requesting the /robots.txt
> file, an attacker may gain additional information about the system they
> are attacking, such as restricted directories, hidden directories, and
> CGI script directories. Take special care not to tell the robots not to
> index sensitive directories, since this tells attackers exactly which
> of your directories are sensitive.
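
To illustrate how little effort this reconnaissance takes, a minimal
sketch in Python (the host name is made up):

    import urllib.request

    # Fetch the robots exclusion file from a (hypothetical) server.
    # Anything listed under "Disallow" is handed to the attacker for free.
    url = "http://www.example.com/robots.txt"
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8", errors="replace"))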

The real risk is webmasters who do not protect sensitive resources properly
but merely try to obscure them.

robots.txt is not a place for listing sensitive directories; it is meant to
help robots avoid indexing irrelevant information or getting caught in
endless loops on dynamic pages.
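
For example, a legitimate robots.txt keeps crawlers away from
machine-generated or endless content, not from anything secret (the
paths below are made up):

    # Keep all crawlers out of dynamically generated search results
    # and a calendar that would trap them in an endless loop.
    User-agent: *
    Disallow: /cgi-bin/search
    Disallow: /calendar/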

There was some discussion of this in comp.risks in 1998, which is summarized
at http://www.eiffel.com/private/meyer/robots.html

Klaus Johannes Rusch
--
[EMAIL PROTECTED]
http://www.atmedia.net/KlausRusch/
