> On one hand I want to put in a minimal robots.txt file so that my error
> logs don't fill up with hordes of missing file errors generated by
> spiders. On the other hand, I don't want its existence to be a flag to a
> ...
> User-agent: *
> Disallow: /templates/
> Disallow: /classes/

I don't think you have any reason to worry about hackers using the info
in robots.txt to make your server a bigger target.  Those directory
names are so darned easy to guess that, absent the file, they're gonna
find their way into those directories anyway.  Some other common and
vulnerable directories are /include, /inc, /admin, /Admin, /Servlet,
/asp, /script(s), /src, /data.

Malicious spiders are going to ignore whatever the file tells them to do.
However, spiders that play by "the rules" will obey it, and you'll get
the results you want.  I say it's worth using.

However, make sure you've got the permissions on those directories and
their contents set properly.  Templates and classes (if they're not just
static HTML and CSS) should only be accessible to your webapp, not world
readable or served to a web browser.  If you're worried about people
poking around in those directories, you may be setting yourself up for a
source disclosure attack by leaving them servable by your HTTP daemon.
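
If you happen to be running Apache, something along these lines tells
the daemon to refuse to serve those directories outright.  A sketch,
assuming 2.2-style access control and a docroot of /var/www/html
(adjust to your layout):

    # Refuse to serve anything under /templates/ or /classes/,
    # no matter what robots.txt says.
    <DirectoryMatch "^/var/www/html/(templates|classes)">
        Order allow,deny
        Deny from all
    </DirectoryMatch>

Better still, keep those directories outside the document root entirely,
so the daemon has nothing to disclose in the first place.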

tack
