Shane Hathaway <[EMAIL PROTECTED]> writes:
> Looking at the logs, I once saw GoogleBot generate URLs like
> this of 1000 characters or more.

I had this too, and it caused heavy traffic as well. Cf 

Since then, as a temporary measure, I set the "no robots" meta header in
standard_wiki_header. I also cleaned up various link "holes" that could
lead to infinite urls, using page_url() and wiki_url(), and changed all
wiki links to use absolute urls (now optional). These things should
help, and in fact it should be safe to allow google into a modern wiki.
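To sketch the absolute-url idea (hypothetical names only; WIKI_BASE,
page_url() and wiki_link() here are illustrations, not Zwiki's actual
implementation):

```python
# Hypothetical sketch: emit one canonical, absolute url per page so a
# crawler can never compound relative paths into longer and longer urls.
WIKI_BASE = "http://example.org/wiki"  # assumed site root

def page_url(page_id):
    """Return the absolute, canonical url for a wiki page."""
    return "%s/%s" % (WIKI_BASE, page_id)

def wiki_link(page_id):
    """Render a wiki link with an absolute href, independent of
    whatever (possibly very deep) url the page was reached at."""
    return '<a href="%s">%s</a>' % (page_url(page_id), page_id)

print(wiki_link("FrontPage"))
```

However a robot reaches the page, the links it finds there always point
back to the same canonical location.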

Back to the original poster: yes, to avoid getting hammered by search
engines you need to make sure you expose no links leading to infinite
urls. This may not be as hard as you think: crawlers only follow links
they find, so while hackers can make up urls, search engines don't (yet).
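To make the trap concrete (hypothetical urls; the point is that Zope's
traversal lets the ever-deeper paths keep resolving instead of 404ing):

```python
from urllib.parse import urljoin

# Hypothetical example: a page emits the relative href "SandBox"
# (no leading slash). Since the deeper paths still resolve, a crawler
# accumulates one extra path segment on every hop:
url = "http://example.org/wiki/FrontPage"
for _ in range(3):
    url = urljoin(url + "/", "SandBox")
print(url)
# http://example.org/wiki/FrontPage/SandBox/SandBox/SandBox

# The fix: emit an absolute path, which resolves the same way from
# any depth.
print(urljoin(url, "/wiki/SandBox"))
# http://example.org/wiki/SandBox
```

Three hops already give a url three segments deeper than the real page;
GoogleBot following such links for days is how the thousand-character
urls in the logs come about.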


Zope-Dev maillist  -  [EMAIL PROTECTED]
**  No cross posts or HTML encoding!  **
