On Fri, Jul 15, 2011 at 03:07:26PM +0200, Alexander Burger wrote:
> (prinl "Disallow:"
>    (cond
>       ((= *Host '`(chop "ticker.picolisp.com")) " /")
>       ((= *Host '`(chop "picolisp.com")) " /21000/") ) )
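(Assuming the two clauses are wrapped in a 'cond' as above, this would
emit, depending on the host, something like

   Disallow: /
for ticker.picolisp.com, and

   Disallow: /21000/
for picolisp.com.)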
This helped! Googlebot now seems to have stopped all traversals.
I made this change to "robots.txt" yesterday at 14:58. Since then, I can
find only four further accesses, the last one at 16:27.
In summary, I can't blame Google. It was actually my fault not to
explicitly disallow /21000/, because to a bot this looks like a
different site (it did stop other bots, though). Just disallowing the
"root" of the traversal is not enough; there is no garbage collector
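For anyone who wants to double-check such a rule before deploying it,
Python's standard urllib.robotparser can evaluate a robots.txt fragment
against candidate URLs. A small sketch (the concrete URLs are just made
up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the rule that was added to robots.txt.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /21000/",
])

# The /21000/ tree is now blocked for crawlers ...
print(rp.can_fetch("Googlebot", "http://picolisp.com/21000/!start"))  # False

# ... while the rest of the site stays crawlable.
print(rp.can_fetch("Googlebot", "http://picolisp.com/wiki/?home"))    # True
```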