Another idea that has occurred to me is simply to code the information to
be indexed into the robots.txt file. Then the robot could simply pull the
information out of the file and be done.
Example:
User-agent: Scooter
Interval: 30d
Disallow: /
Name: Fred's Site
Index: /index.html
Name: My
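To make the idea concrete, here is a minimal sketch of how a robot might read such a file. This assumes the directive names from the example above (Interval, Name, Index), which are proposal examples and not part of the existing robots.txt convention; the parsing logic is purely illustrative.

```python
# Illustrative sketch only: parse a robots.txt extended with the proposed
# Interval / Name / Index directives.  Unknown fields are kept as plain
# key/value pairs, so an older parser of the same shape would simply
# ignore them.
def parse_extended_robots(text):
    records = []
    current = None
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(':')
        field, value = field.strip().lower(), value.strip()
        if field == 'user-agent':
            # each User-agent line starts a new record
            current = {'user-agent': value, 'disallow': [], 'index': []}
            records.append(current)
        elif current is not None:
            if field in ('disallow', 'index'):
                current[field].append(value)   # repeatable directives
            else:
                current[field] = value         # interval, name, anything else
    return records

example = """User-agent: Scooter
Interval: 30d
Disallow: /
Name: Fred's Site
Index: /index.html
"""
records = parse_extended_robots(example)
```

With the example above, the robot ends up with one record for Scooter carrying the 30d interval, the site name, and the page to index, without ever fetching anything beyond robots.txt itself.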
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Fred Atkinson
Sent: Sunday, January 11, 2004 4:38 PM
To: Robots
Subject: [Robots] Another approach
It was thus said that the Great Walter Underwood once stated:
--On Sunday, January 11, 2004 8:13 PM -0500 Sean 'Captain Napalm' Conner [EMAIL PROTECTED] wrote:
And there you go. Using the different directives makes it backwards
compatible with the original robots.txt (where an older