Another idea that has occurred to me is to simply put the information to
be indexed in the robots.txt file itself.  Then, the robot could simply pull the
information out of that one file and be done.

Example:

User-agent: Scooter
Interval: 30d
Disallow: /
Name: Fred's Site
Index: /index.html
Name: My Article
Index: /article/index.html
Name: My Article's FAQs
Index: /article/faq.html

    This would tell the robot to take this information, include it in its search
database, and move on.
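
    For what it's worth, here is a rough sketch in Python of how a robot might
pull those entries out.  It is purely illustrative; the Name: and Index: fields
exist only in this proposal, not in anything current robots understand.

def parse_index_entries(robots_txt, agent="Scooter"):
    """Return (name, path) pairs listed under a matching User-agent record."""
    entries, in_record, pending_name = [], False, None
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()    # drop comments and whitespace
        if not line:
            continue                            # record boundaries ignored for brevity
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            in_record = value == "*" or value.lower() == agent.lower()
        elif in_record and field == "name":
            pending_name = value                # remember the title for the next Index line
        elif in_record and field == "index":
            entries.append((pending_name, value))
            pending_name = None
    return entries

example = """User-agent: Scooter
Interval: 30d
Disallow: /
Name: Fred's Site
Index: /index.html
Name: My Article
Index: /article/index.html
"""

print(parse_index_entries(example))
# [("Fred's Site", '/index.html'), ('My Article', '/article/index.html')]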

    Other ideas?



                                                                Fred

