On 12/26/2014 09:21 AM, Greg Sims wrote:
> I read the archives of this list and found a couple of entries. The
> answer seems to be including "Disallow" rules in our robots.txt file.
> I did this, as you can see here -- towards the bottom of the file:
>
> http://www.raystedman.org/robots.txt
>
> This has been in place for four weeks, so I do not believe it is
> working. I believe the problem is that there needs to be a robots.txt
> file for each subdomain. In this case, Mailman lives in the
> lists.raystedman.org subdomain. Is there a way to have a robots.txt
> file for the lists.raystedman.org subdomain?
Absolutely. Just put it in the root directory that contains the page
which is served when you go to <http://lists.raystedman.org/>. I.e.,
put it where it will be served if you go to
<http://lists.raystedman.org/robots.txt>.

However, note that it can take a loooong time for pages to age out of
search results.

-- 
Mark Sapiro <[email protected]>         The highway is for gamblers,
San Francisco Bay Area, California    better use your sense - B. Dylan
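[Editor's note: as a minimal sketch of the above, a robots.txt served at
the root of the subdomain might look like the following. The /mailman/
and /pipermail/ paths are Mailman's default web paths and are an
assumption about this particular install:]

    # Served as http://lists.raystedman.org/robots.txt
    User-agent: *
    Disallow: /mailman/
    Disallow: /pipermail/

[If the lists.raystedman.org virtual host has no convenient document
root to drop the file into, and assuming Apache is the web server, an
Alias directive can map the URL to a file kept elsewhere; the path below
is purely illustrative:]

    Alias /robots.txt /var/www/lists.raystedman.org/robots.txt

[Either way, the point is that the file must be reachable at the root of
the lists.raystedman.org subdomain itself, not under
www.raystedman.org.]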
