How do I control search engine spiders with web.py and robots.txt? It seems simple enough for files in the static folder, but what about dynamically generated pages? Has anyone dealt with this?
Regards, Boubou.

You received this message because you are subscribed to the Google Groups "web.py" group.
