Yes, it's a good solution. I just tested it.

Thanks.
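For anyone not running lighttpd, robots.txt can also be served from the web.py app itself. This is only a sketch under my own naming: the `Robots` class, the `robots_txt` helper, and the disallowed paths are illustrative, not from the thread.

```python
# Sketch: serving robots.txt dynamically from a web.py app instead of a
# front-end rewrite rule. The helper below just builds the response body.

def robots_txt(disallow=("/private",), user_agent="*"):
    """Build a robots.txt body for one user agent and a list of paths."""
    lines = ["User-agent: %s" % user_agent]
    lines += ["Disallow: %s" % path for path in disallow]
    return "\n".join(lines) + "\n"

# Hypothetical web.py wiring (assumes web.py is installed):
#
# import web
#
# urls = ('/robots.txt', 'Robots')
#
# class Robots:
#     def GET(self):
#         web.header('Content-Type', 'text/plain')
#         return robots_txt(disallow=["/private"])
#
# app = web.application(urls, globals())
```

The advantage over a static file is that the disallowed paths can be computed at request time (e.g. per-site or per-environment).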

On Sep 29, 9:31 am, Anand Chitipothu <[email protected]> wrote:
> On Sep 28, 6:04 pm, boubou_cs <[email protected]> wrote:
>
> > How to control Search Engine Spiders with webpy and robots.txt ?
> > I suppose it is simple with static folder, but what about other
> > dynamic's pages ?
> > Did you experience this issue?
>
> I generally add a rewrite rule in lighttpd.
>
>     url.rewrite-once = (
>         "^/favicon.ico$" => "/static/favicon.ico",
>         "^/robots.txt$" => "/static/robots.txt",
>         "^/static/(.*)$" => "/static/$1",
>         "^/(.*)$" => "/run.py/$1",
>     )
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"web.py" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/webpy?hl=en
-~----------~----~----~----~------~----~------~--~---