Yes, when using the parameter-based system, the default router already maps 
/myapp/static/robots.txt to /robots.txt, so you don't need to specify 
anything.
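
For reference, a minimal routes.py using the parameter-based router might look like the sketch below (the application name myapp is the one from this thread). Note the mapping of /robots.txt to the default application's static folder is built into the router, so no extra entry is needed:

```python
# routes.py (parameter-based router) -- a minimal sketch.
# With default_application set, web2py's parameter-based router
# automatically serves /robots.txt and /favicon.ico from the
# default application's static/ folder; no explicit rule needed.
routers = dict(
    BASE=dict(default_application='myapp'),
)
```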

Anthony

On Monday, May 6, 2013 2:25:37 PM UTC-4, Andriy wrote:
>
> On Monday, May 6, 2013 8:53:32 PM UTC+3, Anthony wrote:
>>
>> Anyway, are you saying that http://www.yourdomain.com/robots.txt does 
>> show the robots.txt file in the browser? If so, then I believe it should 
>> work. However, I'm not sure Google will check for it/update it on every 
>> single request -- you may have to wait a bit for it to take effect.
>>
>
> Yes, http://www.mydomain.com/robots.txt shows in browser.
>
> I now changed routes.py to only include Parameter-based system code:
> routers = dict(
>     BASE = dict(default_application='myapp'),
> )
>
> http://www.mydomain.com/robots.txt remains accessible, so I don't think 
> I need to add anything extra to the parameter-based system for it to 
> work. I'm still getting sessions from google-bot. I will wait some more 
> time and monitor this.
>

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.