Seems to have fixed itself. Maybe Google had a cached version of the old robots.txt?
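For anyone hitting this later: you can reproduce what Googlebot sees with Python's standard-library `urllib.robotparser`. This is just a sketch — the robots.txt content below is a hypothetical "disallow all" rule like the one described in the original post, not the actual file that was deployed:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the "disallow all" rule
# described in the quoted message below.
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# With "Disallow: /", every URL in the sitemap is blocked,
# which is exactly what Webmaster Tools reports as an error.
blocked = not rp.can_fetch("Googlebot", "http://www.josephmohan.co.uk/")
print(blocked)  # True
```

If Google is serving a cached copy, this check against your live robots.txt (via `rp.set_url(...)` and `rp.read()`) will disagree with what Webmaster Tools shows until the cache expires.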

On Thursday, 27 August 2015 11:28:17 UTC+1, Joseph Mohan wrote:
>
> Debug set to False (in both local_settings.py and settings.py).
> Deleted every instance of robots.txt I could find:
>
>    - in static/robots.txt 
>    - in nginx.conf
>
> I'm still using Mezzanine 3.1.10. I've never had any trouble before, but 
> I did set robots.txt to disallow all for the first time with this site.
>
> Google Webmaster Tools just comes up with 16 errors: "Sitemap contains 
> URLs which are blocked by robots.txt."
>
> This is the site: www.josephmohan.co.uk
>
> Any ideas?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Mezzanine Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.