Hi

proj.org/robots.txt currently contains this:
User-agent: *

Disallow: # Allow everything

Sitemap: https://proj.org/sitemap.xml

while gdal.org/robots.txt contains this:
User-agent: *
Allow: /en/latest/
Disallow: /en/

That means that robots are indexing all of the PROJ pages, including old
versions (I checked it in Google Search Console).

Should we do the same in PROJ as in GDAL?
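
If we do, a minimal sketch of what it could look like, assuming proj.org
publishes its versioned docs under the same /en/<version>/ layout as GDAL
(if the paths differ, the Allow/Disallow lines would need adjusting):

User-agent: *
# Only let crawlers index the latest docs; keep old versioned trees out
Allow: /en/latest/
Disallow: /en/

Sitemap: https://proj.org/sitemap.xml

That keeps the Sitemap line we already have while stopping robots from
crawling the older documentation versions.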

Cheers,
Javier.
