Hi!

> [1] Is the service protected against internet crawlers that find such
> links in the online logs of this email list? It would be a pity if we
> would have to answer this query tens of thousands of times for many
> years to come just to please some spiders who have no use for the result.

That's a very good point. We currently do not have a robots.txt file on
the service. We should have one. I'll fix it ASAP.

GUI links do not run the query until you click, so they are safe from
bots anyway. But direct links to the SPARQL endpoint do run the query
(it's the API, after all :) So robots.txt is needed there.
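For illustration, a minimal robots.txt along these lines could do it
(the endpoint path here is just an assumption, not the actual service
layout):

```
# Keep well-behaved crawlers away from the query endpoint,
# since fetching those URLs executes the query.
User-agent: *
Disallow: /sparql
```

This only stops crawlers that honor the Robots Exclusion Protocol, of
course; abusive clients would need rate limiting on top of it.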

-- 
Stas Malyshev
smalys...@wikimedia.org

_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
