On 16/10/17 05:24, Minh Nguyen wrote:
> When a Wikidata item is modified to link to a Wikipedia article (or
> Wikivoyage article etc.), the Wikipedia article automatically links back
> to the Wikidata item. This is a software feature made possible because
> Wikipedia and Wikidata are colocated in the same database cluster. No
> bots are involved; this is unlike the process by which interwiki links
> used to be maintained before Wikidata was introduced.
> 
> When a Wikipedia article is renamed, it does temporarily get detached
> from the Wikidata item because the task of updating the Wikidata item
> falls to a process that runs asynchronously on a job queue. It isn't
> possible for OpenStreetMap, as an external site, to automatically update
> its wikipedia tags via the same mechanism. However, in principle, one
> could write a bot that consumes Wikipedia's or Wikidata's recent changes
> feed, looking for features to update. I'm not personally proposing to
> run such a bot, to be clear. And one of the benefits of wikidata tags is
> that such a bot would decrease in necessity over time, since Wikidata
> QIDs are more stable.

Minh ... I can see that there is potential to use Wikidata QIDs as a
more stable path into Wikipedia data. The way Wikidata solves the
problem of accessing translated versions of Wikipedia articles looks
good, but I think it will be some time before it is fully mature; some
translations are completely different articles. The
problem I still see is that many of the items I am looking to link to
are elements of an article rather than the whole article, such as the
location of the works of a particular artist. At some point in the
future Wikidata may well have a complete index of QIDs for every
artist's work, but currently I don't have the time to add Wikidata
entries where they don't exist, so a link to the artist's Wikipedia
article, which may or may not actually list this particular work, is
second best, and in many cases there is not even an English version :(
A bot then modifying that link out of context is not helpful, and
while the idea of 'nobot' flags may seem a solution, it just adds
another layer of complexity which would potentially need to exist for
EVERY tag on EVERY object. Something I don't think should be allowed!

This is just an example, but there are hundreds of areas where the
object being identified is part of one or more Wikipedia articles in
one or more languages, so the use of this data in the OSM dataspace
needs to be managed by OSM, and only a small part can be automated
using the feeds from the Wikidata bots, even if it is OSM bots that
are doing the processing. The annoying thing here is that the
hierarchy of places that Wikidata provides could be useful to OSM
searches ... but it still needs the likes of Nominatim and/or GeoNames
to cross-reference that data, which provides an alternate secondary
database.
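
For what it's worth, the feed-based automation being discussed would
presumably start from Wikimedia's public recent-changes stream
(https://stream.wikimedia.org/v2/stream/recentchange), which emits
JSON events. The sketch below is only a minimal Python filter over
such events, assuming the standard event fields (`wiki`, `namespace`,
`type`); how an OSM-side update would then be applied, and whether it
should be, is exactly the open question above.

```python
import json

# A wikipedia/wikidata tag-maintenance bot would first need to pick out
# the relevant edits from the recent-changes event stream. This helper
# keeps only edits to Wikidata items (main-namespace Q-pages); any
# follow-up action on OSM objects is deliberately left out.

def is_wikidata_item_edit(event: dict) -> bool:
    """True for edits to Wikidata items (main-namespace Q-pages)."""
    return (
        event.get("wiki") == "wikidatawiki"
        and event.get("namespace") == 0
        and event.get("type") == "edit"
    )

# Abbreviated example of a real event's shape:
sample = json.loads(
    '{"wiki": "wikidatawiki", "namespace": 0, '
    '"type": "edit", "title": "Q42"}'
)
print(is_wikidata_item_edit(sample))  # True
```

Even with such a filter, deciding which matching edits justify
touching an OSM tag is the part that cannot be automated safely.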

-- 
Lester Caine - G8HFL
-----------------------------
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk
Rainbow Digital Media - http://rainbowdigitalmedia.co.uk

_______________________________________________
talk mailing list
talk@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk