2016-06-21 14:40 GMT+02:00 Andy Mabbett <a...@pigsonthewing.org.uk>:
> On 20 Jun 2016 5:31 pm, "Martin Koppenhoefer" <dieterdre...@gmail.com>
> wrote:
>
>> I have just discovered another type of problem: people adding full
>> wikipedia urls into the website tag. In all cases there was already a
>> wikipedia tag present.
>
> This is precisely the sort of thing a bot could clean up, daily or
> weekly say.
Actually, it is not that simple. Since we have not just one but, for good reason, several methods of storing references to Wikipedia, the bot would have to check whether the full URL in the website tag is already covered by the wikipedia tag's interlanguage links. That is not impossible, but it is not completely trivial either. The bot should also check whether a previous version of the object had a different website value and restore it where that makes sense, or flag the object for human review.

Cheers,
Martin
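(To illustrate the easy part of such a check, here is a minimal Python sketch. The function names are hypothetical, and it only handles the direct case where the website URL points to the same language edition named in the wikipedia tag; resolving interlanguage links would additionally require queries against the Wikipedia API, which this sketch deliberately omits.)

```python
import urllib.parse

def wikipedia_url_to_tag(url):
    """Convert a full Wikipedia article URL into OSM's 'lang:Title'
    wikipedia-tag form, or return None if the URL is not a
    Wikipedia article link."""
    parts = urllib.parse.urlparse(url)
    host = parts.netloc.lower()
    if not host.endswith(".wikipedia.org"):
        return None
    if not parts.path.startswith("/wiki/"):
        return None
    lang = host.split(".")[0]
    title = urllib.parse.unquote(parts.path[len("/wiki/"):])
    return f"{lang}:{title.replace('_', ' ')}"

def website_duplicates_wikipedia(website, wikipedia):
    """True if the website tag merely repeats the wikipedia tag.
    A real bot would also have to follow interlanguage links, since
    the two tags may legitimately name different language editions."""
    converted = wikipedia_url_to_tag(website)
    return converted is not None and converted == wikipedia
```

For example, `website_duplicates_wikipedia("https://en.wikipedia.org/wiki/Eiffel_Tower", "en:Eiffel Tower")` returns True, so the website tag is a candidate for removal (after checking the object's history, as noted above).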
_______________________________________________
talk mailing list
talk@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk