Hi, I am exploring ways to make more educational maps in Wikipedia. For
example, this graph shows all US state governors. It works by querying
Wikidata for the governors' info, and drawing state overlays using OSM
relations tagged with the Wikidata IDs.
https://www.mediawiki.org/wiki/Help:Extensio
o help fix - win-win for everyone :)
On Fri, Nov 25, 2016 at 6:24 PM Martin Koppenhoefer
wrote:
sent from a phone
> Il giorno 25 nov 2016, alle ore 22:55, Yuri Astrakhan <
yuriastrak...@gmail.com> ha scritto:
>
> . I am simply converting existing Wikipedia tag into the Wikidata t
I have been steadily cleaning up some (many) broken Wikipedia and Wikidata
tags, and would like to solicit some help :)
To my knowledge, there is no site where one could add a set of OSM IDs that
need attention (something like a bug tracker lite, where one could come and
randomly pick a few IDs to
* I think MapRoulette is actually the tool we should use to fix these
issues. I am not yet sure how to build an Overpass Turbo query that gets relations for
the challenge, but this approach should automate the whole process. Any
ideas?
https://github.com/maproulette/maproulette2/issues/259
* Wikidata tags are
Russian Wikipedia just replaced all of their map links in the upper right
corner (geohack) with the Kartographer extension! Moreover, when
clicking the link, it also shows the location outline, if that object
exists in OpenStreetMap with a corresponding Wikidata ID (ways and
relations only, no no
, but I got always just a marker on the map, but not an
> >> outline, i.e. a line indicating the outer contours or boundaries of an
> >> object or figure. Perhaps, it is a thin line, and I do not notice it on
> >> the map? Or I misunderstood something.
> >>
> >&
Dave, I'm not sure what you mean.
On Mon, Jan 23, 2017 at 12:52 PM Dave F wrote:
Could the attribute be put on one line so 'Openstreetmap' is visible?
DaveF.
On 21/01/2017 01:40, Yuri Astrakhan wrote:
Russian Wikipedia just replaced all of their map links in the upper right
TLDR: researching ways to validate wikipedia and wikidata tags, wrote a
script to cross-check OSM and Wikidata, found many incorrect disambig
references, would love to start community discussion on best guidelines
going forward.
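A minimal sketch of what such a cross-check boils down to, assuming the usual OSM convention of a "wikipedia" tag in "lang:Title" form compared against the sitelinks on the linked Wikidata item (function and field names here are illustrative, not the actual script):

```python
def check_tags(osm_tags, wikidata_sitelinks):
    """Return a list of problems found for one OSM object.

    osm_tags           -- dict of the object's OSM tags
    wikidata_sitelinks -- dict mapping "lang" -> article title, as stored
                          on the linked Wikidata item
    """
    problems = []
    wp = osm_tags.get("wikipedia")
    if wp is None:
        return problems
    if ":" not in wp:
        problems.append("wikipedia tag is missing the language prefix")
        return problems
    lang, title = wp.split(":", 1)
    expected = wikidata_sitelinks.get(lang)
    if expected is None:
        problems.append(f"no {lang} sitelink on the Wikidata item")
    elif expected != title:
        problems.append(f"wikipedia tag points to {title!r}, "
                        f"but Wikidata links to {expected!r}")
    return problems
```

A real run would feed this from an OSM extract and a Wikidata dump; the dictionaries above stand in for both.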
I have been analyzing the quality of OSM's wikipedia and wikidata ta
ed with these new tools can do a lot of useful
> work at a location, though it may take some time until we learn how to
> employ these tools effectively.
>
> [1] http://www.openstreetmap.org/node/4665613556#map=19/46.95039/7.42234
> [2] https://en.wikipedia.org/wiki/Mikhail_Bakunin
>
Does anyone know of an open source language map - basically a set of
geoshapes with the corresponding language code? Country boundaries are not
needed - e.g. Canada and USA would be English with the exception of French
for Montreal area.
This is needed to guesstimate what language the "name" tag
10, 2017 at 7:55 PM James wrote:
> Also have you checked:
> https://en.wikipedia.org/wiki/List_of_official_languages_by_country_and_territory
>
> On Apr 10, 2017 7:50 PM, "James" wrote:
>
> More like French for the entirety of the province of Quebec
>
> On Apr
New brunswick...with small
> patches of french throughout the rest
>
> On Apr 10, 2017 8:12 PM, "Yuri Astrakhan" wrote:
>
> James, thanks, but I was hoping for the language regions shapefile, e.g.
> in the GeoJSON form. The list of official languages will require a
nglish. There is a suburb called Orléans which is
pretty much "the French part of town", as most street signs will be in
French, but the rest of Ottawa is pretty English (in terms of street signs).
So generalizing won't help you much...
On Apr 10, 2017 8:27 PM, "Yuri Astrakhan" wrote:
TLDR: A SPARQL (rdf) database with both OSM and Wikidata data is up for
testing. Allows massive cross-referenced queries between two datasets. The
service is a test, and needs a permanent home to stay alive.
Overpass Turbo is awesome, but sadly it does not have data from Wikidata,
nor does it sup
The service is back up, this time with all the objects that have tags.
Also, I added the "has" properties on a relation - indicating all objects
contained within the relation. So now you can ask for a relation that
contains a way, where both the relation and the way have the same wikidata ID
(somet
P.S. I am trying to get OSM updater to work, so that OSM data is always up
to date, but pyosmium is giving me some trouble. Please email if you know
the answer to
https://stackoverflow.com/questions/44170360/callbacks-not-called-in-pyosmiums-diff-downloader
On Wed, May 24, 2017 at 11:50 PM Yuri
Thu, May 25, 2017 at 3:06 AM mmd wrote:
> Hi,
>
> Am 25.05.2017 um 08:50 schrieb Yuri Astrakhan:
> > The service is back up, this time with all the objects that have tags.
> > Also, I added the "has" properties on a relation - indicating all
> > objects contained
The RDF/SPARQL database that has both OpenStreetMap and Wikidata data in
the same table is alive and well, and getting considerable usage. To make
it better understood by an even wider community, I made an intro video with
some examples. This database mostly benefits the object tag validation and
res
; anywhere in the Query Service and is not included as metadata in the
> returned dataset.
>
> If this is not corrected, this Query Service will unfortunately be listed
> under the following page:
> https://wiki.openstreetmap.org/wiki/Lacking_proper_attribution
>
>
>
> On
> https://wiki.openstreetmap.org/wiki/Lacking_proper_attribution
>>
>>
>>
>> On Wed, Jun 7, 2017 at 12:07 PM, Yuri Astrakhan
>> wrote:
>>
>>> The RDF/SPARQL database that has both OpenStreetMap and Wikidata data in
>>> the same table is a
Thu, Jun 15, 2017 at 6:42 PM, Andy Mabbett
wrote:
> On 15 June 2017 at 21:02, Yuri Astrakhan wrote:
>
> > This service is still looking for a proper home. If you have an extra
> 700GB
> > of space on a server, please PM.
>
> Perhaps the WMF toolserver?
>
> -
The combined SPARQL database of OSM and Wikidata has been updated:
* There is a short video explaining the basics (at the top of [1])
* new Wikidata interface
* now all OSM "wikipedia" tags and sitelinks in Wikidata are stored the
same way, so it is possible to cross-check when "wikidata" and "wiki
m 13.08.2017 um 19:49 schrieb Yuri Astrakhan:
>
> > * all ways now store "osmm:loc" with centroid coordinates, making it
> > possible to crudely filter ways by location
>
> out of curiosity, can you say a few words on how your overall approach
> to calculate centro
Wikidata's engine (Blazegraph + customizations). If not, I might need to do
some relations postprocessing, as well as automatic updating.
On Mon, Aug 14, 2017 at 1:39 PM, François Lacombe wrote:
> Hi
>
> 2017-08-14 11:18 GMT+02:00 mmd :
>
>> Hi,
>>
>> Am 13.08
rwise:
sparse_mem_array
dense_mmap_array
sparse_file_array,my_cache_file
dense_file_array,my_cache_file
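The four strings above are osmium node-location index types; the file-backed variants take a cache file name after a comma. A small helper can build that string before passing it to pyosmium (the `apply_file(..., idx=...)` call in the comment follows pyosmium's documented interface, but treat the exact file names as placeholders):

```python
def node_cache_idx(kind, path=None):
    """Build the index-type string for pyosmium's apply_file(idx=...).

    The file-backed kinds require "type,filename"; the in-memory kinds
    are used as-is. Sketch based on the options listed above.
    """
    file_backed = {"sparse_file_array", "dense_file_array"}
    if kind in file_backed:
        if not path:
            raise ValueError(f"{kind} requires a cache file path")
        return f"{kind},{path}"
    return kind

# e.g. handler.apply_file("planet.pbf", locations=True,
#                         idx=node_cache_idx("dense_file_array", "my_cache_file"))
```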
Thanks!
On Mon, Aug 14, 2017 at 4:31 PM, Sarah Hoffmann wrote:
> On Mon, Aug 14, 2017 at 11:10:39AM -0400, Yuri Astrakhan wrote:
> > mmd, the centroids are calculated with this cod
:08:03PM -0400, Yuri Astrakhan wrote:
> > Sarah, how would I set the node cache file to the repserv.apply_diffs()?
> > The idx param is passed to the apply_file() - for the initial PBF dump
> > parsing, but I don't see any place to pass it for the subsequent diff
> > pro
OSM+WD service updates: new examples interface contains just the OSM-related
examples, and they are user-contributable. The osmm:loc (centroid) is now
stored with all objects including relations, so it is now easy to see how
far Wikidata's coordinates are from OSM's - http://tinyurl.com/yd97qtp2 A
Hello mappers! Could anyone help with over 800 OSM objects that are
pointing to Wikipedia disambiguation pages? Especially Polish community -
588. You can get the currently broken ones by running the query below (you
may want to modify it a bit to list more relevant objects). If you feel
brave, t
Fabrizio, the easiest way to fix both "wikidata" and "wikipedia" tags is to
use iD editor's "wikipedia" field (at the top), but not the "tag" field
(list at the bottom). This way, selecting a Wikipedia value from the
dropdown automatically corrects the wikidata tag as well. If using JOSM, use
Wikipedia pl
i wrote:
> Hi Yuri,
>
> Thanks for offering the tool for checking.
> I corrected 5 of "ja" ones.
>
> Shu Higashi
>
> 2017-09-10 2:58 GMT+09:00, Yuri Astrakhan :
> > Fabrizio, the easiest way to fix both "wikidata" and "wikipedia" tags is
&g
> SELECT * WHERE {
>   ?osmId osmt:wikidata ?wd ;
>          osmt:wikipedia ?wpTag .
>
>   ?wd wdt:P31/wdt:P279* wd:Q4167410 .
>
>   FILTER( STRSTARTS(STR(?wpTag), 'https://fr.wikipedia'))
>   SERVICE wikibase:label { bd:serviceParam wikibase:language "fr" . }
> }
> LIMIT 100
>
>
Now all disambig-broken points are on a map. Click a point to fix it.
http://tinyurl.com/ya6htp9f
On Sun, Sep 10, 2017 at 4:04 AM, Yuri Astrakhan
wrote:
> Thanks! Worry not, I just added more for fixing, by extracting them from
> Wikipedia tag using the "fetch wikidata" JOSM
I just wrote a simple query to help find all Wikidata objects next to a
given OSM object. Also, it is far better to delete bad
Wikipedia/Wikidata tags than to keep incorrect ones. Thanks for all
the help!!!
https://wiki.openstreetmap.org/wiki/SPARQL_examples#Find_all_wikidata_items_near
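The linked wiki page has the canonical query. As a rough illustration of the pattern, here is a hypothetical Python helper that fills an OSM node id and a radius into a `wikibase:around` template; the prefix and predicate names (`osmnode:`, `osmm:loc`) are taken from this thread's data model and the geo-service syntax from the Wikidata Query Service, so treat both as assumptions rather than a verified schema:

```python
# Hypothetical template for a "find Wikidata items near an OSM node" query.
QUERY_TEMPLATE = """
SELECT ?wd ?wdLabel ?dist WHERE {{
  osmnode:{osm_id} osmm:loc ?center .          # location of the OSM node
  SERVICE wikibase:around {{
    ?wd wdt:P625 ?location .                   # any Wikidata item with coordinates
    bd:serviceParam wikibase:center ?center ;
                    wikibase:radius "{radius_km}" ;
                    wikibase:distance ?dist .
  }}
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en" . }}
}}
ORDER BY ?dist
"""

def nearby_query(osm_id, radius_km=1.0):
    """Render the query for one OSM node id and a search radius in km."""
    return QUERY_TEMPLATE.format(osm_id=osm_id, radius_km=radius_km)
```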
The new service is getting more and more usage, but it lacks the most
important thing - a good name. So far my two choices are:
* wikosm
* wikidosm
Suggestions? Votes? The service combines Wikidata and OpenStreetMap
databases, and uses SPARQL (query language) to search it, so might be good
to
elate to the new draft trademark policy?
>>
>> I can't tell from the draft policy, but I believe that OSM at least is
>> a protected mark, not sure about osm.
>>
>> But I do think Simone Poole asked the community to stop naming things
>> with osm trademarks in them
before fixing the first list)
On Sun, Sep 17, 2017 at 5:45 PM, Yuri Astrakhan
wrote:
> One thing we should consider is the domain name. I doubt we can afford
> woq.com :)
>
> These names were proposed
> woq 2
> wdoqs
> wdosm
> woqs
> q936
>
> And these proposed
yMapData or SparklyDataMap
>
> 2017-09-17 23:46 GMT+02:00 Yuri Astrakhan :
>
>> One thing we should consider is the domain name. I doubt we can afford
>> woq.com :)
>>
>> These names were proposed
>> woq 2
>> wdoqs
>> woqs
>> q936
>>
&
The "not yet fully named" service is now accessible directly from JOSM -
just like OT. Simply install or update Wikipedia plugin, and it will show
up in the download data screen (expert mode).
Documentation:
https://wiki.openstreetmap.org/wiki/Wikidata%2BOSM_SPARQL_query_service#Using_from_JOSM
_
ated list of wikidata
> items in for example subject:wikidata? A statue with more than one person
> in it, for example?
>
> Polyglot
>
> 2017-09-18 7:28 GMT+02:00 Yuri Astrakhan :
>
>> The "not yet fully named" service is now accessible directly from JOSM -
>> j
Hi, I would still highly advise putting it into git, because
* it's easier to discover by others, code search, etc
* it is far easier to propose changes, discuss them, track who submitted
what, etc
* it is easier to fork to try different things, and for others to see your
forks and possibly adapt t
There is now a relatively small number of OSM nodes and relations
remaining that have wikipedia tags but no wikidata tags. The iD editor
already adds wikidata automatically to all new edits, so finishing up the
rest automatically seems like a good thing to do, as that will allow many
new quality
>
>
> > This way, we will be able to quickly find all the objects that are
> > problematic with the Wikidata+OSM service. For example, thanks to the
> > community, we already fixed over 600 incorrect links to wiki
> disambiguations
> > pages, and this will find many more of them. We will be able t
Such an awesome discussion, thanks!
* https://www.wikidata.org/wiki/Special:GoToLinkedPage can already be used
to open a Wikipedia page when you only have a Wikidata ID. It even accepts
a list of wiki sites. For example, this link automatically opens the wiki
page for Q3669 in the first available
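Such a link is easy to generate. The helper below builds one; the path form `Special:GoToLinkedPage/<site1>,<site2>/<Qid>` is an assumption based on the description above, so check the special page itself for the exact accepted syntax:

```python
from urllib.parse import quote

def go_to_linked_page(item_id, sites):
    """Build a Special:GoToLinkedPage URL that opens the first existing
    wiki page for a Wikidata item, trying the given sites in order.

    The comma-separated site list in the path is an assumption, not a
    documented guarantee.
    """
    return ("https://www.wikidata.org/wiki/Special:GoToLinkedPage/"
            + ",".join(sites) + "/" + quote(item_id))

# e.g. go_to_linked_page("Q3669", ["ruwiki", "enwiki"])
```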
>
> What will inevitably happen if you automatically add wikidata tags is
> that existing errors in either OSM (in form of incorrect wikipedia
> tags) or in wikidata (in form of incorrect connections to wikipedia
> articles) will get duplicated.
>
Christoph, a valid point. Yet the duplicate would
Tobias, agree 100%, thanks.
On Wed, Sep 20, 2017 at 12:14 PM, Tobias Knerr wrote:
> On 20.09.2017 17:02, Christoph Hormann wrote:
> > It is best to regard the wikidata and wikipedia tags in OSM as 'related
> > features' rather than identical objects.
>
> We shouldn't dilute the definition of the
and how to reduce the overlap. Keeping
duplicates in sync is always harder than to let the tools do their data
merging work if needed.
On Wed, Sep 20, 2017 at 12:18 PM, Yuri Astrakhan
wrote:
> Tobias, agree 100%, thanks.
>
> On Wed, Sep 20, 2017 at 12:14 PM, Tobias Knerr
> wrote:
&
>
> Don't assume such cases are just a freak anomaly - they are not. OSM
> and wikidata are two very different projects which developed in very
> different contexts. Just another example: For most cities and larger
> towns (at least in Germany) there exists an admin_level 6/8 unit with
> the same
Also, there is a general country subdivision project with plenty of
information and current status. I'm pretty sure the OSM community has a lot of
good info to share:
https://www.wikidata.org/wiki/Wikidata:WikiProject_Country_subdivision
On Wed, Sep 20, 2017 at 1:28 PM, Yuri Astrakhan
st, possibly bigger than OSM itself?
>
>
> That is nice for MB, but problematic in more than one way for OSM.
>
Please elaborate, I know of at least one more company that is actively
doing that. Sigh, another side topic :D
On Wed, Sep 20, 2017 at 1:58 PM, Simon Poole wrote:
> [turni
>
> people fixing WD won’t necessarily check if their fixes work well with
> OSM. Maybe we should include versions in our WD tags?
> I’ve seen OSM objects linked from WD, are there people monitoring changes
> to linked objects?
>
Yes, that's what the Wikidata+OSM service is for. It allows community
>
> And vice versa: I always wonder how usable a map in Latin alphabet is for
> Chinese or Russian speakers.
Cannot speak for Chinese, but in Russia the Latin alphabet is taught at a
very early age in school. I think that drawing a map with local names in
Latin script should not cause too many prob
Since this thread had not received any new discussion in the past 4 days, I
assumed all points were answered and proceeded as planned, per the
mechanical edit policy. Yet, after I added all the nodes and moved on to
relations, I was blocked by Andy Townsend with the following message.
I beli
At the moment, there are nearly 40,000 OSM objects whose wikipedia tag does
not match their wikidata tag. Most of them are Wikipedia redirects, whose
target is the right wikipedia article. If we are not ready to abandon
wikipedia tags just yet (I don't think we should ATM), I think we should
fix th
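Many of those ~40,000 mismatches are cosmetic rather than real redirects, so a cheap pre-filter helps before touching the Wikipedia API. A sketch, using MediaWiki's standard title normalization (underscores become spaces, first letter is uppercased) - real redirect resolution still requires querying Wikipedia:

```python
def normalize_title(title):
    """Normalize a Wikipedia title the way MediaWiki does for comparison:
    underscores become spaces, surrounding whitespace is dropped, and the
    first character is uppercased."""
    t = title.replace("_", " ").strip()
    return t[:1].upper() + t[1:] if t else t

def is_trivial_mismatch(osm_title, wikidata_title):
    """True when two titles differ only by MediaWiki normalization,
    i.e. the mismatch is cosmetic rather than a redirect to chase."""
    return normalize_title(osm_title) == normalize_title(wikidata_title)
```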
pedia.
On Mon, Sep 25, 2017 at 11:18 PM, Marc Gemis wrote:
> or via Osmose ?
>
> On Tue, Sep 26, 2017 at 5:16 AM, Marc Gemis wrote:
> > what about a Maproulette task ?
> >
> > On Tue, Sep 26, 2017 at 5:11 AM, Yuri Astrakhan
> wrote:
> >> At the moment, there a
Marc, thanks. I was under the assumption that talk is the global community
- as it is the most generic in the list, unlike talk-us and
talk-us-newyork. Does it mean that any global proposal would require
talking to hundreds of communities independently, making it impossible to
coordinate, because
some form of ownership and maintain the data and that is more
> beneficial in the long term than an automated quick fix now.
>
> m.
>
> On Tue, Sep 26, 2017 at 5:53 AM, Yuri Astrakhan
> wrote:
> > According to Martijn (of MapRoulette fame), there is no way a challenge
> c
, Mark Wagner wrote:
> On Mon, 25 Sep 2017 23:11:52 -0400
> Yuri Astrakhan wrote:
>
> > At the moment, there are nearly 40,000 OSM objects whose wikipedia
> > tag does not match their wikidata tag. Most of them are Wikipedia
> > redirects, whose target is the right
carce resource we could get.
On Tue, Sep 26, 2017 at 3:48 AM, Sarah Hoffmann wrote:
> On Mon, Sep 25, 2017 at 11:53:03PM -0400, Yuri Astrakhan wrote:
> > According to Martijn (of MapRoulette fame), there is no way a challenge
> can
> > link to object IDs. MapRoulette can only highli
>
> > p.s. OSM is a community project, not a programmers project, it's about
> > people, not software :-)
>
> It's both. OSM is first and foremost a community, but the result of
our effort is a machine-readable database. We are not creating an
encyclopedia that will be casually flipped through
ternal IDs. Not human readable, they cannot be entered 'by hand' nor
> verified on the ground.
> Once you accept them in OSM, you can't really complain about bots.
>
> Yves (who still think such UIDs are only needed for the lack of good query
> tools).
>
>
&g
Here is a query that finds all wikidata IDs frequently used in
"brand:wikidata", and shows OSM objects whose "wikidata" points to the
same. I would like to replace all such wikidata/wikipedia tags with the
corresponding brand:wikidata/brand:wikipedia. Most of them are in India,
but there are some
I think we should re-start with the definition of the problems we are
(hopefully) trying to solve, or else we might end up too far in the
existential realm, which is fun to discuss, but should be left for another
thread.
* Problem #1: In my analysis of OSM data, wikipedia tags quickly go stale
be
Lester, first and foremost, Wikidata is a system to connect the same
Wikipedia articles in different languages. The "read this article in
another language" links on the left side come from Wikidata. Wikidata has
developed beyond this initial goal, but it remains the only way to identify
Wikipedia
explains what the
> tag is? Really? This is not a joke?
>
> OSM is sick, please somebody call a doctor.
> Yves
>
>
On Wed, Sep 27, 2017 at 12:14 PM, Yuri Astrakhan
wrote:
> I think we should re-start with the definition of the problems we are
> (hopefully) trying to solve
Marc, I think you are confusing the goal and the means to get there. I
agree - the goal is to be able to globally find all Wendy's, so that when I
travel, I can still search for familiar brands. So the same brand should
have the same ID everywhere. That ID can be either textual or numeric.
Both
>
> That formed no part of the early discussions on how wikidata should
> work? I bowed out when the discussions were going down a path I did not
> find to be at all useful. The current offering is certainly a lot more
> 'organised' than those original discussions.
Getting the initial points acros
Martin, you cannot make a general claim based on a single value. Users can
enter "Aldi", or "Aldi Nord" or "Aldi Sud". With different capitalization
and dashes, and with or without dots, and god knows what other creative
ways to misspell it. Specifying Q125054 is the same as specifying "Aldi".
If
That's exactly what we are trying to do. Add another tag --
brand:wikidata=Q550258
On Wed, Sep 27, 2017 at 4:10 PM, yvecai wrote:
> Excuse me, but what does wikidata do in this discussion ?
> If brand=wendy is different tham brand=wendy, and if somebody has a
> problem with is it, why not chang
>
> > Specifying Q125054 is the same as specifying "Aldi". If needed/wanted,
> it could be replaced with the more specific wikidata entry like Aldi Nord.
>
> no, it’s not the same, because this wikidata object suggests that there is
> one company, Aldi GmbH & Co. KG, with 2 seats, and one logo.
> S
I have been fixing nodes that have wikipedia but no wikidata tags [1], and
even the first two randomly picked nodes had the same problem - the article
was renamed (twice!) without leaving redirects - node 1136510320.
Try it yourself - run the query and see what it points to.
[1]
https://wiki.open
Verifiability is critical to OSM success, but it does not mean it must only
be verifiable by visiting the physical location. Tags like "wikipedia",
"wikidata", "url", "website" and some IDs cannot be verified that way. You
must visit an external website to validate them. Stopping by Yellowstone
Nati
eed a
good way to do it.
Linking to Wikipedia with page titles is bad: it is not stable.
Wikidata tags fix that. No other claim is being made here.
On Sun, Oct 1, 2017 at 5:06 AM, Christoph Hormann wrote:
> On Sunday 01 October 2017, Yuri Astrakhan wrote:
> >
> > Wikipedia
On Sun, Oct 1, 2017 at 11:12 AM, Tomas Straupis
wrote:
> I guess the point is that:
> 1. Its ok to play with some pet-tag like wikidata
>
100 % agree
> 2. Its not a WORK to automatically update one osm tag according to another
> osm tag (anybody can do it online/locally/etc). It adds NO value.
On Sun, Oct 1, 2017 at 1:29 PM, Tomas Straupis
wrote:
> 2017-10-01 20:04 GMT+03:00 Yuri Astrakhan:
> >> 2. Its not a WORK to automatically update one osm tag according to
> another
> >> osm tag (anybody can do it online/locally/etc). It adds NO value.
> >
On Sun, Oct 1, 2017 at 8:15 PM, john whelan wrote:
> Since an OSM object has lat and long value and it appears that wiki
> whatever also has one the entries can be linked.
>
Not so. The data is very often different between wikipedia, wikidata, and
OSM. Also, the same location could be a square
John, I guess it is always good to talk as a data scientist - with numbers
and facts. Here's why matching by coordinates would not work. This query
calculates the distance between the OSM nodes, and the coordinates that
Wikidata has for those nodes. I only looked at nodes, because ways and
relatio
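For readers who want to reproduce this outside SPARQL: the distance between an OSM node and the Wikidata coordinate is just a great-circle computation. A self-contained haversine sketch (the 6371 km mean Earth radius is the usual approximation):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points,
    using the haversine formula and a 6371 km mean Earth radius."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))
```

Comparing this value against a threshold (say, a few hundred metres) is what makes "match by coordinates" fail: the OSM and Wikidata coordinates for the same object routinely differ by more than that.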
On Sun, Oct 1, 2017 at 3:45 PM, Tomas Straupis
wrote:
> > Tomas, you claimed that "It adds NO value." This is demonstrably wrong.
> You
> > are right that the same fixing was done for years. But until wikidata
> tag,
> > there was no easy way to FIND them.
>
> There always was.
> You simply
>
>
> I will repeat that this is not something which COULD be done, this
> comparison is something, what IS ACTUALLY DONE and has been done for
> years.
Tomas, this is what I understand from what you are saying:
* You download a geotagged Wikidata dump and generate a table with
latitude, longi
/10/2017 02:56, Paul Norman wrote:
>
>> On 10/1/2017 5:39 PM, Yuri Astrakhan wrote:
>>
>>> Lastly, if the coordinates are different, you may not copy it from OSM
>>> to Wikidata because of the difference in the license.
>>>
>>
>> Just for cl
Martin, while it is fascinating to learn about Aldi, its history, and
possible ways to organize information about it, isn't it a moot point for
our discussion? We are talking about Wikipedia, and how we link to it.
There is only one Aldi Wikipedia article that can be connected to:
* German
https
While I have nothing against pausing bulk wikidata additions for a month,
we should be very clear here:
* several communities use bots to maintain and inject these tags, e.g.
Israel. Should they pause their bots?
* If a specific community is ok with it, does that override a worldwide ban
for that loca
ledge,
these are the main usecases by our data consumers.
On Tue, Oct 3, 2017 at 3:25 AM Richard Fairhurst
wrote:
> Yuri Astrakhan wrote:
> OpenStreetMap takes and has always taken a whiter-than-white view of
> copyright. We aim to provide a dataset that anyone can use without fear of
> le
I like the "bot=no" flag, or a more specific one for a given field -
"name:en:bot=no" - as long as those flags are not added by a bot :)
Would it make sense, given how wikidata* tags have been mostly auto-added
by iD as well as by users' bot efforts (including my own), to treat wikidata
explicitly
Speaking from my Wikipedia bot experience (I wrote bots and created the
Wikipedia API over 10 years ago to help bots):
Bots were successful in Wikipedia because all users felt empowered. Users
could very easily see what the bot edited, fix or undo bot edits, and
easily communicate with the bot authors
I would like to introduce a new quick-fix editing service. It allows users
to generate a list of editing suggestions using a query, review each
suggestion one by one, and click "Save" on each change if they think it's a
good edit.
For example, RU community wants to convert amenity=sanatorium ->
Simon, thanks for the constructive criticism, as it allows improvements
rather than aggravation. I propose that "rejections" are saved as a new
tag, for example "_autoreject". In a way, this is very similar to the
"nobot" proposal - users reject a specific bot by hand.
_autoreject will store a se
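The proposal above can be sketched as a small tag transformation. Storing the rejected query ids as a semicolon-separated list in `_autoreject` is an assumption here (the exact encoding was still under discussion in the thread):

```python
def add_autoreject(tags, query_id):
    """Record that a suggestion from `query_id` was rejected on this object,
    per the "_autoreject" proposal above. Returns a new tag dict; the
    semicolon-separated list encoding is an assumption."""
    ids = [i for i in tags.get("_autoreject", "").split(";") if i]
    if query_id not in ids:
        ids.append(query_id)
    return {**tags, "_autoreject": ";".join(ids)}
```

A quick-fix tool could then skip any suggestion whose query id already appears in the object's `_autoreject` value, so a human rejection is never re-offered.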
o 0.2, and added the ability to do reject tag (as described in my
prev email).
Thanks!
On Sat, Oct 14, 2017 at 3:45 AM Michael Reichert
wrote:
> Hi Yuri,
>
> Am 2017-10-13 um 23:25 schrieb Yuri Astrakhan:
> > I would like to introduce a new quick-fix editing service. It allows
>
ctTag and #queryId values must consist of only Latin
characters, digits, and underscores.
Additionally, the tool no longer allows editing above zoom 16.
Thanks!
On Sat, Oct 14, 2017 at 12:34 AM Yuri Astrakhan
wrote:
> Simon, thanks for the constructive criticism, as it allows improvements
tations.
On Sat, Oct 14, 2017 at 6:09 AM Christoph Hormann wrote:
> On Friday 13 October 2017, Yuri Astrakhan wrote:
> > I would like to introduce a new quick-fix editing service. It allows
> > users to generate a list of editing suggestions using a query, review
> > each s
Jochen, not exactly. I was following up on Christoph Hormann's idea of
the "bot=no" tag, to "allow mappers to opt out of bot edits on a
case-by-case basis." Since every query is essentially a separate,
human-operated bot, it seemed appropriate to use it for this case, in a
form of nobot=botID
productive and beneficial to everyone involved.
On Sun, Oct 15, 2017 at 5:39 AM, Christoph Hormann wrote:
> On Sunday 15 October 2017, Yuri Astrakhan wrote:
> > [...] I was following up on the Christoph Hormann's
> > idea of the "bot=no" tag, to "allow mappers to op
sons for that position, and explain why you
think it is incorrect. Perhaps we should learn from the high school debate
class? Sorry for the long email.
On Sun, Oct 15, 2017 at 6:38 AM, ajt1...@gmail.com
wrote:
> On 15/10/2017 11:04, Christoph Hormann wrote:
>
>> On Sunday 15 October
lly not taking into account the wishes of the local communities
> when expressed.
>
> Many Thanks
>
> Cheerio John
>
>
>
>
>
> On 15 October 2017 at 08:04, Yuri Astrakhan
> wrote:
>
>> Andu, with all due respect, you are misrepresenting things.
If a community has had a well-established and agreed process running, which
does not create any new data issues, why should someone outside of that
community be requesting a global halt? It's not like the data is getting
worse all of a sudden, right? And their work does not prevent global
communit
ccord with the MapRoulette team into their
> tool (or Osmose for that matter). It's all open source.
>
> That feature could look like that the creator of a MapRoulette challenge
> may optionally provide a range of possible (typical) answer options
> ("quick fixes") which are then s
Tobias, as promised, a thorough response.
On Sun, Oct 15, 2017 at 9:14 AM, Tobias Zwick wrote:
>
> So, the initial question is: What is the conceptual use case for such a
> tool? Where would be its place in the range of available OSM tools?
>
I think my main target is the JOSM validator's "fix
Lester, the naming of this service is still a work in progress, and might
have confused a few people. My apologies for that. I do plan to create a
proper name, logo, domain name, and SSL certificate once I have some spare
time. If anyone wants to take care of that, your help is appreciated.
The
ts
>>> policy. A resposible developer of such a tool should inform its users
>>> that making automated edits comes with certain requirements and that
>>> not following these rules can result in changes being reverted and user
>>> accounts being blocked.
>&g
Rory, thanks, and that's why I think it is a bad idea to do bot edits
without first running them through my tool. If we do a mass edit, we have to
go through a very lengthy community consensus study, which might still miss
things. Then the bot developer might still make an error that is not likely
t
wiki will solve 80% of the
problems.
P.S. You can star any wiki page, and it will email you when the page
changes. Just like a forum.
On Mon, Oct 16, 2017 at 8:42 AM, Rory McCann wrote:
> On 16/10/17 14:02, Yuri Astrakhan wrote:
>
>> Rory, thanks, and that's why I think it is a