On Fri, Feb 17, 2017 at 11:11:12AM +0100, Christoph Hormann wrote:
> On Friday 17 February 2017, Andreas Vilén wrote:
> >
> > We are aware of the Corine import problems and have discussed them
> > locally at least in Sweden. Our community is very loose with not much
> > activity in mailing lists or other media, but so far consensus has
> > been not to remove Corine if it's not replaced by improved data.
> >
> > I have done some cleanup myself mostly around Kalmar, but in the huge
> > Northern parts of the country it seems unmanageable. Some users have
> > made good progress in the Bergslagen area though.
> 
> Well - the question you should probably ask yourselves is if this data 
> is of any help when you map in these areas.  I find it doubtful that it 
> is and areas like here:
> 
> http://www.openstreetmap.org/#map=13/56.7935/16.0168
> 
> support this impression.  IMO it would be a good idea to concentrate on 
> what you gain and not look too much at what you lose.
> 
> If there are worries in the local community about how to efficiently map 
> large wooded areas there are other methods that would be much better 
> suited.  Forests can be positively detected on multispectral imagery 
> with good reliability - in contrast to land cover classification data 
> sets like Corine which essentially only specify the least unlikely of a 
> fixed set of classes.  Producing a conservative data set this way (i.e. 
> one that only includes areas which are clearly wooded), splitting this 
> into reasonably small chunks and providing this to mappers to avoid the 
> need for a lot of large scale tracing work seems a much more productive 
> way and much more compatible with normal manual mapping in OSM.

Let's not get this thread hijacked by theoretical ideas about how to
detect wooded areas. This thread is about broken multipolygons.

The issue at hand here is that there are a lot of broken multipolygons
out there. Some are probably from Corine data. Now there are these
options:

a) Do nothing. Broken MPs will disappear once osm2pgsql switches.
b) Remove existing MPs and start from scratch.
c) Repair existing MPs.

Judging from some of the posts in this thread, c) doesn't seem to be a
good option. Both a) and b) will result in those MPs disappearing from
the map first, before things get better. In the case of a) they will all
disappear one day, but the broken data will still be there. In the case
of b) we can go through them and replace them with better data piece by
piece.

I am willing to talk with the Swedish community (or any other) about
how best to approach this. I can generate special MapRoulette challenges
for specific areas; in fact I think this is a better way than having
generic challenges for the whole world. With a generic worldwide
challenge you might be thrown from an error in Borneo to one in Sweden
to one in Antarctica, and the problems are different everywhere. I'd
rather have more specific challenges addressing exactly one problem, for
instance "Broken multipolygons of certain types of landcover data
imported from Corine in Sweden". That is something I can extract from
OSM data, and after consulting with local mappers on how best to fix it,
I can explain to people how to do so.
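
To give an idea, something along the following lines (Python with
pyosmium) could produce the list of relation IDs for such a challenge
from a country extract. The CLC:* and source tag checks are only my
guess at how the import was tagged and would need to be confirmed
against the actual data, and the test only catches rings that don't
close or member ways missing from the extract, not every kind of
breakage, so take it as a sketch rather than a finished tool:

from collections import Counter
import sys

import osmium


def looks_like_corine(tags):
    # Heuristic: CLC:* keys or "corine" in the source tag (assumption!).
    for tag in tags:
        if tag.k.startswith('CLC:'):
            return True
    return 'corine' in tags.get('source', '').lower()


class RelationCollector(osmium.SimpleHandler):
    # Pass 1: remember candidate relations and which ways they reference.
    def __init__(self):
        super().__init__()
        self.candidates = {}      # relation id -> list of member way ids
        self.needed_ways = set()

    def relation(self, r):
        if r.tags.get('type') != 'multipolygon' or not looks_like_corine(r.tags):
            return
        way_ids = [m.ref for m in r.members if m.type == 'w']
        self.candidates[r.id] = way_ids
        self.needed_ways.update(way_ids)


class WayEndpoints(osmium.SimpleHandler):
    # Pass 2: record the endpoint node ids of the ways we care about.
    def __init__(self, needed):
        super().__init__()
        self.needed = needed
        self.endpoints = {}       # way id -> (first node id, last node id)

    def way(self, w):
        if w.id in self.needed:
            refs = [n.ref for n in w.nodes]
            if len(refs) >= 2:
                self.endpoints[w.id] = (refs[0], refs[-1])


def find_broken(candidates, endpoints):
    # Flag relations whose member ways cannot pair up into closed rings.
    for rel_id, way_ids in candidates.items():
        counts = Counter()
        missing = False
        for wid in way_ids:
            ends = endpoints.get(wid)
            if ends is None:      # member way not in the extract
                missing = True
                break
            first, last = ends
            if first != last:     # closed ways form a ring on their own
                counts[first] += 1
                counts[last] += 1
        # Every open way end must be matched by another way end; an odd
        # count for any node means an unclosed ring.
        if missing or any(n % 2 for n in counts.values()):
            yield rel_id


if __name__ == '__main__':
    path = sys.argv[1]            # e.g. sweden-latest.osm.pbf
    rels = RelationCollector()
    rels.apply_file(path)
    ways = WayEndpoints(rels.needed_ways)
    ways.apply_file(path)
    for rel_id in find_broken(rels.candidates, ways.endpoints):
        print(rel_id)             # ids to feed into an area-specific challenge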

Jochen
-- 
Jochen Topf  joc...@remote.org  https://www.jochentopf.com/  +49-351-31778688
