Hi all. I've run into an interesting problem; before opening a few tickets I'd like to discuss a course of action.

1. Some software (namely GeoMedia) produces shapefiles with very slight differences between exports of the same layer (modified and exported several times); I suppose this is due to truncation/rounding in the coordinate precision.
2. As a result, the difference between these layers produces a number of virtually zero-width polygons (slivers), plus some topological errors.
3. This in turn leads to two problems:
   3.1. Subsequent analyses fail because of the errors, and the user has no clue as to why.
   3.2. The micro-areas are tricky and time-consuming to remove: if they are isolated, one possible solution is to select and remove those smaller than an arbitrary threshold, but if they are attached to some "real" polygon this will not work.

I would therefore suggest a few improvements to the toolchain:
* add a parameter to the analyses, to either snap the original data within a threshold before running the analysis, or to clean up the results afterwards (see the sketches below);
* check the source layers for topological errors and warn the user of possibly wrong results; this should probably be optional, as the check will slow down the analysis and may be pointless after the first cleanup.
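To make the first suggestion concrete, here is a minimal sketch of pre-snapping using Shapely (just an illustration outside QGIS; the coordinates and the 0.01 tolerance are made up for the example):

from shapely.geometry import Polygon
from shapely.ops import snap

# Two exports of the "same" polygon, differing only by coordinate truncation
a = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
b = Polygon([(0.0001, 0), (10, 0.0001), (10, 10), (0, 10)])

# Snap b's vertices onto a within the tolerance before differencing,
# so the diff no longer produces sliver polygons
b_snapped = snap(b, a, 0.01)

print(a.difference(b).area)          # tiny positive area: the slivers
print(a.difference(b_snapped).area)  # 0.0: slivers are gone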
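And a sketch of the post-hoc cleanup, dropping result polygons below an area threshold (the MIN_AREA value is arbitrary; as noted above this only helps with isolated slivers, not those attached to real polygons):

from shapely.geometry import Polygon

MIN_AREA = 1e-6  # arbitrary threshold, in squared layer units

def drop_slivers(polygons, min_area=MIN_AREA):
    """Keep only polygons whose area is at or above the threshold."""
    return [p for p in polygons if p.area >= min_area]

results = [
    Polygon([(0, 0), (10, 0), (10, 10), (0, 10)]),        # real polygon
    Polygon([(0, 0), (10, 0), (10, 1e-8), (0, 1e-8)]),    # sliver from the diff
]
print(len(drop_slivers(results)))  # 1 -- the sliver is removed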
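For the warning, something along the lines of Shapely's validity check could be run on the source layers (again just a sketch; explain_validity reports the reason and the location of the error):

from shapely.geometry import Polygon
from shapely.validation import explain_validity

# A self-intersecting "bowtie" polygon, a typical topological error
bowtie = Polygon([(0, 0), (10, 10), (10, 0), (0, 10)])

if not bowtie.is_valid:
    # In QGIS this could surface as a pre-analysis warning to the user
    print("Warning, invalid source geometry:", explain_validity(bowtie))

A common repair for such cases is buffering by zero (geom.buffer(0)), though it can silently alter the geometry, which is why an explicit warning seems preferable.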
Sample data available for those interested.
All the best.
--
Paolo Cavallini - www.faunalia.eu
QGIS & PostGIS courses: http://www.faunalia.eu/training.html
