On 18/09/13 00:10, Markus Neteler wrote:
Hi,

I came across this question:

http://gis.stackexchange.com/questions/71734/how-to-calculate-mean-coordinates-from-big-point-datasets

and wondered if this approach would be the fastest:

# http://grass.osgeo.org/sampledata/north_carolina/points.las
v.in.lidar input=points.las output=lidarpoints -o
...
Number of points: 1287775
...

Now I would use
v.univar -d lidarpoints type=point

This, however, calculates geometry distances, not the mean coordinate (example with the smaller myhosp map):

$ v.univar -d myhosp type=point

number of primitives: 160
number of non zero distances: 12561
number of zero distances: 0
minimum: 9.16773
maximum: 760776
range: 760767
sum: 2.69047e+09
mean: 214193
mean of absolute values: 214193
population standard deviation: 128505
population variance: 1.65136e+10
population coefficient of variation: 0.599953
sample standard deviation: 128511
sample variance: 1.6515e+10
kurtosis: 0.277564
skewness: 0.801646



Is this the best way?

How about v.to.db -p op=coor and then calculating the mean of the coordinates with an ad-hoc script? But that's probably not any faster than Hamish's v.out.ascii approach in v.points.cog.
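
For reference, a rough (untested) sketch of such an ad-hoc script, assuming the default pipe-separated x|y|cat point output of v.out.ascii and the lidarpoints map from above:

v.out.ascii input=lidarpoints | \
  awk -F'|' '{ sx += $1; sy += $2; n++ }
             END { printf "mean x: %.3f  mean y: %.3f\n", sx/n, sy/n }'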

It would probably be a nice little module to have in C: calculate the centroid of a polygon, the center point of a line, and the (possibly weighted) centroid of a point set.
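
The weighted point case is just sum(w*x)/sum(w) and sum(w*y)/sum(w); assuming the weight were exported as a third pipe-separated column, the awk part of the sketch above would become:

awk -F'|' '{ sx += $1*$3; sy += $2*$3; sw += $3 }
           END { printf "weighted mean x: %.3f  y: %.3f\n", sx/sw, sy/sw }'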

Moritz

