On Tue, Dec 7, 2010 at 7:53 PM, benmarillier <[email protected]> wrote:
>
> Hi All,
>
> I am having similar problems to Kaipi in regards to the v.lidar.growing
> stage of LiDAR processing. I'm running GRASS 6.4.0 on Windows. My dataset is
> a 1 km by 1 km tile of LiDAR with a point spacing of about 1.5 points/m², so a
> total of about 1.5 million points in the area of interest.
>
> Thus far I've imported the points with v.in.ascii and split them into first
> returns and last returns. I interpreted these two classes as follows (please
> correct me if I'm wrong):
>
> First returns: all first returns, including single returns where only one
> return has been received.
> Last returns: all single returns, plus the last return where multiple returns
> have been received.
>
> Then I removed outliers as per the micro-tutorial on the GRASS wiki.
>
> Next I ran v.lidar.edgedetection on the last returns with the default
> parameters.
>
> This resulted in a reasonable-looking classification of edge (2), not-edge
> (1) and uncertain (3). The edges of the buildings and trees are quite well
> defined, as shown in the image below, so far so good...
>
> ...however, when I run v.lidar.growing (default parameters), the output I
> get is exactly the same as the edgedetection output. The points are
> classified identically into 1, 2 and 3, and there appears to have been no
> change in the classification. I've tried varying the growing parameters and
> changing the region resolution, but the output is always the same.
>
> I've trawled through the forums to no avail, so any advice would be greatly
> appreciated.
>
> Thanks,
> Ben
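For reference, the workflow Ben describes could be sketched roughly as the GRASS 6.4 session below. The file names, column positions, field separator, and threshold values are all assumptions for illustration; the commands must be run inside a GRASS session with the region set over the tile, and parameters should be tuned to the data. Note in particular that v.lidar.growing takes the first-return map as a separate required input (first=), not only the edge-detection output.

```
# Set the computational region resolution (affects the spline interpolation)
g.region res=2 -p

# Import first and last returns as 3D point vectors (x,y,z in columns 1-3;
# comma-separated input assumed)
v.in.ascii -z input=first_returns.txt output=first_raw format=point fs="," x=1 y=2 z=3
v.in.ascii -z input=last_returns.txt  output=last_raw  format=point fs="," x=1 y=2 z=3

# Remove outliers from the last returns (thres_o = outlier threshold, map units)
v.outlier input=last_raw output=last_clean outlier=last_outliers thres_o=25

# Edge detection on the cleaned last returns (default thresholds)
# Output points are classified: 1 = not-edge, 2 = edge, 3 = uncertain
v.lidar.edgedetection input=last_clean output=last_edges

# Region growing: needs BOTH the edge-detection result and the first returns.
# tj and td shown at their documented defaults.
v.lidar.growing input=last_edges output=last_grown first=first_raw tj=0.2 td=0.6
```

If the growing output is byte-identical to the edge-detection input, it may be worth checking that the first= map actually covers the same region and that the module is not silently reusing stale auxiliary tables from an earlier run (re-running with fresh output names is a cheap way to rule that out).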
Hi. Is this urban, rural, or mixed land use?

Mark

_______________________________________________
grass-user mailing list
[email protected]
http://lists.osgeo.org/mailman/listinfo/grass-user
