Thanks Etienne.

On Tue, Jun 8, 2010 at 12:31 AM, Etienne Bellemare Racine <etienn...@gmail.com> wrote:
> Michael,
>
> I have not gone through extensive testing, but it seems pretty fast and
> useful for my 1.0 las. I've loaded a 235 Mb file on a USB drive, and it ran
> in ~20 seconds. SAGA, in comparison, did it in 3 min 22 seconds.
>
> Cheers,
> Etienne
>
> On 2010-06-06 10:18, Michael Sumner wrote:
> > Hello,
> >
> > > The other issue I see here is that R is loading the whole file into
> > > memory, so while you can manage small files, that might not be as easy
> > > with (standard) larger ones. Don't you think?
> >
> > That was certainly true of the version of the code I posted, but writing a
> > more flexible version is not difficult, and actually less difficult than I
> > expected. I've implemented arguments to "skip" and to read "nrows" at a
> > time, so this is the beginning of a wrapper around the core read for
> > building more flexibility.
> > (I was thinking of including subsetting of various kinds, which really
> > makes it more complicated; the appropriate level to handle that is in a
> > wrapper to this function.)
> > I've updated the R source on my site, and here's a new example. This
> > should be considered a rough working draft; the details can be hidden in
> > the final suite of functions. My chunk/rows handling is pretty awkward,
> > and may have bugs for particular record numbers.
> > Any testing you can provide would be greatly appreciated.
> >
> > # new version with "skip" and "nrows" arguments
> > source("http://staff.acecrc.org.au/~mdsumner/las/readLAS.R")
> >
> > f <- "lfile.las"
> >
> > ## get just the header
> > hd <- readLAS(f, returnHeaderOnly = TRUE)
> > numrows <- hd$`Number of point records`
> > ## [1] 1922632
> >
> > ## read in chunks, and pass to DB or ff, or subset by sampling, etc.
> > rowskip <- 0
> > chunk <- 1e5
> > rowsleft <- numrows
> >
> > system.time({
> >   ## keep track of how many rows we have skipped, and how many are left
> >   for (i in 1:ceiling(numrows / chunk)) {
> >     if (rowsleft < chunk) chunk <- rowsleft
> >     if (chunk < 1) break
> >     d <- readLAS(f, skip = rowskip, nrows = chunk)
> >     rowskip <- rowskip + chunk
> >     rowsleft <- rowsleft - chunk
> >   }
> > })
> > #  user  system elapsed
> > #  1.10    0.55    1.64
> >
> > On Sun, Jun 6, 2010 at 8:09 PM, Etienne Bellemare Racine
> > <etienn...@gmail.com> wrote:
> > > This is interesting, I'll try your code on my lidar files in the next
> > > few days.
> > >
> > > On 2010-06-04 22:36, Michael Sumner wrote:
> > > > Thanks Alex, I will eventually post this to a broader audience.
> > > > I've used liblas and lastools, but the aim here is a pure R
> > > > implementation that is built directly from the LAS specification,
> > > > without 3rd-party tools.
> > >
> > > What might be of interest in using liblas is that it provides support
> > > for many las versions, and they plan to support versions to come
> > > (conditional on funding), so having an R binding might be of interest
> > > here. They are also working on the integration of a spatial index,
> > > which would allow easier handling of large files. I must say I don't
> > > know how hard writing an R wrapper might be for that particular tool.
> > > The other issue I see here is that R is loading the whole file into
> > > memory, so while you can manage small files, that might not be as easy
> > > with (standard) larger ones. Don't you think? Did you give the R SAGA
> > > package a try? There is a module for loading las files but, again, I
> > > don't know how it manages memory. I guess it could be possible to use
> > > some sort of ff package to handle bigger files, but that's just off the
> > > top of my head.
> > >
> > > Etienne
> > >
> > > > The R code already works quite well to extract x/y/z/time/intensity;
> > > > it just needs some extra work to tidy up and generalize things and
> > > > ensure that very big datasets can be read.
> > > >
> > > > Cheers, Mike.
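[Editor's note: the chunked read above pairs naturally with the "subset by sampling" idea Mike mentions, keeping a random fraction of each chunk so the full file never sits in memory. A minimal sketch; `sample_chunks` and `read_chunk` are hypothetical names, and the reader is mocked with an in-memory table purely so the example is self-contained. In practice `read_chunk` would be something like `function(skip, nrows) readLAS(f, skip = skip, nrows = nrows)`.]

```r
## Chunk-wise random subsampling: read 'chunk' rows at a time and keep a
## fraction of each chunk, so only one chunk is ever in memory at once.
## 'read_chunk' is any function(skip, nrows) returning a data frame.
sample_chunks <- function(n_total, read_chunk, chunk = 1e5, frac = 0.01) {
  out <- list()
  skip <- 0
  while (skip < n_total) {
    nrows <- min(chunk, n_total - skip)   # last chunk may be short
    d <- read_chunk(skip, nrows)
    keep <- sample(nrow(d), ceiling(frac * nrow(d)))
    out[[length(out) + 1L]] <- d[keep, , drop = FALSE]
    skip <- skip + nrows
  }
  do.call(rbind, out)
}

## Mock reader over an in-memory table (stands in for readLAS on a file)
pts <- data.frame(x = runif(250000), y = runif(250000), z = runif(250000))
read_chunk <- function(skip, nrows) pts[(skip + 1):(skip + nrows), ]

sub <- sample_chunks(nrow(pts), read_chunk, chunk = 1e5, frac = 0.01)
nrow(sub)   # 1000 + 1000 + 500 = 2500 rows kept
```

The same loop shape works for pushing each chunk into a DB table or an ff object instead of sampling it.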
> > > > On Sat, Jun 5, 2010 at 6:07 AM, Alex Mandel <tech_...@wildintellect.com>
> > > > wrote:
> > > > > On 06/03/2010 07:54 PM, Michael Sumner wrote:
> > > > > > Hello,
> > > > > > I'm looking for interest in this functionality, and eventually
> > > > > > working it into a package.
> > > > > > I don't actually use LAS data much, and only have one example
> > > > > > file, so I'm hoping others who do, or who know others who would be
> > > > > > interested, can help. I have previously posted this to
> > > > > > r-spatial-devel.
> > > > >
> > > > > I think there are people who would use it. You might want to have a
> > > > > look at http://liblas.org/ (some of the same people that do gdal/ogr
> > > > > work). Wrapping this library might be a good approach. There are
> > > > > example files available too.
> > > > >
> > > > > Thanks,
> > > > > Alex
> > > > >
> > > > > _______________________________________________
> > > > > R-sig-Geo mailing list
> > > > > R-sig-Geo@stat.math.ethz.ch
> > > > > https://stat.ethz.ch/mailman/listinfo/r-sig-geo
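[Editor's note: the "pure R built directly from the LAS specification" approach discussed above comes down to readBin() calls against the public header layout. Below is a minimal sketch of a header reader, assuming the LAS 1.0 byte offsets ("LASF" signature at byte 0, version at 24, offset to point data at 96, point count at 107); `read_las_header` is a hypothetical helper, not the readLAS() from Mike's site.]

```r
## Minimal LAS 1.0 public-header reader in base R (fields are little-endian).
## Caveat: readBin() cannot read unsigned 32-bit integers, so point counts
## above 2^31 - 1 would come back negative; acceptable for a sketch.
read_las_header <- function(file) {
  con <- file(file, open = "rb")
  on.exit(close(con))
  sig <- readChar(con, 4, useBytes = TRUE)
  if (!identical(sig, "LASF")) stop("not a LAS file: bad signature")
  seek(con, 24)  # skip reserved bytes and project GUID
  ver <- readBin(con, "integer", n = 2, size = 1, signed = FALSE)
  seek(con, 96)  # skip system/software IDs, dates, header size
  offset_to_data <- readBin(con, "integer", size = 4, endian = "little")
  n_vlr          <- readBin(con, "integer", size = 4, endian = "little")
  point_format   <- readBin(con, "integer", size = 1, signed = FALSE)
  record_length  <- readBin(con, "integer", size = 2, signed = FALSE,
                            endian = "little")
  n_points       <- readBin(con, "integer", size = 4, endian = "little")
  list(Version = paste(ver, collapse = "."),
       `Offset to point data` = offset_to_data,
       `Point data format` = point_format,
       `Point data record length` = record_length,
       `Number of point records` = n_points)
}
```

From there, reading point records is a seek() to the data offset plus skip * record length, then readBin() over the packed fields, which is essentially what the skip/nrows arguments in the code above do.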