When I need to work with polygons (especially for testing purposes, where the same procedure has to run several times) I usually create a raster with polygonsToRaster and use this new raster (made from the polygons) to mask subsequent rasters. I have found this to be faster than extracting the values with polygonValues.
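A minimal sketch of the workflow Roman describes, assuming the raster package's polygonsToRaster() (later renamed rasterize()) and mask(); the file paths are placeholders taken from Jorge's post below:

library(raster)
library(maptools)                            # for readShapePoly()

poly_map <- readShapePoly("shapefile path")  # placeholder path
template <- raster("raster path")            # placeholder path

# rasterize the polygons once...
poly_ras <- polygonsToRaster(poly_map, template)

# ...then reuse the result to mask any raster on the same grid
masked <- mask(raster("raster path"), poly_ras)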
HTH,
Roman

On Thu, Sep 2, 2010 at 12:35 AM, Robert J. Hijmans <r.hijm...@gmail.com> wrote:

Jorge,

You could probably test using a subset of your raster, e.g.:

plot(raster_map)
e <- drawExtent()   # now draw a smallish rectangle on the map
raster_map <- crop(raster_map, e)

and so on.

I would use polygonValues if there were relatively few, or only very small, polygons. From your description it appears that your polygons cover a large part of the extent of your raster, so I would go with polygonsToRaster followed by zonal. Perhaps something along these lines:

library(raster)
library(maptools)   # for readShapePoly()

# first set up example polygons and a raster
nc <- readShapePoly(system.file("shapes/sids.shp", package="maptools")[1],
                    proj4string=CRS("+proj=longlat +datum=NAD27"))
r <- raster(nc)
res(r) <- 0.1
v <- runif(ncell(r))
v[v < 0.2] <- NA
r[] <- round(v)

p <- polygonsToRaster(nc, r)
# table with counts of raster values for all polygons (but not for NA cells)
a <- crosstab(p, r)
# counts of cells for all polygons
b <- freq(p)

An example of the performance of polygonsToRaster vs. polygonValues:

> nc <- readShapePoly(system.file("shapes/sids.shp", package="maptools")[1], proj4string=CRS("+proj=longlat +datum=NAD27"))
> r <- raster(nc)
> res(r) <- 0.1
> r[] <- round(runif(ncell(r)))
> system.time( polygonsToRaster(nc, r) )
Found 100 region(s) and 108 polygon(s)
   user  system elapsed
   4.91    0.00    4.92
> system.time( polygonValues(nc, r) )
   user  system elapsed
  40.69    0.09   40.78

Robert

--
In God we trust, all others bring data.
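Robert's crosstab()/freq() pair already holds everything needed for the percentages Jorge asks for below; a minimal sketch of the final division step, assuming crosstab() returns a zone-by-value contingency table (its return type has varied across raster versions) and reusing p and r from the example above:

a <- crosstab(p, r)   # counts of 0 and 1 cells per polygon (NA cells excluded)
b <- freq(p)          # total cell counts per polygon, NA cells included

# line up the polygon ids, then divide the counts by the totals
totals <- b[match(rownames(a), b[, "value"]), "count"]
pct01  <- sweep(unclass(a), 1, totals, "/") * 100  # % of 0s and % of 1s
pct_na <- 100 - rowSums(pct01)                     # the remainder is the NA share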
On Wed, Sep 1, 2010 at 3:03 PM, Jorge Fernando Saraiva de Menezes <jorgefernandosara...@gmail.com> wrote:

Dear list,

I have a raster with a grid resolution of 250 m² and values of 0, 1, or NA, and a shapefile with about 5000 polygons whose areas range from 221 m² to 323,629,571,955 m², which means that a single polygon might contain over a billion cells (fortunately, there are very few of these).

I would like to obtain, for each polygon (island), the percentage of 0, 1, and NA cells. I thought the following could work:

polygonValues method:

poly_map <- readShapePoly("shapefile path")
raster_map <- raster("raster path")
objective <- polygonValues(poly_map, raster_map)

zonal method:

poly_map <- readShapePoly("shapefile path")
raster_map <- raster("raster path")
rasterofpoly <- polygonsToRaster(poly_map, raster_map)
objective <- zonal(raster_map, rasterofpoly, stat="sum", na.rm=TRUE)

polygonsToRaster method:

poly_map <- readShapePoly("shapefile path")
raster_map <- raster("raster path")
objective <- polygonsToRaster(poly_map, raster_map, mask=TRUE, filename="path to new raster")

All of these take a long time to run, and I don't know a quick way of testing their efficiency, so I would like to know which of them would perform the calculation fastest in this situation, and whether any of them is viable at all (i.e. would not take months or years).

I have access to a workstation with 4 processors (1 GHz each, I think) and 8 GB of memory (not sure about this configuration), running Ubuntu.

Thanks in advance,
Jorge Menezes
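Jorge's "quick way of testing efficiency" question is what Robert's crop() suggestion above addresses; a minimal sketch that puts the two together, with the placeholder paths from the post (note that polygonValues on a crop can still be slow for polygons that extend far beyond the test rectangle):

library(raster)
library(maptools)   # for readShapePoly()

poly_map   <- readShapePoly("shapefile path")  # placeholder paths from the post
raster_map <- raster("raster path")

# draw a smallish test rectangle on the map, then crop to it
plot(raster_map)
e     <- drawExtent()
small <- crop(raster_map, e)

# time each candidate on the subset before committing to the full run
system.time(polygonValues(poly_map, small))
system.time({
  rp <- polygonsToRaster(poly_map, small)
  zonal(small, rp, stat="sum", na.rm=TRUE)
})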