Re: [R-sig-Geo] install rgdal from source macOS

2019-06-20 Thread Dominik Schneider
6/Resources/library/sp/include"
-isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk
-I/usr/local/include  -fPIC  -Wall -g -O2  -c projectit.cpp -o projectit.o
clang++ -std=gnu++11 -dynamiclib -Wl,-headerpad_max_install_names
-undefined dynamic_lookup -single_module -multiply_defined suppress
-L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o
rgdal.so OGR_write.o gdal-bindings.o init.o inverser.o local_stubs.o
ogr_geom.o ogr_polygons.o ogr_proj.o ogrdrivers.o ogrsource.o proj_info6.o
projectit.o -L/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib -lgdal
-L/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib -lproj
-F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework
-Wl,CoreFoundation
installing to
/Library/Frameworks/R.framework/Versions/3.6/Resources/library/00LOCK-rgdal/00new/rgdal/libs
** R
** data
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
Error: package or namespace load failed for ‘rgdal’ in dyn.load(file,
DLLpath = DLLpath, ...):
 unable to load shared object
'/Library/Frameworks/R.framework/Versions/3.6/Resources/library/00LOCK-rgdal/00new/rgdal/libs/rgdal.so':

dlopen(/Library/Frameworks/R.framework/Versions/3.6/Resources/library/00LOCK-rgdal/00new/rgdal/libs/rgdal.so,
6): Library not loaded: @rpath/libgdal.26.dylib
  Referenced from:
/Library/Frameworks/R.framework/Versions/3.6/Resources/library/00LOCK-rgdal/00new/rgdal/libs/rgdal.so
  Reason: image not found
Error: loading failed
Execution halted
ERROR: loading failed
* removing
‘/Library/Frameworks/R.framework/Versions/3.6/Resources/library/rgdal’

The downloaded source packages are in
‘/private/var/folders/bb/13z2kq0j01jg5q_0ypjc76grgn/T/Rtmpqaf8Yh/downloaded_packages’
Warning message:
In install.packages("rgdal", repo = "https://r-forge.r-project.org") :
  installation of package ‘rgdal’ had non-zero exit status
>
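A hedged diagnostic sketch for the dlopen failure above (otool and install_name_tool are standard macOS developer tools; /path/to/rgdal.so is a placeholder for wherever the built library ends up, and the conda path is the one used in this thread):

```
# show which dylibs rgdal.so references, and whether they use @rpath or absolute paths
otool -L /path/to/rgdal.so

# workaround 1: let dyld fall back to the conda env's lib directory before starting R
export DYLD_FALLBACK_LIBRARY_PATH=/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib

# workaround 2: bake the search path into the built library itself
install_name_tool -add_rpath /Users/dosc3612/Applications/miniconda3/envs/rgdal/lib /path/to/rgdal.so
```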

On Tue, Jun 11, 2019 at 10:32 AM Edzer Pebesma <
edzer.pebe...@uni-muenster.de> wrote:

> for compiling rgdal from source on OSX against gdal 3 and proj 6.1.0,
> please use the dev version on r-forge. See also
> https://github.com/r-spatial/sf/issues/1070
>
>
> On 6/9/19 9:52 PM, Dominik Schneider wrote:
> > I'm trying to install rgdal from source on a fresh R 3.6 installation.
> > I set up a conda environment to install gdal, proj, and geos:
> >
> > install.packages('rgdal', type = "source",
> >
> configure.args=c('--with-proj-include=/Users/dosc3612/Applications/miniconda3/envs/rgdal/include',
> > '--with-proj-lib=/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib',
> >
> '--with-gdal-config=/Users/dosc3612/Applications/miniconda3/envs/rgdal/bin/gdal-config'))
> >
> > The full output is below but the first error I see is:
> > ./configure: line 2101: pkg-config: command not found
> > configure: pkg-config proj not available
> >   set PKG_CONFIG_PATH to the directory containing proj.pc
> > configure: PROJ version not determined using pkg-config proj
> >
> > which I don't understand because:
> > Sys.getenv('PKG_CONFIG_PATH')
> > [1] "/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib/pkgconfig"
> >
> > and just to confirm:
> > (rgdal) Phoenix:~ $ ls `echo $PKG_CONFIG_PATH` | grep proj
> > proj.pc
> >
> > *Is there another way to set PKG_CONFIG_PATH within install.packages?*
> >
> > There are also some undefined symbols:
> > Undefined symbols for architecture x86_64:
> >   "_pj_ctx_fclose", referenced from:
> >   _main in proj_conf_test2-9a39dc.o
> >   "_pj_get_default_ctx", referenced from:
> >   _main in proj_conf_test2-9a39dc.o
> >   "_pj_open_lib", referenced from:
> >   _main in proj_conf_test2-9a39dc.o
> > ld: symbol(s) not found for architecture x86_64
> > clang: error: linker command failed with exit code 1 (use -v to see
> > invocation)
> > ./configure: line 2422: ./proj_conf_test2: No such file or directory
> >
> > I tried with gdal 2.4.1 and gdal 3.0.0 (after seeing
> >
> http://r-sig-geo.2731867.n2.nabble.com/GDAL-3-0-0-and-rgdal-tt7592822.html#a7592834
> > )
> > proj is 6.1
> > geos is 3.7.1
> >
> > What did I miss? Please advise. Thanks.
> >
> > FULL OUTPUT:
> >> install.packages('rgdal', type = "source",
> >
> configure.args=c('--with-proj-include=/Users/dosc3612/Applications/miniconda3/envs/rgdal/include',
> > '--with-proj-lib=/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib',
> >
> '--with-gdal-config=/Users/dosc3612/Applications/miniconda3/envs/rgdal/bin/gd

[R-sig-Geo] install rgdal from source macOS

2019-06-09 Thread Dominik Schneider
I'm trying to install rgdal from source on a fresh R 3.6 installation.
I set up a conda environment to install gdal, proj, and geos:

install.packages('rgdal', type = "source",
configure.args=c('--with-proj-include=/Users/dosc3612/Applications/miniconda3/envs/rgdal/include',
'--with-proj-lib=/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib',
'--with-gdal-config=/Users/dosc3612/Applications/miniconda3/envs/rgdal/bin/gdal-config'))

The full output is below but the first error I see is:
./configure: line 2101: pkg-config: command not found
configure: pkg-config proj not available
  set PKG_CONFIG_PATH to the directory containing proj.pc
configure: PROJ version not determined using pkg-config proj

which I don't understand because:
Sys.getenv('PKG_CONFIG_PATH')
[1] "/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib/pkgconfig"

and just to confirm:
(rgdal) Phoenix:~ $ ls `echo $PKG_CONFIG_PATH` | grep proj
proj.pc

*Is there another way to set PKG_CONFIG_PATH within install.packages?*
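One approach, sketched below and untested against this exact setup: set the variable with Sys.setenv() before calling install.packages(). Since configure also reports "pkg-config: command not found", the conda env's bin directory likely needs to be on PATH for the configure shell as well:

```r
# paths are the conda-env paths from this thread
Sys.setenv(PKG_CONFIG_PATH = "/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib/pkgconfig")
Sys.setenv(PATH = paste("/Users/dosc3612/Applications/miniconda3/envs/rgdal/bin",
                        Sys.getenv("PATH"), sep = ":"))
install.packages("rgdal", type = "source")
```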

There are also some undefined symbols:
Undefined symbols for architecture x86_64:
  "_pj_ctx_fclose", referenced from:
  _main in proj_conf_test2-9a39dc.o
  "_pj_get_default_ctx", referenced from:
  _main in proj_conf_test2-9a39dc.o
  "_pj_open_lib", referenced from:
  _main in proj_conf_test2-9a39dc.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see
invocation)
./configure: line 2422: ./proj_conf_test2: No such file or directory
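For context, the _pj_* names come from PROJ's pre-6 API, which PROJ 6 only exposes through the deprecated proj_api.h. A hedged way to check whether the library being linked still exports them (nm is a standard Xcode/binutils tool; the path is this thread's conda env):

```
nm -g /Users/dosc3612/Applications/miniconda3/envs/rgdal/lib/libproj.dylib | grep _pj_open_lib
```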

I tried with gdal 2.4.1 and gdal 3.0.0 (after seeing
http://r-sig-geo.2731867.n2.nabble.com/GDAL-3-0-0-and-rgdal-tt7592822.html#a7592834
)
proj is 6.1
geos is 3.7.1

What did I miss? Please advise. Thanks.

FULL OUTPUT:
> install.packages('rgdal', type = "source",
configure.args=c('--with-proj-include=/Users/dosc3612/Applications/miniconda3/envs/rgdal/include',
'--with-proj-lib=/Users/dosc3612/Applications/miniconda3/envs/rgdal/lib',
'--with-gdal-config=/Users/dosc3612/Applications/miniconda3/envs/rgdal/bin/gdal-config'))
--- Please select a CRAN mirror for use in this session ---
trying URL 'https://cloud.r-project.org/src/contrib/rgdal_1.4-4.tar.gz'
Content type 'application/x-gzip' length 1687518 bytes (1.6 MB)
==
downloaded 1.6 MB

* installing *source* package ‘rgdal’ ...
** package ‘rgdal’ successfully unpacked and MD5 sums checked
** using staged installation
configure: R_HOME: /Library/Frameworks/R.framework/Resources
configure: CC: clang
configure: CXX: clang++ -std=gnu++11
configure: C++11 support available
configure: rgdal: 1.4-4
checking for /usr/bin/svnversion... yes
configure: svn revision: 833
configure: gdal-config set to
/Users/dosc3612/Applications/miniconda3/envs/rgdal/bin/gdal-config
checking gdal-config exists... yes
checking gdal-config executable... yes
checking gdal-config usability... yes
configure: GDAL: 2.4.1
checking C++11 support for GDAL >= 2.3.0... yes
checking GDAL version >= 1.11.4... yes
checking GDAL version <= 2.5 or >= 3.0... yes
checking gdal: linking with --libs only... yes
checking GDAL: gdal-config data directory readable... yes
checking GDAL:
/Users/dosc3612/Applications/miniconda3/envs/rgdal/share/gdal/pcs.csv
readable... yes
./configure: line 2101: pkg-config: command not found
configure: pkg-config proj not available
  set PKG_CONFIG_PATH to the directory containing proj.pc
configure: PROJ version not determined using pkg-config proj
configure: proj CPP flags:
 -I/Users/dosc3612/Applications/miniconda3/envs/rgdal/include
-I/Users/dosc3612/Applications/miniconda3/envs/rgdal/include
checking PROJ header API:... proj_api.h
checking proj_api.h presence and usability... yes
checking PROJ version >= 4.8.0... yes
Undefined symbols for architecture x86_64:
  "_pj_ctx_fclose", referenced from:
  _main in proj_conf_test2-9a39dc.o
  "_pj_get_default_ctx", referenced from:
  _main in proj_conf_test2-9a39dc.o
  "_pj_open_lib", referenced from:
  _main in proj_conf_test2-9a39dc.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see
invocation)
./configure: line 2422: ./proj_conf_test2: No such file or directory
checking PROJ.4: proj.db found and readable... yes
Undefined symbols for architecture x86_64:
  "_pj_ctx_fclose", referenced from:
  _main in proj_conf_test3-9f8df7.o
  "_pj_get_default_ctx", referenced from:
  _main in proj_conf_test3-9f8df7.o
  "_pj_open_lib", referenced from:
  _main in proj_conf_test3-9f8df7.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see
invocation)
./configure: line 2482: ./proj_conf_test3: No such file or directory
checking PROJ.4: conus found and readable... yes
configure: Package CPP flags:
 -I/Users/dosc3612/Applications/miniconda3/envs/rgdal/include
-I/Users/dosc3612/Applications/miniconda3/envs/rgdal/include
-DACCEPT_USE_OF_DEPRECATED_PROJ_API_H

Re: [R-sig-Geo] [DKIM] Re: Interpolating snowfall values on a Digital Elevation Model [SEC=UNCLASSIFIED]

2018-02-20 Thread Dominik Schneider
>
> The effects of spatial reference systems on interpolations and accuracy
> are minimal, and lat and long can be used.

Fair enough, thanks for sending the references. But, as far as I know, the
kriging functions in R still don't accept lat/long.
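A minimal sketch of that reprojection step (pts is a hypothetical SpatialPointsDataFrame in EPSG:4326; pick the UTM zone, here 33N, that matches your study area):

```r
library(sp)
library(rgdal)  # provides the spTransform method for sp objects

# reproject from geographic lat/long to a projected UTM CRS before kriging
pts_utm <- spTransform(pts, CRS("+init=epsg:32633"))  # WGS84 / UTM zone 33N
```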



On Mon, Feb 19, 2018 at 8:54 PM, Li Jin <jin...@ga.gov.au> wrote:

> The effects of spatial reference systems on interpolations and accuracy
> are minimal, and lat and long can be used. Please see the following studies
> for details.
>
> Jiang, W., Li, J., 2013. Are Spatial Modelling Methods Sensitive to
> Spatial Reference Systems for Predicting Marine Environmental Variables,
> 20th International Congress on Modelling and Simulation: Adelaide,
> Australia, pp. 387-393.
> Jiang, W., Li, J., 2014. The effects of spatial reference systems on the
> predictive accuracy of spatial interpolation methods. Record 2014/01.
> Geoscience Australia: Canberra, pp 33. http://dx.doi.org/10.11636/
> Record.2014.001.
> Turner, A.J., Li, J., Jiang, W., 2017. Effects of Spatial Reference
> Systems on the Accuracy of Spatial Predictive Modelling along a Latitudinal
> Gradient, 22nd International Congress on Modelling and Simulation: Hobart,
> Tasmania, Australia, pp. 106-112.
>
>
> -Original Message-
> From: R-sig-Geo [mailto:r-sig-geo-boun...@r-project.org] On Behalf Of
> Dominik Schneider
> Sent: Wednesday, 14 February 2018 3:21 AM
> To: Stefano Sofia
> Cc: r-sig-geo@r-project.org
> Subject: [DKIM] Re: [R-sig-Geo] Interpolating snowfall values on a Digital
> Elevation Model
>
> You can't use a lat/long coordinate system when kriging because the
> concept of distance is ambiguous. Convert all your data to a UTM grid like you
> had in your first post and it should work.
>
> Another note: it looks like you are working at 0.01 deg, which is on the
> order of 1 km resolution, so you may find other covariates such as aspect,
> slope, wind sheltering/exposure, and terrain roughness useful for estimating
> snow on the ground. See some of the earliest papers by Carroll, Cressie,
> and Elder.
>
> Carroll, S. S., and N. Cressie (1996), A comparison of geostatistical
> methodologies used to estimate snow water equivalent, *JAWRA Journal of the
> American Water Resources Association*, *32*(2), 267–278,
> doi:10.1111/j.1752-1688.1996.tb03450.x.
>
> Carroll, S. S., and N. Cressie (1997), Spatial modeling of snow water
> equivalent using covariances estimated from spatial and geomorphic
> attributes, *Journal of Hydrology*, *190*(1-2), 42–59.
>
> Balk, B., and K. Elder (2000), Combining binary decision tree and
> geostatistical methods to estimate snow distribution in a mountain
> watershed, *Water Resources Research*, *36*(1), 13–26,
> doi:10.1029/1999WR900251.
>
> Erxleben, J., K. Elder, and R. Davis (2002), Comparison of spatial
> interpolation methods for estimating snow distribution in the Colorado
> Rocky Mountains, *Hydrological Processes*, *16*(18), 3627–3649,
> doi:10.1002/hyp.1239.
>
> Erickson, T. A., M. W. Williams, and A. Winstral (2005), Persistence of
> topographic controls on the spatial distribution of snow in rugged mountain
> terrain, Colorado, United States, *Water Resour. Res.*, *41*(4), W04014,
> doi:10.1029/2003WR002973.
>
>
> On Tue, Feb 13, 2018 at 3:45 AM, Stefano Sofia <
> stefano.so...@regione.marche.it> wrote:
>
> > Dear Daniel and list users,
> > I tried to follow the instructions but I encountered two kinds of errors.
> > This is a reproducibile code:
> >
> > 
> > ---
> > library(automap)
> > library(ggplot2)
> > library(gstat)
> > library(raster)
> > library(rasterVis)
> > library(rgdal)
> > library(maptools)
> >
> > ## LOADING DEM
> > ita_DEM <- getData('alt', country='ITA', mask=TRUE)
> > crs(ita_DEM) <- "+init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs
> > +ellps=WGS84 +towgs84=0,0,0"
> > #ita_DEM <- as(ita_DEM, "SpatialGridDataFrame")
> > str(ita_DEM)
> >
> > ## LOADING RAINFALL DATA
> > rain_data <- data.frame(Cumulata=c(11.8, 9.0, 8.0, 36.6, 9.4),
> > Long_Cent=c(12.61874, 12.78690, 12.96756, 13.15599, 13.28157),
> > Lat_Cent=c(43.79447, 43.85185, 43.76267, 43.03470, 43.08003),
> > Altitude=c(112.20, 42.93, 36.14, 747, 465))
> >
> > stations <- data.frame(rain_data$Long_Cent, rain_data$Lat_Cent)
> > rain_data <- SpatialPointsDataFrame(stations, rain_data,
> > proj4string=CRS("+init=epsg:4326"))
> > stations <- SpatialPoints(stations,
> > proj4string=CRS("+init=epsg:4326"))
>

Re: [R-sig-Geo] Interpolating snowfall values on a Digital Elevation Model

2018-02-13 Thread Dominik Schneider
You can't use a lat/long coordinate system when kriging because the concept
of distance is ambiguous. Convert all your data to a UTM grid like you had in
your first post and it should work.

Another note: it looks like you are working at 0.01 deg, which is on the
order of 1 km resolution, so you may find other covariates such as aspect,
slope, wind sheltering/exposure, and terrain roughness useful for estimating
snow on the ground. See some of the earliest papers by Carroll, Cressie,
and Elder.
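As a sketch, several of these covariates can be derived from a DEM with raster::terrain() (dem stands for any RasterLayer, such as the DEM in the quoted code; "TRI" is one of the roughness options the function offers):

```r
library(raster)

# slope and aspect in degrees, plus a terrain ruggedness index
covs <- terrain(dem, opt = c("slope", "aspect", "TRI"), unit = "degrees")
```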

Carroll, S. S., and N. Cressie (1996), A comparison of geostatistical
methodologies used to estimate snow water equivalent, *JAWRA Journal of the
American Water Resources Association*, *32*(2), 267–278,
doi:10.1111/j.1752-1688.1996.tb03450.x.

Carroll, S. S., and N. Cressie (1997), Spatial modeling of snow water
equivalent using covariances estimated from spatial and geomorphic
attributes, *Journal of Hydrology*, *190*(1-2), 42–59.

Balk, B., and K. Elder (2000), Combining binary decision tree and
geostatistical methods to estimate snow distribution in a mountain
watershed, *Water Resources Research*, *36*(1), 13–26,
doi:10.1029/1999WR900251.

Erxleben, J., K. Elder, and R. Davis (2002), Comparison of spatial
interpolation methods for estimating snow distribution in the Colorado
Rocky Mountains, *Hydrological Processes*, *16*(18), 3627–3649,
doi:10.1002/hyp.1239.

Erickson, T. A., M. W. Williams, and A. Winstral (2005), Persistence of
topographic controls on the spatial distribution of snow in rugged mountain
terrain, Colorado, United States, *Water Resour. Res.*, *41*(4), W04014,
doi:10.1029/2003WR002973.


On Tue, Feb 13, 2018 at 3:45 AM, Stefano Sofia <
stefano.so...@regione.marche.it> wrote:

> Dear Daniel and list users,
> I tried to follow the instructions but I encountered two kinds of errors.
> This is a reproducibile code:
>
> 
> ---
> library(automap)
> library(ggplot2)
> library(gstat)
> library(raster)
> library(rasterVis)
> library(rgdal)
> library(maptools)
>
> ## LOADING DEM
> ita_DEM <- getData('alt', country='ITA', mask=TRUE)
> crs(ita_DEM) <- "+init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs
> +ellps=WGS84 +towgs84=0,0,0"
> #ita_DEM <- as(ita_DEM, "SpatialGridDataFrame")
> str(ita_DEM)
>
> ## LOADING RAINFALL DATA
> rain_data <- data.frame(Cumulata=c(11.8, 9.0, 8.0, 36.6, 9.4),
> Long_Cent=c(12.61874, 12.78690, 12.96756, 13.15599, 13.28157),
> Lat_Cent=c(43.79447, 43.85185, 43.76267, 43.03470, 43.08003),
> Altitude=c(112.20, 42.93, 36.14, 747, 465))
>
> stations <- data.frame(rain_data$Long_Cent, rain_data$Lat_Cent)
> rain_data <- SpatialPointsDataFrame(stations, rain_data,
> proj4string=CRS("+init=epsg:4326"))
> stations <- SpatialPoints(stations, proj4string=CRS("+init=epsg:4326"))
>
> ## EXTRACT THE ELEVATION VALUES TO MY POINTS
> rain_data$ExtractedElevationValues <- extract(x=ita_DEM, y=stations)
>
> ## CREATE GRID FOR KRIGING OUTPUT
> minx <-  rain_data@bbox[1,1]
> maxx <- rain_data@bbox[1,2]
> miny <- rain_data@bbox[2,1]
> maxy <- rain_data@bbox[2,2]
> pixel <- 0.01
> grd <- expand.grid(x=seq(minx, maxx, by=pixel), y=seq(miny, maxy,
> by=pixel))
> coordinates(grd) <- ~x+y
> gridded(grd) <- TRUE
> proj4string(grd) <- CRS("+init=epsg:4326")
>
> ## KRIGING: autoKrige(YourMeasurements ~ YourExtractedElevationValues,
> YourMeasurementLocations, TargetGrid)
> OK_snow <- autoKrige(Cumulata ~ rain_data$ExtractedElevationValues,
> rain_data, grd)
> 
> ---
>
> The error I get is:
> Error in autoKrige(Cumulata ~ rain_data$ExtractedElevationValues,
> rain_data,  :
>   Either input_data or new_data is in LongLat, please reproject.
>input_data:  +init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs
> +ellps=WGS84 +towgs84=0,0,0
>new_data:+init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs
> +ellps=WGS84 +towgs84=0,0,0
>
> but I did pay attention to have the same reference system in rain_data,
> drg and the Digital Elevation Model.
>
> Moreover, if I impose the class of the DEM to SpatialGridDataFrame when I
> extraxt the elevation points from the DEM I get the following error:
> Error in (function (classes, fdef, mtable)  :
>   unable to find an inherited method for function ‘extract’ for signature
> ‘"SpatialGridDataFrame", "SpatialPoints"’
> Calls: extract -> 
>
>
> Would you please somebody help to show me where is my mistake?
>
> Thank you for all your attention
> Stefano
>
>  (oo)
> --oOO--( )--OOo
> Stefano Sofia PhD
> Area Meteorologica e  Area nivologica - Centro Funzionale
> Servizio Protezione Civile - Regione Marche
> Via del Colle Ameno 5
> 60126 Torrette di Ancona, Ancona
> Uff: 071 806 7743
> E-mail: stefano.so...@regione.marche.it
> ---Oo-oO
> 
> Da: R-sig-Geo 

Re: [R-sig-Geo] Error reading hdf files with rgdal

2017-01-16 Thread Dominik Schneider
Just to cover all the bases: MODIS .hdf files require GDAL's HDF4 driver. If
you changed something with your GDAL install, you might not have it anymore.

At the command-line prompt, gdalinfo --format HDF4 should give the following:
Khione:~ $ gdalinfo --format HDF4
Format Details:
  Short Name: HDF4
  Long Name: Hierarchical Data Format Release 4
  Supports: Raster
  Extension: hdf
  Help Topic: frmt_hdf4.html
  Supports: Subdatasets
  Supports: Open() - Open existing dataset.
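The same check can be attempted from within R (a sketch; getGDALDriverNames() is rgdal's driver listing, and the column name used below is an assumption):

```r
library(rgdal)

# drivers compiled into the GDAL build that rgdal links against
drv <- getGDALDriverNames()
"HDF4" %in% drv$name  # FALSE would mean GDAL was built without HDF4 support
```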

Dominik

On Sun, Jan 15, 2017 at 7:36 PM, Roger Bivand  wrote:

> When you see an error message from the R interpreter, include the full
> output, and the output of traceback() issued immediately after the error.
> Nothing important in raster handling has changed in rgdal. Try to provide a
> reprodicible example that you can show succeeded on the same system with
> earlier versions of packages, and which now fail. We cannot see over your
> shoulder.
>
> Roger Bivand
> Norwegian School of Economics
> Bergen, Norway
>
>
>
> On Sun, Jan 15, 2017 at 7:17 PM +0100, "Kelly McCaffrey" <
> kmccaffrey2...@my.fit.edu> wrote:
>
> Using GDAL and readGDAL both result in the error message:
> Error in .local(.Object, ...) :
> With no other message following this warning. Trying to use GDALinfo()
> results in the same warning. I'm having these problems trying to use .hdf
> files I have previously successfully accessed and transformed to geoTIFF
> files as well.
>
>
> On Fri, Jan 13, 2017 at 7:18 PM, Michael Sumner wrote:
>
> Try listing the subdatasets and reading them directly with GDAL or readGDAL
>
> On Sat, Jan 14, 2017, 07:12 Kelly McCaffrey <kmccaffrey2...@my.fit.edu> wrote:
> Here are the startup messages and the sessionInfo() output:
>
> > library(raster)
> Loading required package: sp
> > library(rgdal)
> rgdal: version: 1.2-5, (SVN revision 648)
>  Geospatial Data Abstraction Library extensions to R successfully loaded
>  Loaded GDAL runtime: GDAL 2.0.1, released 2015/09/15
>  Path to GDAL shared files:
> C:/Users/kmccaffrey2011/Documents/R/win-library/3.3/rgdal/gdal
>  Loaded PROJ.4 runtime: Rel. 4.9.2, 08 September 2015, [PJ_VERSION: 492]
>  Path to PROJ.4 shared files:
> C:/Users/kmccaffrey2011/Documents/R/win-library/3.3/rgdal/proj
>  Linking to sp version: 1.2-4
> > library(gdalUtils)
> > sessionInfo()
> R version 3.3.2 (2016-10-31)
> Platform: x86_64-w64-mingw32/x64 (64-bit)
> Running under: Windows >= 8 x64 (build 9200)
>
> locale:
> [1] LC_COLLATE=English_United States.1252
> [2] LC_CTYPE=English_United States.1252
> [3] LC_MONETARY=English_United States.1252
> [4] LC_NUMERIC=C
> [5] LC_TIME=English_United States.1252
>
> attached base packages:
> [1] stats graphics  grDevices utils
> [5] datasets  methods   base
>
> other attached packages:
> [1] gdalUtils_2.0.1.7 rgdal_1.2-5
> [3] raster_2.5-8  sp_1.2-4
>
> loaded via a namespace (and not attached):
>  [1] tools_3.3.2   Rcpp_0.12.8
>  [3] R.methodsS3_1.7.1 codetools_0.2-15
>  [5] grid_3.3.2iterators_1.0.8
>  [7] foreach_1.4.3 R.utils_2.5.0
>  [9] R.oo_1.21.0   lattice_0.20-34
>
>
> Kelly
>
> On Fri, Jan 13, 2017 at 3:06 PM, Roger Bivand wrote:
>
> > Please provide the output of the startup messages from the loaded
> > packages, and of sessionInfo().
> >
> > Roger Bivand
> > Norwegian School of Economics
> > Bergen, Norway
> >
> > Fra: Kelly McCaffrey
> > Sendt: fredag 13. januar, 20.51
> > Emne: [R-sig-Geo] Error reading hdf files with rgdal
> > Til: R-sig-geo mailing list
> >
> > Hello all, I'm working with some .hdf files containing MODIS satellite
> > data, but I have been unable to read them into R using rgdal and
> gdalUtils
> > since the last rgdal update. Has anyone else experienced these issues and
> > does anyone have a suggested fix? The code I'm attempting to use and
> error
> > message are as follows: library(raster) library(rgdal) library(gdalUtils)
> > file
> >
>
>
>
> --
>
> *Kelly McCaffrey*
> Graduate Student
> Department of Biological Sciences
> Florida Institute of Technology
> 150 W. University Blvd.
> Melbourne, FL, 32901
>
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
> Dr. Michael Sumner
> Software and Database Engineer
> Australian Antarctic Division
> 203 Channel Highway
> Kingston Tasmania 7050 Australia
>
>
>
>
> --
>
> Kelly McCaffrey
> Graduate Student
> Department of Biological Sciences
> Florida Institute of Technology
> 150 W. University Blvd.
> Melbourne, FL, 32901
>
>
>

Re: [R-sig-Geo] raster: crop error ('nrows > 0 is not TRUE')

2016-10-03 Thread Dominik Schneider
Isn't the southern border of your boundary shapefile outside the extent of
your raster? If so, there are no rows to extract.
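One quick check along those lines (a sketch against the objects in the quoted session; if I recall, raster::intersect of two Extent objects warns and returns NULL when they do not overlap):

```r
library(raster)

# if the extents share no area, crop() has nothing to return
intersect(extent(r), extent(boundary))
```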

On Mon, Oct 3, 2016 at 7:45 AM, Mauricio Zambrano Bigiarini <
mauricio.zambr...@ufrontera.cl> wrote:

> Dear list,
>
> I'm applying the 'crop' command of the raster pacakge to subset a
> RasterStack ('r') with a SpatialPolygonsDataFrame ('boundary'):
>
> raster.crop <- crop(r, boundary, snap="out")
>
> and what I get is:
>
> Error: nrows > 0 is not TRUE
>
>
> The summary of the two previous objects are:
>
> > r
> class   : RasterStack
> dimensions  : 666, 211, 140526, 8  (nrow, ncol, ncell, nlayers)
> resolution  : 0.05, 0.05  (x, y)
> extent  : -76.3, -65.75, -50, -16.7  (xmin, xmax, ymin, ymax)
> coord. ref. : +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs
> +towgs84=0,0,0
> names   : Year.2003, Year.2004, Year.2005, Year.2006, Year.2007,
> Year.2008, Year.2009, Year.2010
> min values  : 0, 0, 0, 0, 0,
>   0, 0, 0
> max values  :844.79,706.53,616.73,888.75,701.53,
>  853.47,642.07,986.31
>
>
> > boundary
> class   : SpatialPolygonsDataFrame
> features: 16
> extent  : -75.693, -66.419, -55.914, -17.498  (xmin, xmax, ymin, ymax)
> coord. ref. : +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs
> +towgs84=0,0,0
> variables   : 1
> names   : NOM_REG
> min values  :   1
> max values  :   9
>
> the previous 'crop' command worked perfectly with a different
> RasterStack object, with a similar spatial extent, and I don't now
> what is wrong here.
>
>
> The output of my sessionInfo() is:
>
> sessionInfo()
> R version 3.3.1 (2016-06-21)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Ubuntu 14.04.2 LTS
>
> locale:
>  [1] LC_CTYPE=en_GB.UTF-8   LC_NUMERIC=C
> LC_TIME=en_GB.UTF-8LC_COLLATE=en_GB.UTF-8
> LC_MONETARY=en_US.UTF-8LC_MESSAGES=en_GB.UTF-8
> LC_PAPER=en_US.UTF-8   LC_NAME=C
>  [9] LC_ADDRESS=C   LC_TELEPHONE=C
> LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
>
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base
>
> other attached packages:
>  [1] colorspace_1.2-6reshape2_1.4.1  rasterVis_0.40
> latticeExtra_0.6-28 RColorBrewer_1.1-2  lattice_0.20-34
> rgdal_1.1-10hydroGOF_0.3-8-7hydroTSM_0.4-8  xts_0.9-7
>  zoo_1.7-13
> [12] raster_2.5-8sp_1.2-3
>
> loaded via a namespace (and not attached):
>  [1] Rcpp_0.12.6   magrittr_1.5  maptools_0.8-39
> viridisLite_0.1.3 FNN_1.1   stringr_1.0.0 plyr_1.8.4
>  tools_3.3.1   parallel_3.3.1grid_3.3.1gstat_1.1-3
>   e1071_1.6-7
> [13] class_7.3-14  intervals_0.15.1  automap_1.0-14ncdf4_1.15
>   stringi_1.1.1 spacetime_1.1-5   reshape_0.8.5
> foreign_0.8-66hexbin_1.27.1
>
>
> Any idea about how to solve this issue is highly appreciated.
>
> Thanks in advance,
>
>
>
> Mauricio Zambrano-Bigiarini, PhD
>
> =
> Department of Civil Engineering
> Faculty of Engineering and Sciences
> Universidad de La Frontera
> PO Box 54-D, Temuco, Chile
> http://hzambran.github.io/
> =
> mailto : mauricio.zambr...@ufrontera.cl
> work-phone : +56 45 259 2812
> =
> "To accomplish great things, we must not only act,
> but also dream; not only plan, but also believe"
> (Anatole France)
> =
> Linux user #454569 -- Linux Mint user
>
> --
> La información contenida en este correo electrónico y cualquier anexo o
> respuesta relacionada, puede contener datos e información confidencial y no
> puede ser usada o difundida por personas distintas a su(s) destinatario(s).
> Si usted no es el destinatario de esta comunicación, le informamos que
> cualquier divulgación, distribución o copia de esta información constituye
> un delito conforme a la ley chilena. Si lo ha recibido por error, por favor
> borre el mensaje y todos sus anexos y notifique al remitente.
>



Re: [R-sig-Geo] Set up raster data in WRF Mercator projection

2016-08-03 Thread Dominik Schneider
Same process, just different projection.

with this folder structure:
project_dir
- data
- scripts

put the following script in 'scripts' and the nc file in 'data':


library(ncdf4)
library(raster)

# open the nc file for reading
ncid <- nc_open('../data/geo_em.d01.nc')

# check which grid the variable of interest is on
ncatt_get(ncid, 'LANDMASK')  # stagger=M

# domain corner coordinates; see
# http://www2.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap3.htm
cornerlat <- ncatt_get(ncid, 0)$corner_lats
cornerlong <- ncatt_get(ncid, 0)$corner_lons

# projected grid resolution
dx <- ncatt_get(ncid, 0)$DX
dy <- ncatt_get(ncid, 0)$DY

# the coordinates are on a sphere in longlat
proj1 <- CRS('+proj=longlat +towgs84=0,0,0 +a=6370000 +b=6370000 +units=m +no_defs')

# But the data are in projected space. For a more general script, you would
# build this programmatically from the attributes in the geo file.
wrfproj <- CRS("+proj=merc +lat_ts=30 +lon_0=290 +a=6370000.0 +b=6370000.0 +units=m +no_defs")

# create spatial points using the extreme corners of the data, since raster
# uses extreme coordinates (page 3,
# https://cran.r-project.org/web/packages/raster/vignettes/Raster.pdf).
# There are 16 corner coordinates: 1:4 are cell centers, 13:16 are cell
# extremes (see the WRF user guide linked above).
# These coordinates are in longlat, so assign the geographic system from above.
spext <-
SpatialPoints(
    data.frame(x = cornerlong[13:16],
               y = cornerlat[13:16]),
    proj4string = proj1)

# convert to projected space
spext.wrf <- spTransform(spext, wrfproj)

# read the variable wanted
r <- raster('../data/geo_em.d01.nc', varname = 'LANDMASK')
# assign the projection detailed in the geo file (defined above as wrfproj)
projection(r) <- wrfproj
# change the extent to that of the projected spatial points
extent(r) <- extent(spext.wrf)
# check the raster
r
# We seem to be missing a couple meters in extent; might be a floating-point
# issue? You can force the resolution since you know it, but this will change
# the bottom-right corner of the extent.
res(r) <- c(dx, dy)
r

On Wed, Aug 3, 2016 at 11:51 AM, David Wang <dw2...@outlook.com> wrote:

> Hello Dominik,
>
>
> Thanks for your comment. Yes, I have read the thread that you pointed to
> prior to posting the question. It's Lambert conformal conic projection
> though. I'm not sure the way to compute the domain extent is the same.
>
>
> I have uploaded a copy of the grid file at
> https://www.dropbox.com/s/ffrxqkxr4vai8hf/geo_em.d01.nc?dl=0 if anyone
> happens to be able to take a look.
>
> Thanks,
> David
>
> --
> *From:* Dominik Schneider <dosc3...@colorado.edu>
> *Sent:* Wednesday, August 3, 2016 12:52 PM
> *To:* David Wang
> *Cc:* r-sig-geo
> *Subject:* Re: [R-sig-Geo] Set up raster data in WRF Mercator projection
>
> did you see this thread? might help:
> http://r-sig-geo.2731867.n2.nabble.com/adapting-spatial-points-and-wrld-smpl-to-a-reference-system-implicit-in-a-nc-file-tt7589570.html
>
> From my notes, the mercator WRF projection would look like this:
> Projection_String = ('PROJCS["Sphere_Mercator",'
>  'GEOGCS["GCS_Sphere",'
>  'DATUM["D_Sphere",'
>  'SPHEROID["Sphere",6370000.0,0.0]],'
>  'PRIMEM["Greenwich",0.0],'
>  'UNIT["Degree",0.0174532925199433]],'
>  'PROJECTION["Mercator"],'
>  'PARAMETER["False_Easting",0.0],'
>  'PARAMETER["False_Northing",0.0],'
>  'PARAMETER["Central_Meridian",' +
> str(central_meridian) + '],'
>  'PARAMETER["Standard_Parallel_1",' +
> str(standard_parallel_1) + '],'
>  'UNIT["Meter",1.0]]')
>
>
> where central_meridian = STAND_LON and standard_parallel_1=TRUELAT1
>
> so to start I think you want (changes in *bold*):
> "+proj=merc +lat_ts=30 +lon_0=*290* +a=6370000.0 +b=6370000.0 +units=*m*
>  +no_defs")
>
> if you post the data, someone might be able to help create a proper raster
> for you.
>
>
> On Wed, Aug 3, 2016 at 9:46 AM, David Wang <dw2...@outlook.com> wrote:
>
>> Hello,
>>
>>
>> I'd like to load a WRF (Weather Research and Forecasting) output file in
>> NetCDF as rasters and set up correct projection and coordinates. Using land
>> mask as an example,
>>
>>
>> landmask <- raster("geo_em_d01.nc", varname = "LANDMASK")
>>
>>
>> The projection of this particular WRF simulation is Mercator. After
>> extensive googling, I came up with this projection string:
>>
>>
>&

Re: [R-sig-Geo] Set up raster data in WRF Mercator projection

2016-08-03 Thread Dominik Schneider
did you see this thread? might help:
http://r-sig-geo.2731867.n2.nabble.com/adapting-spatial-points-and-wrld-smpl-to-a-reference-system-implicit-in-a-nc-file-tt7589570.html

From my notes, the Mercator WRF projection would look like this:
Projection_String = ('PROJCS["Sphere_Mercator",'
 'GEOGCS["GCS_Sphere",'
 'DATUM["D_Sphere",'
 'SPHEROID["Sphere",6370000.0,0.0]],'
 'PRIMEM["Greenwich",0.0],'
 'UNIT["Degree",0.0174532925199433]],'
 'PROJECTION["Mercator"],'
 'PARAMETER["False_Easting",0.0],'
 'PARAMETER["False_Northing",0.0],'
 'PARAMETER["Central_Meridian",' +
str(central_meridian) + '],'
 'PARAMETER["Standard_Parallel_1",' +
str(standard_parallel_1) + '],'
 'UNIT["Meter",1.0]]')


where central_meridian = STAND_LON and standard_parallel_1=TRUELAT1

so to start I think you want (changes in *bold*):
"+proj=merc +lat_ts=30 +lon_0=*290* +a=6370000.0 +b=6370000.0 +units=*m*
 +no_defs")

if you post the data, someone might be able to help create a proper raster
for you.
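[Editor's note: a hedged base-R sketch, not part of the original thread. These are the forward spherical Mercator formulas that a string like "+proj=merc +lat_ts=30 +lon_0=290" evaluates, written out so the corner lon/lats from the ncdump output quoted below can be turned into projected metres for extent(). merc_xy() is an illustrative helper name, and R = 6370000 m (the usual WRF sphere radius) is an assumption here.]

```r
# Spherical Mercator forward projection with a standard parallel (lat_ts).
merc_xy <- function(lon, lat, lon0 = 290, lat_ts = 30, R = 6370000) {
  k0 <- cos(lat_ts * pi / 180)            # scale factor at the true-scale latitude
  d  <- (lon - lon0) %% 360               # wrap delta-longitude into [0, 360)
  d  <- ifelse(d > 180, d - 360, d)       # ... then into (-180, 180]
  x  <- R * k0 * d * pi / 180
  y  <- R * k0 * log(tan(pi / 4 + lat * pi / 360))
  cbind(x = x, y = y)
}

# lower-left and upper-right domain corners from the ncdump output below
merc_xy(c(-129.8151, -4.184856), c(6.033165, 52.29568))
```

Taking the range of the projected corner coordinates would then give the extent in metres.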


On Wed, Aug 3, 2016 at 9:46 AM, David Wang  wrote:

> Hello,
>
>
> I'd like to load a WRF (Weather Research and Forecasting) output file in
> NetCDF as rasters and set up correct projection and coordinates. Using land
> mask as an example,
>
>
> landmask <- raster("geo_em_d01.nc", varname = "LANDMASK")
>
>
> The projection of this particular WRF simulation is Mercator. After
> extensive googling, I came up with this projection string:
>
>
> projection(landmask) <- CRS("+proj=merc +lat_ts=30 +lon_0=-67 +a=6370000.0
> +b=6370000.0 +units=km +no_defs")
>
>
> That is based on the grid information from ncdump -h geo_em_d01.nc:
>
>
> :GRIDTYPE = "C" ;
> :DX = 27000.f ;
> :DY = 27000.f ;
> :DYN_OPT = 2 ;
> :CEN_LAT = 32.f ;
> :CEN_LON = -67.f ;
> :TRUELAT1 = 30.f ;
> :TRUELAT2 = 45.f ;
> :MOAD_CEN_LAT = 32.f ;
> :STAND_LON = 290.f ;
> :POLE_LAT = 90.f ;
> :POLE_LON = 0.f ;
> :corner_lats = 6.033165f, 52.29568f, 52.29568f, 6.033165f,
> 6.033165f, 52.29568f, 52.29568f, 6.033165f, 5.893715f, 52.38135f,
> 52.38135f, 5.893715f, 5.893715f, 52.38135f, 52.38135f, 5.893715f ;
> :corner_lons = -129.8151f, -129.8151f, -4.184856f,
> -4.184856f, -129.9554f, -129.9554f, -4.044647f, -4.044647f, -129.8151f,
> -129.8151f, -4.184856f, -4.184856f, -129.9554f, -129.9554f, -4.044647f,
> -4.044647f ;
> :MAP_PROJ = 3 ;
>
> However, I'm not entirely sure this projection is correct. Furthermore, I
> have no clue how to set extent(landmask) in km. Can anyone familiar with
> WRF give me a pointer or two?
>
>
> Thanks,
>
> David
>
> [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Incorrect month order in zApply function

2016-07-29 Thread Dominik Schneider
Sorry I'm not sure how to fix this, but it looks like the months are in
alphanumeric order, indicating they are being treated as a factor. Might be
worth a bug report?
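[Editor's note: a base-R sketch, not part of the original thread, showing the symptom and a workaround. Sorting month names as strings gives exactly the alphabetical order reported above, and match() against the built-in month.name constant yields an index that restores calendar order, e.g. to reorder the layers of the zApply() result with x <- x[[ord]].]

```r
alpha <- sort(month.name)          # alphabetical: "April" "August" "December" ...
ord   <- match(month.name, alpha)  # positions that restore Jan..Dec order
alpha[ord]                         # back to "January" "February" ... "December"
```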



On Thursday, July 28, 2016, Thiago V. dos Santos via R-sig-Geo <
r-sig-geo@r-project.org> wrote:

> Dear all,
>
> I am using the raster package to calculate monthly averages of climatic
> variables.
>
> Essentially, this is the function I use:
>
> library(raster)
>
> # Create date sequence
> idx <- seq(as.Date("1996/1/1"), as.Date("2010/12/31"), by = "day")
>
> # Create raster stack and assign dates
> r <- raster(ncol=5, nrow=5)
> s <- stack(lapply(1:length(idx), function(x) setValues(r,
> runif(ncell(r)))))
> s <- setZ(s, idx)
>
> # Calculate monthly average
> x <- zApply(s, by=months, mean, name=month.abb[])
>
> names(x)
> [1] "April" "August" "December" "February" "January" "July" "June"
> [8] "March" "May" "November" "October" "September"
> getZ(x)
> [1] "April" "August" "December" "February" "January" "July" "June"
> [8] "March" "May" "November" "October" "September"
>
>
> The problem here is the output of both names(x) and getZ(x). It looks like
> a random month order is returned (even though I provide the labels), which
> makes me confused about the results.
>
>
> By performing the same calculation in a different software and comparing
> the results, I came to realize that the order of months for the results by
> raster should, in fact, be Jan-Feb-Mar-Apr-May-Jun-Jul-Aug-Sep-Oct-Nov-Dec
>
> How can I control the way raster delivers the object names after using
> zApply, in order to sort the months in the "natural" order?
>
> Greetings, -- Thiago V. dos Santos
>
> PhD student
> Land and Atmospheric Science
> University of Minnesota
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Geographically weighted regression on categorical variable

2016-07-18 Thread Dominik Schneider
Sorry I don't know the answer but you can search here:
http://grokbase.com/g/r/r-sig-geo

On Mon, Jul 18, 2016 at 12:13 PM, Guy Bayegnak 
wrote:

>
> Hi All,
>
> I am trying to perform geographically weighted regression on categorical
> variables.  The majority of answers I found on the web suggest that this is
> not doable or not recommended.  I found only one post from Roger Bivand (
> https://stat.ethz.ch/pipermail/r-help/2007-September/141586.html ) that
> indicated that it was possible, and that the R-sig-geo list is more focused
> on this kind of question. I have therefore registered, but I am not sure if
> the mailing list is "searchable".
>
> I have point data collected over a geographical area A.  My data are
> groundwater quality type. And I have 3 types. When I plot it of a map it
> looks like 2 of the types are clustered and occur next to each other.  My
> suspicion is that these two clustered type may be influenced by their
> proximity to the some potential sources.  I used the spatstats package to
> explore the data,  using L-cross function.  The result of the analysis show
> that the 3 types appear to be influenced by the source, although the two
> groundwater that are clustered appear to deviate much more from the
> theoretical L-cross.  Now I am trying to explore the relationship between
> the water types and the potential source using geographically weighted
> regression on categorical variables.  Most of the material a read deals
> with continuous variables, and tend to focus on areal (polygons) features
> rather than point features.
>
> Is there any way to perform geographically weighted regression on points
> categorical variables using R?
>
> Thanks,
> GAB
>
>
>
>
> This email and any files transmitted with it are confide...{{dropped:7}}
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] putting scalebar and north arrow on google map

2016-07-18 Thread Dominik Schneider
+1 ggsn

On Mon, Jul 18, 2016 at 9:16 AM, Bacou, Melanie  wrote:

> No self promotion, but in case you're looking for code examples using
> leaflet plugins (scalebar and minimap) available through `tmap` :
> http://tools.harvestchoice.org/ar/ (code is linked at the bottom of the
> page).
>
> --Mel.
>
>
>
> On 7/18/2016 8:36 AM, Tim Appelhans wrote:
>
>> My suggestion would be the wonderful tmap package. It is, however, using
>> OpenStreetMap rather than google for background maps AFAIK.
>>
>> Tim
>>
>> On 18.07.2016 14:32, Manuel Spínola wrote:
>>
>>> Or you can try the package ggsn with the package ggmap
>>>
>>> Manuel
>>>
>>> 2016-07-17 22:57 GMT-06:00 Chris Lusk :
>>>
>>> Hi - I've made a map to show the spatial arrangement of my plots, and
 would
 now like to add a scale bar (in hundreds of metres) and a north arrow.
 I've
 seen several solutions online, most of which threw error messages when I
 tried them. Although the solution below doesn't give an error message,
 it
 doesn't actually add anything to the map!

 Any advice would be appreciated.

 Chris
 ..

 library(ggplot2)
 library(ggmap)
 library(maps)
 library(GISTools)

 ##Coordinates of four types of plots
 Gaps <- read.table(text="lon lat
 176.431823 -38.086894
 176.4329002 -38.08855667
 176.4286505 -38.08421246
 176.4239767 -38.08297953
 176.4306576 -38.09032491
 176.4181457 -38.08283416
 176.4325089 -38.09042394
 176.4243027 -38.08215021
 176.4220882 -38.08720573
 176.430974 -38.093108", header = TRUE, strip.white = TRUE)

 Understorey <- read.table(text="lon lat
 176.431523   -38.086965
 176.418408   -38.082935
 176.4244189 -38.08242069
 176.4287644 -38.08372266
 176.4240118 -38.0823004
 176.4333778 -38.08853369
 176.4217424 -38.0876122
 176.4309304 -38.09055117
 176.4324084 -38.0907023
 176.430821   -38.093687", header = TRUE, strip.white = TRUE)

 Treeferns <- read.table(text="lon lat
 176.432134  -38.086298
 176.4246547 -38.08226916
 176.4335132 -38.08801622
 176.4183628 -38.08260261
 176.429   -38.08413018
 176.4213997 -38.08710868
 176.4304333 -38.09064678
 176.4324081 -38.0909994
 176.4242689 -38.08264669
 176.429837  -38.093061", header = TRUE, strip.white = TRUE)

 Margins <- read.table(text="lon lat
 176.4225302 -38.08177849
 176.427018 -38.08879161
 176.4207786 -38.08356826
 176.4315118 -38.08251629
 176.426456 -38.08920441
 176.426010   -38.088253
 176.422526   -38.082794
 176.421399   -38.084780
 176.429339   -38.092742", header = TRUE, strip.white = TRUE)

 ## Get the map at right zoom level
 OkatainaMap <- get_googlemap(center = c(lon = 176.425, lat = -38.0845),
 maptype='hybrid', zoom=15)

 ## Plot the points on the map
 ggmap(OkatainaMap) +
geom_point(data=Gaps, aes(x=lon, y=lat), pch=15,
 colour="chartreuse1",
 size=2.5, alpha=1) +
geom_point(data=Understorey, aes(x=lon, y=lat), pch=21,
 colour="gray40",
 bg="black", size=3, alpha=1) +
geom_point(data=Treeferns, aes(x=lon, y=lat), pch=23,
 colour="gray40",
 bg="blue", size=2.5, alpha=1) +
geom_point(data=Margins, aes(x=lon, y=lat), pch=17, colour="orange",
 size=2.5, alpha=1)

 ##Add scale and north arrow (NEITHER COMMAND WORKS FOR ME)

 map.scale(x=176.42,y=-38.092, ratio=FALSE, relwidth=0.2, col="white")
 north.arrow(xb=176.435, yb=-38.079, len=0.02, lab="N", col="white")

 ...


 Dr. Chris Lusk
 Senior Research Fellow
 Environmental Research Institute
 The University of Waikato
 Private Bag 3105, Hamilton
 New Zealand / Aotearoa
 http://sci.waikato.ac.nz/sites/clusk/
 Ph 64 7 838 4205
 Senior Editor, *NZ J Botany*
 ~  ~  ~  ~  ~  ~  ~  ~  ~  ~  ~

  [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo


>>>
>>>
>>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] raster::zonal with more than 1 zonal layer

2016-03-04 Thread Dominik Schneider
Thanks  Loïc
On Mar 3, 2016 11:15 AM, "Loïc Dutrieux" <loic.dutri...@wur.nl> wrote:

>
>
> On 03/03/2016 05:19 PM, Dominik Schneider wrote:
>
>> That looks great, thanks!
>> to close the question:
>> b=as.data.frame(zonal(dat, zoneLayer, 'mean') )
>> full_join(u,b,by=c('newVal'='zone'))
>>
>> 1 question,
>> in this line: u <- data.frame(unique(s))
>> does unique(s) give all unique combinations in the 3rd dimension of the
>> stack? that’s a neat trick.
>>
>
> I'm not 100% sure, we can check with a simplified case.
>
> r1 <- r2 <- raster()
> n <- ncell(r1)
> r1[] <- c(rep(1, n/3), rep(2, n/3), rep(3, n/3))
> r2[] <- c(rep(4, n/2), rep(5, n/2))
> s <- stack(r1, r2)
> plot(s)
>
> unique(s)
>
> So yes, it looks like unique combinations in the 3rd dimension is what
> unique() returns in the case of a multilayer raster object.
>
> Cheers,
> Loïc
>
>
>> ds
>>
>>
>>
>> On Wed, Mar 2, 2016 at 3:24 AM, Loïc Dutrieux <loic.dutri...@wur.nl>
>> wrote:
>>
>>
>>
>> On 03/02/2016 01:48 AM, Dominik Schneider wrote:
>>  > I'd like to summarise a raster using elevation and watershed. I was
>>  > originally using extract() with a shape file and then each
>> elevation band
>>  > within each polygon but it's very slow. zonal() is much faster
>> and I can
>>  > rasterize my polygons to use it.
>>
>> Is it really much faster? I sounds very similar to what extract() does
>> under the hood.
>>
>>  > But how do I robustly combine multiple
>>  > rasterized shapefiles?
>>  > e.g.
>>  > dat=raster(matrix(runif(64),nrow=8))
>>  > z1=raster(matrix( sample(1:4, 64, replace=T),nrow=8))#represents
>> elevation
>>  > bands
>>  > z2=raster(matrix(sample(1401:1408,64,replace=T),nrow=8))#represents
>>  > watersheds
>>  > zonal(dat,z1*z2,'mean')
>>  >
>>  > this works well if you are certain that each combination of the
>> values in
>>  > z1 and z2 are unique and each combination is present. otherwise
>> it gets
>>  > messy. are there any suggestions for this use case?
>>  > ideally one could do: zonal(dat,stack(z1,z2),'mean') and all the
>>  > bookkeeping would be taken care of.
>>
>> What about:
>>
>> s <- stack(z1, z2)
>> u <- data.frame(unique(s))
>> u$newVal <- seq(nrow(u))
>> zoneLayer <- subs(s, u, by = 1:2, which = 3)
>>
>> # Let's validate
>> a <- zonal(dat,z1*z2,'mean')
>> b <- zonal(dat, zoneLayer, 'mean')
>> all.equal(b[order(b[,2]),2], a[order(a[,2]),2]) # TRUE
>>
>> Cheers,
>> Loïc
>>
>>  > my other thought is to extract all the
>>  > values into data frames and use dplyr but I was wondering if
>> there was a
>>  > raster way to do this.
>>  > Thanks
>>  > Dominik
>>  >
>>  > [[alternative HTML version deleted]]
>>  >
>>  > ___
>>  > R-sig-Geo mailing list
>>  > R-sig-Geo@r-project.org <mailto:R-sig-Geo@r-project.org>
>>  > https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>  >
>>
>> ___
>> R-sig-Geo mailing list
>> R-sig-Geo@r-project.org <mailto:R-sig-Geo@r-project.org>
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>
>>
>>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] raster::zonal with more than 1 zonal layer

2016-03-01 Thread Dominik Schneider
How do you separate the zone values from the zonal output into the original
elv and WS values?
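[Editor's note: a hedged base-R sketch of one answer to this follow-up, not part of the original thread. If the combined zone id is built as elv * K + ws with K larger than any watershed code, integer division and modulo split it back apart. K = 10000 is an illustrative choice here; the elv*100+ws encoding in the reply below only stays unique because its watershed codes happen to span a narrow range.]

```r
K   <- 10000
elv <- sample(1:4, 64, replace = TRUE)        # elevation band codes
ws  <- sample(1401:1408, 64, replace = TRUE)  # watershed codes (< K)
id  <- elv * K + ws                           # unique combined zone id

elv_back <- id %/% K   # integer division recovers the elevation band
ws_back  <- id %%  K   # modulo recovers the watershed code
```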
On Mar 1, 2016 7:58 PM, <alexander.h...@csiro.au> wrote:

> Is this what you are after?
>
>
> require(raster)
> require(spatial)
> require(sp)
> dat=raster(matrix(runif(64),nrow=8))
> z1=raster(matrix( sample(1:4, 64, replace=T),nrow=8))#represents elevation
> bands
> z2=raster(matrix(sample(1401:1408,64,replace=T),nrow=8))#represents
> watersheds
>
>
> #assign unique id for each elevation x watershed
> z1[]->elv
> z2[]->ws
> elv*100+ws->nd
> z1->z3
> z3[]<-nd
> sort(unique(nd))
>
> # do zonal
> zonal(dat,z3, 'mean')
>
>
> Cheers
> Herry
>
> -Original Message-
> From: R-sig-Geo [mailto:r-sig-geo-boun...@r-project.org] On Behalf Of
> Dominik Schneider
> Sent: Wednesday, 2 March 2016 11:48 AM
> To: Help R-Sig_Geo
> Subject: [R-sig-Geo] raster::zonal with more than 1 zonal layer
>
> I'd like to summarise a raster using elevation and watershed. I was
> originally using extract() with a shape file and then each elevation band
> within each polygon but it's very slow.   zonal() is much faster and I can
> rasterize my polygons to use it.  But how do I robustly combine multiple
> rasterized shapefiles?
> e.g.
> dat=raster(matrix(runif(64),nrow=8))
> z1=raster(matrix( sample(1:4, 64, replace=T),nrow=8))#represents elevation
> bands z2=raster(matrix(sample(1401:1408,64,replace=T),nrow=8))#represents
> watersheds
> zonal(dat,z1*z2,'mean')
>
> this works well if you are certain that each combination of the values in
> z1 and z2 are unique and each combination is present. otherwise it gets
> messy.  are there any suggestions for this use case?
> ideally one could do: zonal(dat,stack(z1,z2),'mean')  and all the
> bookkeeping would be taken care of. my other thought is to extract all the
> values into data frames and use dplyr but I was wondering if there was a
> raster way to do this.
> Thanks
> Dominik
>
> [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] raster::zonal with more than 1 zonal layer

2016-03-01 Thread Dominik Schneider
I'd like to summarise a raster using elevation and watershed. I was
originally using extract() with a shape file and then each elevation band
within each polygon but it's very slow.   zonal() is much faster and I can
rasterize my polygons to use it.  But how do I robustly combine multiple
rasterized shapefiles?
e.g.
dat=raster(matrix(runif(64),nrow=8))
z1=raster(matrix( sample(1:4, 64, replace=T),nrow=8))#represents elevation
bands
z2=raster(matrix(sample(1401:1408,64,replace=T),nrow=8))#represents
watersheds
zonal(dat,z1*z2,'mean')

this works well if you are certain that each combination of the values in
z1 and z2 are unique and each combination is present. otherwise it gets
messy.  are there any suggestions for this use case?
ideally one could do: zonal(dat,stack(z1,z2),'mean')  and all the
bookkeeping would be taken care of. my other thought is to extract all the
values into data frames and use dplyr but I was wondering if there was a
raster way to do this.
Thanks
Dominik

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] adapting spatial points and wrld_smpl to a reference system implicit in a .nc file

2016-02-23 Thread Dominik Schneider
This looks like WRF data. I just dealt
with this.
The data is on a sphere as opposed to WGS84 so you need +ellps=sphere
+a=6370000 +b=6370000 +units=m

+proj=lcc which is usually what wrf is run with.
The tricky part is:
+lat_1=25.0 +lat_2=45.0 +lat_0=38.0 +lon_0=-100.0
because every WRF run is different (the WRF Preprocessing System optimizes
the projection for the domain).
and then there is probably no shift so you need(?) +x_0=0 +y_0=0

This gives:
+proj=lcc +lat_1=25.0 +lat_2=45.0 +lat_0=38.0 +lon_0=-100.0 +ellps=sphere
+a=6370000 +b=6370000 +units=m +no_defs

But, wrf users like to give out lat and  long so you need to assign it:
+proj=longlat +ellps=sphere +a=6370000 +b=6370000 +units=m +no_defs

and then reproject the lat/long to lcc coordinates using this string:
+proj=lcc +lat_1=25.0 +lat_2=45.0 +lat_0=38.0 +lon_0=-100.0 +ellps=sphere
+a=6370000 +b=6370000 +units=m +no_defs

One word of caution, make sure you received the correct coordinates. Some
variables are run cell center while some are run at cell edge. It looks
like from your .nc file it was made by your collaborator so I assume they
are right.

That said, another word of caution, I found that the XLAT and XLONG
variables from WRF output aren't very precise. There is a "geogrid" file
from the preprocessing system that has the domain corners, resolution, nrow
and ncol from which you can make a better grid using the native projection
system (in my case it was a 4km grid). I suggest you try to get those.

I hope this helps... I have to run but wanted to save people too much head
scratching. I can get you running with more help tonight if you need.
Dominik
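[Editor's note: a hedged base-R sketch, not part of the original thread. These are Snyder's forward formulas for the Lambert conformal conic on a sphere — i.e. what the "+proj=lcc +lat_1=25 +lat_2=45 +lat_0=38 +lon_0=-100" string above evaluates. lcc_xy() is an illustrative helper name, and R = 6370000 m is assumed.]

```r
lcc_xy <- function(lon, lat, lat1 = 25, lat2 = 45, lat0 = 38, lon0 = -100,
                   R = 6370000) {
  rad <- pi / 180
  t   <- function(phi) tan(pi / 4 + phi * rad / 2)
  # cone constant and projection constant from the two standard parallels
  n   <- log(cos(lat1 * rad) / cos(lat2 * rad)) / log(t(lat2) / t(lat1))
  F   <- cos(lat1 * rad) * t(lat1)^n / n
  rho  <- R * F / t(lat)^n    # radius to the parallel of the input point
  rho0 <- R * F / t(lat0)^n   # radius to the latitude of origin
  th   <- n * (lon - lon0) * rad
  cbind(x = rho * sin(th), y = rho0 - rho * cos(th))
}

lcc_xy(-100, 38)  # the projection origin maps to (0, 0)
```

With the real domain parameters from the geogrid file, points projected this way should line up with the native WRF grid.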


On Tue, Feb 23, 2016 at 11:27 AM, Agus Camacho 
wrote:

> Thanks heaps to all for your effort. If I go to another GEOSTAT ill bring
> more giant crab this time.
>
> The creator of the .nc file also looked at this webpage:
> http://www.pkrc.net/wrf-lambert.html
> It seemed like the right proj4 string might be this one:
>
> +proj=lcc +lat_1=25.0 +lat_2=45.0 +lat_0=38.0
> +lon_0=-100.0 +a=6370000 +b=6370000 +towgs84=0,0,0 +no_defs
>
> However this projection also does not allow me to adequately plot the
> locations on the raster.
>
> Here is the .nc file. it contains several layers.
>
>
> https://www.dropbox.com/s/yto3linsgom3zi7/results_us_future_output_none_0.nc?dl=0
>
>
>
> 2016-02-23 2:25 GMT-07:00 Michael Sumner :
>
> > On Tue, 23 Feb 2016 at 20:09 Roger Bivand  wrote:
> >
> > > On Tue, 23 Feb 2016, Alex Mandel wrote:
> > >
> > > > I made an attempt at it too. Investigating the original data, I'm not
> > > > sure that the projection information supplied is correct for the data
> > > > linked. When I load up the data in a unprojected space, the
> coordinates
> > > > don't look at all similar to any Lambert projected data I have, they
> > > > actually look like Lat/Lon in some unprojected coordinate system,
> > > > perhaps a different spheroid than expected.
> > >
> > > Does anyone have a link to the original data? Is is possible that this
> is
> > > the General Oblique Transformation used by modellers - that is
> something
> > > that feels like longlat but is recentred and oblique? Example at the
> very
> > > end of my GEOSTAT talk last year (slides 81-83):
> > >
> > > http://geostat-course.org/system/files/geostat_talk_150817.pdf
> > >
> > > Roger
> > >
> > >
> > For what it is worth, the General Oblique Transformation is not the only
> > example - it's very common for modellers to have a mesh that has the
> > "mostly-properties" of a projection, but is not actually describable with
> > standard transform + affine parameters. The main cases that I've seen are
> > polar stereographic, equal area or oblique Mercator. Often they really
> are
> > simple transforms and you can reconstruct without loss, but it's not
> > usually possible to tell without exploration. It's an interesting
> > dis-connect to see code that builds a mesh with certain properties, then
> > only stores longitudes and latitudes - when it could be done with
> standard
> > tools and be stored and used much more efficiently.
> >
> > (I've seen Lambert Conformal Conic and Lambert Azimuthal Equal Area
> > terminology conflated in this context too. )
> >
> > I'm also interested to explore the original data.
> >
> > Cheers, Mike.
> >
> >
> >
> > > >
> > > > -Alex
> > > >
> > > > On 02/22/2016 10:17 PM, Frede Aakmann Tøgersen wrote:
> > > >> Hi
> > > >>
> > > >> I tried to make it work but I had to give up. I wanted to reproject
> > the
> > > Lamberth conformal conic coordinates to long-lat but it didn't work.
> > > >>
> > > >> Perhaps someone can see what I did wrong. Here is what I did (data
> in
> > R
> > > binary format and figure in png format both attached):
> > > >>
> > > >> library(raster)
> > > >> library(maptools)
> > > >> data(wrld_simpl)
> > > >>
> > > >> r <- raster("raster.grd")
> > > >> projection(r)
> > > >> ## > 

Re: [R-sig-Geo] calculate "regional" slope

2016-02-19 Thread Dominik Schneider
Chris R,
I'd be happy to use grass7 but I can't get it to run on my mac. osx
10.11.3. I have a working qgis from kyngchaos, installed pandoc and cairo
on top of that, and disabled sip but the grass7.app just hangs when I try
to open it.
Chris E,
I will try to clarify. Rather than thinking about "side of a mountain",
think about "side of a mountain **range**".  The point of calculating a
regional slope is that if I am on the west side of a continental divide but
on the east side of mountain, the function still returns a value indicating
west-facing.  so maybe it's easier to think in meters.   lets assume my DEM
is a 500m grid. the slope calculation would give a local value at 500m.
This local slope might be east facing, but I am interested in the overall
slope across 10km to indicate which way e.g. the watershed is draining.
What do you mean with "transect along the z. I.e. roll your dem on it's
side."?
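[Editor's note: a hedged base-R sketch of one way to get such a "regional" slope, not part of the original thread. It fits a plane z ~ x + y to an n-by-n window of elevations by least squares and takes the gradient of the fitted plane for the centre cell, which naturally generalizes beyond the 8-neighbour stencil. regional_slope() is an illustrative helper, not raster::terrain(); cellsize is the grid spacing in metres.]

```r
regional_slope <- function(z, cellsize = 500) {
  n  <- nrow(z)
  # expand.grid varies x fastest, matching R's column-major as.vector(z)
  xy <- expand.grid(x = (1:n) * cellsize, y = (1:n) * cellsize)
  fit  <- lm(as.vector(z) ~ x + y, data = xy)   # fit the plane
  dzdx <- unname(coef(fit)["x"])
  dzdy <- unname(coef(fit)["y"])
  c(slope_deg = atan(sqrt(dzdx^2 + dzdy^2)) * 180 / pi,
    aspect    = atan2(dzdy, dzdx))              # direction of steepest ascent
}

# synthetic 5x5 window dipping 10 m per 500 m along one axis
z <- outer(1:5, 1:5, function(i, j) -10 * j)
regional_slope(z)
```

Applied via raster::focal() over, say, a 21x21 window of a 500 m DEM, this would give the ~10 km regional slope described above.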


On Fri, Feb 19, 2016 at 2:50 PM, chris english <
englishchristoph...@gmail.com> wrote:

> Dominik,
> Sorry, I'm still trying to understand your 0.5 to 1.5 degree part of your
> problem.
>
> Otherwise, I think you are limited to 8 neighbors as this reflects the
> documentation as I read it.
>
> Even Roualt perhaps would be up in arms; but, there's nothing saying you
> can't do a 16 vs 8 neighbor. You'd have to examine the impacts thereafter,
> but basically you'd be amending some gdal (probably a line or two of code)
> for your purposes.
>
> There are a bunch of things to consider, theoretical and practical; but,
> why 16 better than 8. And more importantly as you relax, as a matter of
> rings (in this case), would your analytical result be better? Or
> potentially have any remainder meaning at all, I.e. I don't know my
> neighbor's neighbor's neighbor (does that get us out to 16?).
> And so generalizing beyond some given point might yield not much on a knn
> influence/likeness basis.
>
> I think we're first better off dialing back in on what you mean by
> regional or the 0.5 to 1.5 resolution and then neighborhood size (4, 8,16?).
>
> Of course another approach to this "what side of the mountain am I on" is
> to transect along the z. I.e. roll your dem on it's side.
>
> Anyway, clarify the 0.5/1.5 so I don't go too far astray.
> Chris
> Thanks for the suggestion Chris.  I'm familiar with gdaldem, which
> raster::terrain is based on, to compute slope from a dem. I now realize
> that my example isnt a good one because neighbors=8 would achieve what I
> described. However I actually want some flexibility such that I can
> specifiy neighbors=16 so that it uses the next "ring" of cells.
>
> I played around with focal() with weight argument =
> matrix(rep(c(1,0,0,0,1),5),byrow=T) but couldn't figure out how to solve
> for a directional slope.
>
> On Fri, Feb 19, 2016 at 4:09 AM, chris english <
> englishchristoph...@gmail.com> wrote:
>
>> Dominik,
>>
>> r <- raster(nrows=22, ncols=20, xmn=-58, xmx=-48, ymn=-33, ymx=-22)
>>  vals <- sample.int(1e3,440)
>> r[ ] <- vals
>> #raster::terrain
>> terr_r <- terrain(r, opt='slope', unit='degrees', neighbors=8)
>> Ah, but it appears you want up sampling to 1.5 degrees rather than 0.5
>> deg.
>> so maybe spatial.tools::projectRaster_rigorous then raster:terrain.
>>
>> I'm inclined to end that last so maybe with a question mark. Sorry for an
>> essentially inconclusive response but I was happy to find terrain in any
>> case.
>> Chris
>>
>> On Fri, Feb 19, 2016 at 2:59 AM, Dominik Schneider <
>> dominik.schnei...@colorado.edu> wrote:
>>
>>> I need to calculate slope at different scales. In the case below, r is a
>>> 0.5deg resolution raster and I want the slope for 1.5 deg centered on
>>> each
>>> of those 0.5 deg pixels. I'm trying to estimate which side of mountain
>>> range each pixel is on. So the resulting raster would have the same
>>> number
>>> of pixels as r. The edges can be NA.
>>> any suggestions would be appreciated. Thanks
>>>
>>>
>>> r <- raster(nrows=22, ncols=20, xmn=-58, xmx=-48, ymn=-33, ymx=-22)
>>> setValues(r,rnorm(440))
>>>
>>> [[alternative HTML version deleted]]
>>>
>>> ___
>>> R-sig-Geo mailing list
>>> R-sig-Geo@r-project.org
>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>
>>
>>
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] calculate "regional" slope

2016-02-18 Thread Dominik Schneider
I need to calculate slope at different scales. In the case below, r is a
0.5deg resolution raster and I want the slope for 1.5 deg centered on each
of those 0.5 deg pixels. I'm trying to estimate which side of mountain
range each pixel is on. So the resulting raster would have the same number
of pixels as r. The edges can be NA.
any suggestions would be appreciated. Thanks


r <- raster(nrows=22, ncols=20, xmn=-58, xmx=-48, ymn=-33, ymx=-22)
setValues(r,rnorm(440))

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] different projection transformation R and gdal commandline

2016-02-16 Thread Dominik Schneider
Oh, silly me. doing plot(spTransform(box,CRS(pstring))) and it's obvious
what's happening. The projection rotates the polygon such that for the
corners y1,x1 and y2,x2   y1 != y2. But ymin(spTransform(box,CRS(pstring)))
still gives you the smallest coordinate regardless of which corner it is.

Using spatial points its easier to see.

e=c(-112.25,-104.125,33,43.75)
box=as(extent(e),'SpatialPoints')
proj4string(box)='+proj=longlat +datum=WGS84'
pstring='+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98
+x_0=0 +y_0=0 +ellps=sphere +a=6370000 +b=6370000 +units=m +no_defs'
coordinates(spTransform(box,CRS(pstring)))
##               x          y
##[1,] -1306471.3 -629424.0
##[2,] -1122174.4  531025.6
##[3,]  -563451.2 -713442.3
##[4,]  -483968.2  458859.3
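[Editor's note: a small base-R footnote on the point being made above, not part of the original thread. Taking range() column-wise over the four projected corners gives the true projected bounding box, and makes it plain why ymin of the transformed box need not be the y of any one "bottom" corner.]

```r
# the four projected corners printed above
xy <- matrix(c(-1306471.3, -629424.0,
               -1122174.4,  531025.6,
                -563451.2, -713442.3,
                -483968.2,  458859.3), ncol = 2, byrow = TRUE)

apply(xy, 2, range)  # row 1 = (xmin, ymin), row 2 = (xmax, ymax)
```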




On Tue, Feb 16, 2016 at 9:27 AM, Dominik Schneider <
dominik.schnei...@colorado.edu> wrote:

> Hi Chris,
> Thanks for confirming this. I'm not surprised that gdalUtils gives the
> same answer as the gdal utilities - my understanding is that gdalUtils is
> basically the equivalent to calling the commandline utilities via system().
> I'm hoping that someone can shed light on spTransform since I use that a
> lot for transforming points and polygons.
> ds
>
>
> On Mon, Feb 15, 2016 at 2:50 PM, Chris Reudenbach <
> reudenb...@uni-marburg.de> wrote:
>
>> Hi Dominik,
>>
>> If you use the gdalUtils package  there is no significant difference in
>> the results using CLI or  R:
>>
>> library(gdalUtils)
>> gdaltransform(s_srs="+proj=longlat +datum=WGS84", t_srs="+proj=lcc
>> +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98 +x_0=0 +y_0=0
>> +ellps=sphere +a=6370000 +b=6370000 +units=m
>> +no_defs",coords=matrix(c(-112.25,33.0), ncol = 2))
>>  [,1]  [,2] [,3]
>> [1,] -1306676 -629522.5    0
>>
>>
>> If I use  spTransform I can reproduce your results:
>>
>>
>> loc <- c(-112.25,33.0)
>> loc <- data.frame(matrix(unlist(loc), nrow=1,
>> byrow=T),stringsAsFactors=FALSE)
>> colnames(loc)<-c("lon","lat")
>> coordinates(loc)<- ~lon+lat
>> proj4string(loc)<- CRS("+proj=longlat +datum=WGS84 +no_defs")
>> loc<-spTransform(loc,CRS('+proj=lcc +lat_1=28 +lat_2=50
>> +lat_0=39.70001220694445 +lon_0=-98 +x_0=0 +y_0=0 +ellps=sphere
>> +a=6370000 +b=6370000 +units=m +no_defs'))
>> loc@coords
>>lon lat
>> 1 -1306471 -629424
>>
>> I suggest to focus on the sptransform() function
>>
>> cheers Chris
>>
>> > sessionInfo()
>> R version 3.2.3 (2015-12-10)
>> Platform: x86_64-pc-linux-gnu (64-bit)
>> Running under: Ubuntu 14.04.3 LTS
>>
>> locale:
>>  [1] LC_CTYPE=de_DE.UTF-8   LC_NUMERIC=C LC_TIME=de_DE.UTF-8
>> LC_COLLATE=de_DE.UTF-8
>>  [5] LC_MONETARY=de_DE.UTF-8LC_MESSAGES=de_DE.UTF-8
>> LC_PAPER=de_DE.UTF-8   LC_NAME=C
>>  [9] LC_ADDRESS=C   LC_TELEPHONE=C LC_MEASUREMENT=de_DE.UTF-8
>> LC_IDENTIFICATION=C
>>
>> attached base packages:
>> [1] stats graphics  grDevices utils     datasets  methods base
>>
>> other attached packages:
>> [1] gdalUtils_2.0.1.7 raster_2.5-2  sp_1.2-2
>>
>> loaded via a namespace (and not attached):
>>  [1] rgdal_1.1-3   tools_3.2.3   Rcpp_0.12.3 R.methodsS3_1.7.0
>> codetools_0.2-14  grid_3.2.3
>>  [7] iterators_1.0.8   foreach_1.4.3 R.utils_2.2.0 R.oo_1.19.0
>>  lattice_0.20-33
>>
>>
>>
>>
>>
>>
>> Am 15.02.2016 um 21:37 schrieb Dominik Schneider:
>>
>>> Hi,
>>> I'm struggling to use a custom projection. I am seeing differences with
>>> someone using python proj4 bindings and when I compared my R results with
>>> my commandline results I got even more confused. the coordinate
>>> transformation is different for the two different methods.
>>>
>>> could someone explain to me which one is wrong and why?
>>> Thanks
>>>
>>> R:
>>> e=c(-112.25,-104.125,33,43.75)
>>> box=as(extent(e),'SpatialPolygons')
>>> proj4string(box)='+proj=longlat +datum=WGS84'
>>> pstring='+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445
>>> +lon_0=-98
>>> +x_0=0 +y_0=0 +ellps=sphere +a=6370000 +b=6370000 +units=m +no_defs'
>>> xmin(spTransform(box,CRS(pstring)))
>>> ## [1] -1306471
>>> ymin(spTransform(box,CRS(pstring)))
>>> ## [1] -713442.3
>>>
>>> commandline:
>>> iMac:~ $ gdaltransform -s_srs "+proj=longlat +datum=WGS84" -t_srs
>>> "+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98 +x_0=0
>>> +y_0
>>> =0  +ellps=sphere  +a=637 +b=637 +units=m +no_defs"
>>> -112.25 33.
>>> -1306675.75472246 -629522.472322824 0
>>>
>>> [[alternative HTML version deleted]]
>>>
>>> ___
>>> R-sig-Geo mailing list
>>> R-sig-Geo@r-project.org
>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>
>>>
>>
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] different projection transformation R and gdal commandline

2016-02-16 Thread Dominik Schneider
Hi Chris,
Thanks for confirming this. I'm not surprised that gdalUtils gives the same
answer as the GDAL utilities - my understanding is that gdalUtils is
basically equivalent to calling the command-line utilities via system().
I'm hoping someone can shed light on spTransform, since I use it a lot for
transforming points and polygons.
ds
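One way to sanity-check the two results independently is to evaluate the forward spherical Lambert Conformal Conic (2SP) equations directly in base R, following Snyder (1987). Note this is only a sketch: the sphere radius below is an assumption (6 370 000 m), since the `+a`/`+b` values are truncated in the quoted messages; substitute whatever radius the proj string actually uses.

```r
# Forward spherical Lambert Conformal Conic (2SP), Snyder (1987), eqs. 15-1..15-4.
# ASSUMPTION: R = 6370000 m stands in for the truncated +a/+b radius in the thread.
lcc_sphere <- function(lon, lat, lat1 = 28, lat2 = 50,
                       lat0 = 39.70001220694445, lon0 = -98, R = 6370000) {
  d <- pi / 180                                # degrees -> radians
  lat1 <- lat1 * d; lat2 <- lat2 * d; lat0 <- lat0 * d
  n  <- log(cos(lat1) / cos(lat2)) /
        log(tan(pi / 4 + lat2 / 2) / tan(pi / 4 + lat1 / 2))
  Fc <- cos(lat1) * tan(pi / 4 + lat1 / 2)^n / n
  rho  <- R * Fc / tan(pi / 4 + lat * d / 2)^n
  rho0 <- R * Fc / tan(pi / 4 + lat0 / 2)^n
  th <- n * (lon - lon0) * d
  c(x = rho * sin(th), y = rho0 - rho * cos(th))
}

lcc_sphere(-112.25, 33)  # compare against the spTransform and gdaltransform output
```

A quick self-check is that the projection origin (lon_0, lat_0) must map to (0, 0); from there you can see which tool agrees with a pure-sphere computation for the test point.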


On Mon, Feb 15, 2016 at 2:50 PM, Chris Reudenbach <reudenb...@uni-marburg.de
> wrote:

> Hi Dominik,
>
> If you use the gdalUtils package  there is no significant difference in
> the results using CLI or  R:
>
> library(gdalUtils)
> gdaltransform(s_srs="+proj=longlat +datum=WGS84", t_srs="+proj=lcc
> +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98 +x_0=0 +y_0=0
> +ellps=sphere  +a=637 +b=637 +units=m
> +no_defs",coords=matrix(c(-112.25,33.0), ncol = 2))
>          [,1]      [,2] [,3]
> [1,] -1306676 -629522.5    0
>
>
> If I use  spTransform I can reproduce your results:
>
>
> loc <- c(-112.25,33.0)
> loc <- data.frame(matrix(unlist(loc), nrow=1,
> byrow=T),stringsAsFactors=FALSE)
> colnames(loc)<-c("lon","lat")
> coordinates(loc)<- ~lon+lat
> proj4string(loc)<- CRS("+proj=longlat +datum=WGS84 +no_defs")
> loc<-spTransform(loc,CRS('+proj=lcc +lat_1=28 +lat_2=50
> +lat_0=39.70001220694445 +lon_0=-98 +x_0=0 +y_0=0 +ellps=sphere +a=637
> +b=637 +units=m +no_defs'))
> loc@coords
>lon lat
> 1 -1306471 -629424
>
> I suggest focusing on the spTransform() function.
>
> cheers Chris
>
> > sessionInfo()
> R version 3.2.3 (2015-12-10)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Ubuntu 14.04.3 LTS
>
> locale:
>  [1] LC_CTYPE=de_DE.UTF-8   LC_NUMERIC=C LC_TIME=de_DE.UTF-8
> LC_COLLATE=de_DE.UTF-8
>  [5] LC_MONETARY=de_DE.UTF-8LC_MESSAGES=de_DE.UTF-8
> LC_PAPER=de_DE.UTF-8   LC_NAME=C
>  [9] LC_ADDRESS=C   LC_TELEPHONE=C LC_MEASUREMENT=de_DE.UTF-8
> LC_IDENTIFICATION=C
>
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods base
>
> other attached packages:
> [1] gdalUtils_2.0.1.7 raster_2.5-2  sp_1.2-2
>
> loaded via a namespace (and not attached):
>  [1] rgdal_1.1-3   tools_3.2.3   Rcpp_0.12.3 R.methodsS3_1.7.0
> codetools_0.2-14  grid_3.2.3
>  [7] iterators_1.0.8   foreach_1.4.3 R.utils_2.2.0 R.oo_1.19.0
>  lattice_0.20-33
>
>
>
>
>
>
> Am 15.02.2016 um 21:37 schrieb Dominik Schneider:
>
>> Hi,
>> I'm struggling to use a custom projection. I am seeing differences with
>> someone using python proj4 bindings and when I compared my R results with
>> my commandline results I got even more confused. the coordinate
>> transformation is different for the two different methods.
>>
>> could someone explain to me which one is wrong and why?
>> Thanks
>>
>> R:
>> e=c(-112.25,-104.125,33,43.75)
>> box=as(extent(e),'SpatialPolygons')
>> proj4string(box)='+proj=longlat +datum=WGS84'
>> pstring='+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98
>> +x_0=0 +y_0=0 +ellps=sphere +a=637 +b=637 +units=m +no_defs'
>> xmin(spTransform(box,CRS(pstring)))
>> ## [1] -1306471
>> ymin(spTransform(box,CRS(pstring)))
>> ## [1] -713442.3
>>
>> commandline:
>> iMac:~ $ gdaltransform -s_srs "+proj=longlat +datum=WGS84" -t_srs
>> "+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98 +x_0=0
>> +y_0
>> =0  +ellps=sphere  +a=637 +b=637 +units=m +no_defs"
>> -112.25 33.
>> -1306675.75472246 -629522.472322824 0
>>
>> [[alternative HTML version deleted]]
>>
>> ___
>> R-sig-Geo mailing list
>> R-sig-Geo@r-project.org
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>
>>
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] different projection transformation R and gdal commandline

2016-02-15 Thread Dominik Schneider
Sorry, forgot to add the details. rgdal and gdal 1.11.3 were installed from
kyngchaos.com

> sessionInfo()
R version 3.2.3 (2015-12-10)
Platform: x86_64-apple-darwin14.5.0 (64-bit)
Running under: OS X 10.11.3 (El Capitan)

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] parallel  stats graphics  grDevices utils datasets  methods
base

other attached packages:
 [1] rasterVis_0.37  latticeExtra_0.6-26 lattice_0.20-33
broom_0.4.0
 [5] tidyr_0.3.1 dplyr_0.4.3 doMC_1.3.4
 iterators_1.0.8
 [9] foreach_1.4.3   ncdf4_1.15  gridExtra_2.0.0
spdep_0.5-92
[13] Matrix_1.2-3ipred_0.9-5 MASS_7.3-45
RColorBrewer_1.1-2
[17] rgdal_1.0-7 stringr_1.0.0   ggplot2_2.0.0   plyr_1.8.3

[21] reshape2_1.4.1  raster_2.5-2sp_1.2-1
 gstat_1.1-0
[25] ProjectTemplate_0.6

loaded via a namespace (and not attached):
 [1] zoo_1.7-12   splines_3.2.3colorspace_1.2-6 spacetime_1.1-5
 [5] survival_2.38-3  prodlim_1.5.7hexbin_1.27.1DBI_0.3.1
 [9] lava_1.4.1   munsell_0.4.2gtable_0.1.2 codetools_0.2-14
[13] coda_0.18-1  psych_1.5.8  labeling_0.3 class_7.3-14
[17] xts_0.9-7Rcpp_0.12.2  scales_0.3.0 FNN_1.1
[21] deldir_0.1-9 mnormt_1.5-3 digest_0.6.8 stringi_1.0-1
[25] grid_3.2.3   tools_3.2.3  LearnBayes_2.15  magrittr_1.5
[29] lazyeval_0.1.10  assertthat_0.1   R6_2.1.1 rpart_4.1-10
[33] boot_1.3-17  intervals_0.15.1 nnet_7.3-11  nlme_3.1-122


Dominik Schneider
c 518.956.3978


On Mon, Feb 15, 2016 at 1:37 PM, Dominik Schneider <
dominik.schnei...@colorado.edu> wrote:

> Hi,
> I'm struggling to use a custom projection. I am seeing differences with
> someone using python proj4 bindings and when I compared my R results with
> my commandline results I got even more confused. the coordinate
> transformation is different for the two different methods.
>
> could someone explain to me which one is wrong and why?
> Thanks
>
> R:
> e=c(-112.25,-104.125,33,43.75)
> box=as(extent(e),'SpatialPolygons')
> proj4string(box)='+proj=longlat +datum=WGS84'
> pstring='+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98
> +x_0=0 +y_0=0 +ellps=sphere +a=637 +b=637 +units=m +no_defs'
> xmin(spTransform(box,CRS(pstring)))
> ## [1] -1306471
> ymin(spTransform(box,CRS(pstring)))
> ## [1] -713442.3
>
> commandline:
> iMac:~ $ gdaltransform -s_srs "+proj=longlat +datum=WGS84" -t_srs
> "+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98 +x_0=0
> +y_0
> =0  +ellps=sphere  +a=637 +b=637 +units=m +no_defs"
> -112.25 33.
> -1306675.75472246 -629522.472322824 0
>
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] different projection transformation R and gdal commandline

2016-02-15 Thread Dominik Schneider
Hi,
I'm struggling to use a custom projection. I am seeing differences with
someone using Python proj4 bindings, and when I compared my R results with
my command-line results I got even more confused: the coordinate
transformation is different for the two methods.

Could someone explain to me which one is wrong, and why?
Thanks

R:
e=c(-112.25,-104.125,33,43.75)
box=as(extent(e),'SpatialPolygons')
proj4string(box)='+proj=longlat +datum=WGS84'
pstring='+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98
+x_0=0 +y_0=0 +ellps=sphere +a=637 +b=637 +units=m +no_defs'
xmin(spTransform(box,CRS(pstring)))
## [1] -1306471
ymin(spTransform(box,CRS(pstring)))
## [1] -713442.3

commandline:
iMac:~ $ gdaltransform -s_srs "+proj=longlat +datum=WGS84" -t_srs
"+proj=lcc +lat_1=28 +lat_2=50 +lat_0=39.70001220694445 +lon_0=-98 +x_0=0
+y_0
=0  +ellps=sphere  +a=637 +b=637 +units=m +no_defs"
-112.25 33.
-1306675.75472246 -629522.472322824 0

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] How to perform cross-year date operations on rasters?

2015-11-05 Thread Dominik Schneider
Could you use water years for your dates? so Oct-Sep is the same year and
then subset based on month and water year.
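A minimal base-R sketch of the water-year idea (no raster-specific code), using the same date range as the example in the thread, plus the extra filter that drops the incomplete Oct-Jan groups at the ends of the series:

```r
# Water year: Oct-Dec belong to the following year's water year,
# so Oct 2010 - Sep 2011 is all water year 2011.
water_year <- function(dates) {
  y <- as.integer(format(dates, "%Y"))
  m <- as.integer(format(dates, "%m"))
  ifelse(m >= 10, y + 1L, y)
}

idx <- seq(as.Date("2010/1/1"), by = "day", length.out = 1825)

# Keep only Oct-Jan, then drop water years missing any of the four months
# (the "isolated" January at the start and Oct-Dec at the end of the series).
d    <- idx[as.integer(format(idx, "%m")) %in% c(10, 11, 12, 1)]
wy   <- water_year(d)
full <- as.integer(names(which(
  tapply(format(d, "%m"), wy, function(m) length(unique(m))) == 4)))
keep <- d[wy %in% full]
range(keep)  # "2010-10-01" "2014-01-31"
```

You can then subset the stack with which(idx %in% keep) before calling zApply().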


On Thu, Nov 5, 2015 at 11:17 AM, Thiago V. dos Santos <
thi_vel...@yahoo.com.br> wrote:

> Thanks for your input Michael. With slight modifications on your
> suggestion, I almost got there:
>
> library(raster)
> library(zoo)
>
> # Create a rasterStack similar to cmip5 - same dimensions and layer names
> r <- raster(ncol=180, nrow=90)
> s <- stack(lapply(1:1825, function(x) setValues(r, runif(ncell(r)))))
> idx <- seq(as.Date("2010/1/1"), by = "day", length.out = 1825)
> s <- setZ(s, idx)
>
> # Separate layers with months of interest
> ldates <- format(getZ(s), "%m") %in% c("10", "11", "12", "01")
> s2 <- subset(s, which(ldates))
>
> # Apply function
> s3 <- zApply(s2, by=as.yearmon, fun = sum)
>
>
> The problem now is that the "isolated" january in the first year is also
> taken into account, and the "isolated" oct-nov-dec in the last year as well.
>
> Ideally the function would run only on the contiguous period:
> oct-nov-dec-jan, and not when at least one month is missing (for example in
> the first and last years of the series).
>
> Still possible to achieve this?
> Greetings,
>  -- Thiago V. dos Santos
>
> PhD student
> Land and Atmospheric Science
> University of Minnesota
>
>
>
> On Thursday, November 5, 2015 12:15 AM, Michael Sumner <mdsum...@gmail.com>
> wrote:
>
>
>
>
>
>
>
> On Thu, 5 Nov 2015 at 09:38 Thiago V. dos Santos <thi_vel...@yahoo.com.br>
> wrote:
>
> Dear all,
> >
> >Consider that I have a raster stack with daily values. How can I perform
> date operations covering a time interval that crosses years?
> >
> >For example, I want to sum the values from every october-to-january
> period in this sample raster:
> >
> >library(raster)
> >
> ># Create a rasterStack similar to cmip5 - same dimensions and layer names
> >r <- raster(ncol=180, nrow=90)
> >s <- stack(lapply(1:1825, function(x) setValues(r, runif(ncell(r)))))
> >
> ># Apply time stamps to raster
> >#x <- as.Date(c("2010-01-01","2014-12-31"),format="%Y-%m-%d")
> >#difftime(x[2], x[1], units="days")
> >idx <- seq(as.Date("2010/1/1"), by = "day", length.out = 1825)
> >s <- setZ(s, idx)
> >s
> >
> >
> >
>
>
> You can subset on the dates, by running a test on the date values:
>
> ldates <- format(getZ(s), "%m") %in% c("10", "11", "01")
>
> subsetting the object
>
> subset(s, which(ldates))
>
> and finally calculating what you want
>
> calc(subset(s, which(ldates)), sum)
>
> HTH
>
>
>
>
>  Thanks in advance,
> > -- Thiago V. dos Santos
> >
> >PhD student
> >Land and Atmospheric Science
> >University of Minnesota
> >
> >___
> >R-sig-Geo mailing list
> >R-sig-Geo@r-project.org
> >https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> >
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Raster multi-band time series analysis

2015-10-26 Thread Dominik Schneider
Check out the spatial.tools package for parallel processing with the raster
package.



On Mon, Oct 26, 2015 at 7:22 AM, Loïc Dutrieux <loic.dutri...@wur.nl> wrote:

> Hi Victor,
>
> I don't have much experience with beginCluster; however, I have written a
> parallel version of raster::calc a little while ago that uses forking.
> https://github.com/dutri001/bfastSpatial/blob/master/R/mc.calc.R
>
> calc and overlay are analogue so that it shouldn't be too hard to extend
> this function to overlay, using an undefined number of rasterStack or
> Bricks as input. You will need a bit more checks and control flow at the
> beginning of the function though.
> I'm not sure at all whether this is the most efficient way of
> parallelizing things, but it will for sure speed up your processing
> compared to single core processing.
>
> Hope this helps,
> Cheers,
> Loïc
>
>
> On 10/26/2015 12:23 PM, Victor Maus wrote:
>
>> Hi Loïc,
>>
>> Thank your very much for your answer. I am working on your suggestion.
>> But I still have a question.
>>
>> The function raster::overlay works for multiple RasterBrick objects.
>> However, I noticed that raster::beginCluster works with raster::overlay
>> as long as a single RasterStack or RasterBrick is provided as the first
>> argument. Do you have any suggestion for parallel processing of multiple
>> RasterBrick objects?
>>
>> Thank you!
>>
>> Best,
>> Victor
>>
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] Convert rasters to data frame with time stamp

2015-10-16 Thread Dominik Schneider
I don't think I completely understand, but can you use tidyr::gather to get
what you want? If you have multiple datasets, you could join them all
together at the end.

library(raster)
library(tidyr)

#Create a rasterStack similar to my data - same dimensions and layer names
r <- raster(ncol=60, nrow=60)
s <- stack(lapply(1:408, function(x) setValues(r, runif(ncell(r)))))
names(s) <- paste0('X', seq(as.Date("1980/1/1"), by = "month", length.out =
408))
s
dF=as.data.frame(s)
dF2=gather(dF,date)
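If you would rather avoid the tidyr dependency, base R's stack() does an equivalent wide-to-long pivot. A tiny hypothetical data frame (stand-in values, not your raster) shows the shape of the result:

```r
# Two "layers" as columns, one row per cell; stack() pivots to long format
# with one row per (cell, layer) pair, like tidyr::gather().
dF  <- data.frame(X1980.01.01 = c(0.2, 0.7), X1980.02.01 = c(0.5, 0.1))
dF2 <- stack(dF)            # columns: values, ind (the old column name as factor)
names(dF2) <- c("value", "date")
dF2$value                   # 0.2 0.7 0.5 0.1 -- one row per (cell, layer)
```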



On Fri, Oct 16, 2015 at 2:39 PM, Thiago V. dos Santos <
thi_vel...@yahoo.com.br> wrote:

> Dear list,
>
> Generally speaking, I love R. However, one of the things I like least in R
> is the need to interchange between the various data formats required by
> different packages.
>
> I am trying to apply a bias-correction function on some gridded climate
> data. The qmap package has functions to perform bias correction on climate
> data, but the problem I am grasping with is that it requires data to be
> organized as data.frames:
>
>
> library(qmap)
> data(obsprecip)
> data(modprecip)
>
> #Fit a quantile mapping function to observed and modeled data
> qm.fit <- fitQmap(obsprecip,modprecip,
>   method="QUANT",qstep=0.01)
>
> #Perform bias correction on modeled data
> qm <- doQmap(modprecip, qm.fit, type="tricub")
>
>
> And that's all. But notice that both observed and modeled data in this
> example are data frames for different locations (Moss, Geiranger and
> Barkestad):
>
> > head(obsprecip)
>  MOSS GEIRANGER BARKESTAD
> 1-1-1961  0.1 0 0
> 2-1-1961  0.2 0 0
> 3-1-1961  0.9 0 0
> 4-1-1961 10.6 0 0
> 5-1-1961  1.5 0 0
> 6-1-1961  1.2 0 2
>
> > head(modprecip)
>MOSS GEIRANGER BARKESTAD
> 2-1-1961  2.2830.  3.177000
> 3-1-1961  2.443   10.8600  1.719000
> 4-1-1961  3.099   12.7300  6.636000
> 5-1-1961  0.0009.7720  9.676000
> 6-1-1961  0.1400.6448  7.11
> 7-1-1961 13.4703.3570  0.001107
>
>
>
> Now, let's back to my problem. I have monthly precip data to which I want
> to apply the same function above, but my data is gridded:
>
> library(raster)
>
> #Create a rasterStack similar to my data - same dimensions and layer names
> r <- raster(ncol=60, nrow=60)
> s <- stack(lapply(1:408, function(x) setValues(r, runif(ncell(r)))))
> names(s) <- paste0('X', seq(as.Date("1980/1/1"), by = "month", length.out
> = 408))
> s
>
>
> Therefore, I need to load data as rasters and iterate through all
> individual gridcells to create a data frame containing:
>
>
> date1, cell1, cell2, cell3, ..., cell3600
> date2, cell1, cell2, cell3, ..., cell3600
> date3, cell1, cell2, cell3, ..., cell3600...
> date408, cell1, cell2, cell3, ..., cell3600
>
> then apply the fit function and finally convert the data back to a raster.
>
>
> Any ideas on how to efficiently convert rasters to data frames containing
> their time stamp and then back to a raster again??
>
> Any hint is much appreciated.
> Greetings,
> -- Thiago V. dos Santos
>
> PhD student
> Land and Atmospheric Science
> University of Minnesota
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] how to read in R a big raster of about 6 Gb

2015-10-16 Thread Dominik Schneider
Michael, is the tiling issue also a problem if its tiled by layer? So for a
netcdf it is my understanding that extracting a layer is faster if the
tiles are nrow x ncol x 1 instead of something like nrow/2 x ncol/2 x 1
Domink
On Oct 16, 2015 01:42, "Michael Sumner" <mdsum...@gmail.com> wrote:

>
> Fwiw, the worst slow downs I see with extract are when the raster is
> natively tiled.
> (Been meaning to write the fix and/or report to Robert).
>
> Can you check if your file is tiled? Use gdalinfo command line utility.
>
> Extract, at least for points and I assume  also for polygons, scans the
> file line by line but this needs to be done tile by tile to avoid
> repetitive and wasteful I/O.
>
> Cheers, Mike
>
>
> On Friday, October 16, 2015, Dominik Schneider <
> dominik.schnei...@colorado.edu> wrote:
> >>
> >> I need finally to associate the DEM values with the levels represented
> by
> >> one of these fields:
> >> do you think is a good idea eliminating the not necessary fields?
> >
> > if you mean that you don't need information for all the polygons then
> yes,
> > you should subset the spatialpolygonsdF. If you're just concerned about
> > multiple attribute fields then I don't think it'll make a difference.
> >
> > I'm not quite sure I'm completely grasping your hint about
> >>  values(DEM) <- 1:ncell(DEM)
> >> for speeding up the process
> >
> > DEMcopy=DEM
> > values(DEMcopy) <- 1:ncell(DEMcopy)
> > list_cells=extract(DEMcopy,sppolydF)
> >
> > list_cells will be a list with the cell numbers associated with each
> > polygon in a separate list element.  The initial extract() will not take
> > less time, but it should be useful in the future, e.g. you could do
> > poly1=list_cells[[1]]
> > elev1=DEM[poly1]
> >
> >
> > Another option I've had luck with is rasterizing polygons. Using the GDAL
> > commandline utility gdal_rasterize you can convert the shape file of
> > polygons to a classified raster.   In R, you can then use zonal() with
> some
> > aggregation function to get stats of DEM using your classified raster.
> >
> >
> >
> >
> >
> > On Thu, Oct 15, 2015 at 10:31 AM, Massimo Bressan <
> mbres...@arpa.veneto.it>
> > wrote:
> >
> >> hi, thanks for your prompt reply
> >>
> >> in fact by using raster() funciontion "reading" DEM is quite fast and
> >> also reading the polygons by readOGR() went quite smooth...
> >>
> >> the problem then come with the extract() function that is taking
> >> definitely too
> >> long...
> >>
> >> I'm not quite sure I'm completely grasping your hint about
> >>  values(DEM) <- 1:ncell(DEM)
> >> for speeding up the process
> >>
> >> please also consider that the SpatialPolygonDataFrame that I readOGR()
> has
> >> 23 fields;
> >> I need finally to associate the DEM values with the levels represented
> by
> >> one of these fields:
> >> do you think is a good idea eliminating the not necessary fields?
> >>
> >> max
> >>
> >>
> >>
> >> Il giorno Thu, 15/10/2015 alle 09.26 -0600, Dominik Schneider ha
> >> scritto:
> >> > You can use the raster package,  which will leave the values on disk
> >> > if they are too big for memory.
> >> > Use the function raster() to read the DEM, then use readOGR() to read
> >> > the polygon shape file. You can then use extract() to get statistics
> >> > on your DEM. I'm guessing it'll take a while to extract from disk, so
> >> > if you are planning to extract multiple times, you should consider
> >> > copying your DEM and setting the values equal to the cell numbers with
> >> > values(DEM) <- 1:ncell(DEM).  Extract this once, keep track of which
> >> > cell numbers are in which polygon and then use the cell numbers to
> >> > index the DEM when you need values by polygon.
> >> > Hope this helps,
> >> > Dominik
> >> >
> >> >
> >> >
> >> > On Thu, Oct 15, 2015 at 9:07 AM, Massimo Bressan
> >> > <mbres...@arpa.veneto.it> wrote:
> >> > hi all
> >> >
> >> >
> >> > I need to perform a zonal statistics by overlapping a DEM
> >> > raster (ESRI
> >> > grid binary) with some polygons (ESRI polygons);
> >> >
> >> > the problem is that I first of all need to read

Re: [R-sig-Geo] how to read in R a big raster of about 6 Gb

2015-10-15 Thread Dominik Schneider
You can use the raster package,  which will leave the values on disk if
they are too big for memory.
Use the function raster() to read the DEM, then use readOGR() to read the
polygon shape file. You can then use extract() to get statistics on your
DEM. I'm guessing it'll take a while to extract from disk, so if you are
planning to extract multiple times, you should consider copying your DEM
and setting the values equal to the cell numbers with values(DEM) <-
1:ncell(DEM).  Extract this once, keep track of which cell numbers are in
which polygon and then use the cell numbers to index the DEM when you need
values by polygon.
Hope this helps,
Dominik
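The essence of the trick, stripped of the raster classes, is to pay the "which cells belong to which polygon" cost once and then do plain vector indexing. A minimal stand-in with matrices (hypothetical data, not a real DEM):

```r
set.seed(1)
dem  <- matrix(runif(20), nrow = 4)            # stand-in for DEM values
zone <- matrix(rep(1:2, each = 10), nrow = 4)  # stand-in for polygon IDs per cell

# Pay the lookup cost once: record the cell numbers belonging to each polygon...
cells_by_zone <- split(seq_along(zone), as.vector(zone))

# ...then every later query is cheap vector indexing into the DEM values.
sapply(cells_by_zone, function(cells) mean(dem[cells]))
```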


On Thu, Oct 15, 2015 at 9:07 AM, Massimo Bressan <mbres...@arpa.veneto.it>
wrote:

> hi all
>
>
> I need to perform a zonal statistics by overlapping a DEM raster (ESRI
> grid binary) with some polygons (ESRI polygons);
>
> the problem is that I first of all need to read the raster, which is
> quite big: about 6 Gbyte;
>
> I know about the function readGDAL() by the fantastic package "rgdal"
> which is usually working very well with smaller size files but this time
> I got completely stuck (the pc freezes invariably)
>
> do you have any advice on how to deal with such a problem (big file)?
>
> thank you
>
> regards
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] how to read in R a big raster of about 6 Gb

2015-10-15 Thread Dominik Schneider
>
> I need finally to associate the DEM values with the levels represented by
> one of these fields:
> do you think is a good idea eliminating the not necessary fields?

if you mean that you don't need information for all the polygons then yes,
you should subset the spatialpolygonsdF. If you're just concerned about
multiple attribute fields then I don't think it'll make a difference.

I'm not quite sure I'm completely grasping your hint about
>  values(DEM) <- 1:ncell(DEM)
> for speeding up the process

DEMcopy=DEM
values(DEMcopy) <- 1:ncell(DEMcopy)
list_cells=extract(DEMcopy,sppolydF)

list_cells will be a list with the cell numbers associated with each
polygon in a separate list element.  The initial extract() will not take
less time, but it should be useful in the future, e.g. you could do
poly1=list_cells[[1]]
elev1=DEM[poly1]


Another option I've had luck with is rasterizing polygons. Using the GDAL
commandline utility gdal_rasterize you can convert the shape file of
polygons to a classified raster.   In R, you can then use zonal() with some
aggregation function to get stats of DEM using your classified raster.





On Thu, Oct 15, 2015 at 10:31 AM, Massimo Bressan <mbres...@arpa.veneto.it>
wrote:

> hi, thanks for your prompt reply
>
> in fact by using raster() funciontion "reading" DEM is quite fast and
> also reading the polygons by readOGR() went quite smooth...
>
> the problem then come with the extract() function that is taking
> definitely too
> long...
>
> I'm not quite sure I'm completely grasping your hint about
>  values(DEM) <- 1:ncell(DEM)
> for speeding up the process
>
> please also consider that the SpatialPolygonDataFrame that I readOGR() has
> 23 fields;
> I need finally to associate the DEM values with the levels represented by
> one of these fields:
> do you think is a good idea eliminating the not necessary fields?
>
> max
>
>
>
> Il giorno Thu, 15/10/2015 alle 09.26 -0600, Dominik Schneider ha
> scritto:
> > You can use the raster package,  which will leave the values on disk
> > if they are too big for memory.
> > Use the function raster() to read the DEM, then use readOGR() to read
> > the polygon shape file. You can then use extract() to get statistics
> > on your DEM. I'm guessing it'll take a while to extract from disk, so
> > if you are planning to extract multiple times, you should consider
> > copying your DEM and setting the values equal to the cell numbers with
> > values(DEM) <- 1:ncell(DEM).  Extract this once, keep track of which
> > cell numbers are in which polygon and then use the cell numbers to
> > index the DEM when you need values by polygon.
> > Hope this helps,
> > Dominik
> >
> >
> >
> > On Thu, Oct 15, 2015 at 9:07 AM, Massimo Bressan
> > <mbres...@arpa.veneto.it> wrote:
> > hi all
> >
> >
> > I need to perform a zonal statistics by overlapping a DEM
> > raster (ESRI
> > grid binary) with some polygons (ESRI polygons);
> >
> > the problem is that I first of all need to read the raster,
> > which is
> > quite big: about 6 Gbyte;
> >
> > I know about the function readGDAL() by the fantastic package
> > "rgdal"
> > which is usually working very well with smaller size files but
> > this time
> > I got completely stuck (the pc freezes invariably)
> >
> > do you have any advice on how to deal with such a problem (big
> > file)?
> >
> > thank you
> >
> > regards
> >
> > ___
> > R-sig-Geo mailing list
> > R-sig-Geo@r-project.org
> > https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> >
> >
>
>
>
>

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Help needed with extraction of raster statistics by polygons

2015-08-25 Thread Dominik Schneider
Denys,
The GIS functions in R have great utility, but speed is an issue; extract()
in particular. So, if you make a 30m raster equivalent to the Grid30m but
with the cell numbers as the values, you can use extract just once to
determine which cells are to be extracted with each polygon. Then use
regular indexing to find the mean for the raster cells covered by each
polygon. This should be much faster.
Dominik

Dominik Schneider
o 303.735.6296 | c 518.956.3978
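The payoff of replacing per-polygon extract() calls with one grouped aggregation can be sketched with plain vectors (hypothetical values): once each cell is tagged with a block ID, the zonal means come from a single tapply() instead of 1.9 million extract() calls.

```r
# Stand-ins: each 30 m grid cell carries a block ID (NA outside all blocks),
# so the per-block mean is one grouped aggregation over the cell values.
vals <- c(10, 20, 30, 40, 50, 60)    # stand-in for the 30 m raster values
zone <- c( 1,  1,  2,  2,  2, NA)    # stand-in for per-cell block IDs
ok   <- !is.na(zone)
tapply(vals[ok], zone[ok], mean)     # block 1 -> 15, block 2 -> 40
```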


On Tue, Aug 25, 2015 at 9:26 AM, Denys Dukhovnov via R-sig-Geo <
r-sig-geo@r-project.org> wrote:

 Hello all!

 I need to spatially calculate 30-meter grid raster mean over US Census
 blocks in North East region (approx. 1.9 mln street blocks/polygons). I
 came up with the script that runs in parallel, but it still takes days to
 complete, even when the i7 CPU is used up to the max. My question: is there
 any other way to improve the processing speed given the set-up below (I
 included the backbone of the script to save space). I am new to R, so any
 help will be much appreciated. Thanks a lot!

 Regards,
 Denys
 

 library(sp)
 library(rgdal)
 library(raster)
 library(foreach)
 library(spatial.tools)

 Grid30m <- raster("raster.tif")   # loading main raster into R
 NEfips <- c("09", "10", "11", ...)   # list of state FIPS codes, whose
 # street block polygons are to be processed
 ShapePath <- "T:\\..."   # path to block shapefiles

 sfQuickInit(cpus=8)   # setting up and registering parallel
 # processing back-end

 for (x in NEfips) {
 State <- paste("block_", x, sep="")
 Blocks <- readOGR(dsn=ShapePath, layer=State[1],
 stringsAsFactors=F)   # loading the block polygons
 BlocksReproject <- spTransform(Blocks, crs(Grid30m))

 GridMean <- foreach (i=1:length(BlocksReproject$GEOID10),
 .combine='c', .packages="raster") %dopar% {   # Parallel processing
 # loop for extracting mean raster value per street block polygon
    extract(Grid30m, BlocksReproject[i], fun=mean)
 }
 write.table(...)   # Generating output containing the mean raster
 # statistics for each block polygon.
 }

 [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] georeferencing image from scratch

2015-08-17 Thread Dominik Schneider
Marcelo,
did you try alignExtent() in the raster package?


Dominik Schneider
o 303.735.6296 | c 518.956.3978


On Mon, Aug 17, 2015 at 8:34 AM, Loïc Dutrieux <loic.dutri...@wur.nl> wrote:

 Hi Marcelo,

 I don't see anything in raster or rgdal that would do that directly.
 However, this can be done I believe using gdal_translate / gdal_edit (for
 adding the GCPs to the dataset) + gdalwarp (for warping). Both
 gdal_translate and gdalwarp utilities have wrappers in the gdalUtils
 package.

 Cheers,
 Loïc Dutrieux


 On 08/17/2015 11:43 AM, Marcelo Kittlein wrote:


 Hi all

 I want to change a MSS landsat image which has bad georeferencing.

 I have a grid of points with correct georeferences and want to use them
 to reproject the image to a correct reference system.

 Is there some way to this in R using raster or other package?

 in idrisi I used to resample the image using a correspondence file
 (.cor) with xold yold xnew ynew values.

 I wonder if there is something similar I can do in R? Thanks in advance

 Marcelo Kittlein

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo


 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] imprecise location extraction

2015-06-25 Thread Dominik Schneider
Hi,
I'm working on interpolating spatial point data to a grid, and as I was
troubleshooting I found that the gridded locations I am predicting to are
not what I expected. In particular, when I extracted the locations from
the raster grid to get the coordinates of the cells containing the stations,
the latitudes differed by more than half the cell resolution.
From some visual inspection, it appears that the station points sit right at
the edge of two cells (although visually they are definitely in neighboring
cells). So is this difference a function of floating-point precision, or is
something else going on?
Thanks for reading!


library(raster)
library(ggplot2)

res=15/3600
lat=seq(43.75-res/2,33+res/2,-15/3600)
long=seq(-112.25+res/2,-104.125-res/2,15/3600)
ll=expand.grid(long,lat)
rlong=raster(matrix(ll$Var1,byrow=T,nrow=2580))
rlat=raster(matrix(ll$Var2,byrow=T,nrow=2580))
coords=stack(rlong,rlat)
projection(coords)='+init=epsg:4326'
extent(coords)=c(-112.25,-104.125,33,43.75)
coords


> dput(snotellocs.sub)
new("SpatialPointsDataFrame"
, data = structure(list(network = c("SNTL", "SNTL"), State = c("AZ",
"AZ"
), Station_ID = c("11P08S", "10S01S"), Site_ID = c(927L, 877L
), Site_Name = c("SNOWSLIDE CANYON", "WORKMAN CREEK"), Elevation_ft =
c(9730L,
6900L), Elevation_m = c(2966L, 2103L)), .Names = c("network",
"State", "Station_ID", "Site_ID", "Site_Name", "Elevation_ft",
"Elevation_m"), row.names = c(80L, 83L), class = "data.frame")
, coords.nrs = c(7L, 6L)
, coords = structure(c(-111.65058, -110.91773, 35.3416, 33.81242), .Dim
= c(2L,
2L), .Dimnames = list(NULL, c("Longitude", "Latitude")))
, bbox = structure(c(-111.65058, 33.81242, -110.91773, 35.3416), .Dim =
c(2L,
2L), .Dimnames = list(c("Longitude", "Latitude"), c("min", "max"
)))
, proj4string = new("CRS"
, projargs = "+proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0"
)
)

tmp=data.frame(coordinates(snotellocs.sub),extract(coords,snotellocs.sub))

ggplot()+
geom_raster(data=coord.df,aes(x=Long,y=Lat,fill=rnorm(nrow(coord.df))))+
geom_point(data=tmp,aes(x=Longitude,y=Latitude),shape=1)+
geom_point(data=tmp,aes(x=layer.1,y=layer.2),shape=3)+
# coord_cartesian(ylim=c(35.32,35.35),xlim=c(-111.6490,-111.653))
coord_cartesian(ylim=c(33.8,33.82),xlim=c(-110.925,-110.91))
## each coord_cartesian is for a different point


 sessionInfo()
R version 3.2.0 (2015-04-16)
Platform: x86_64-apple-darwin13.4.0 (64-bit)
Running under: OS X 10.9.5 (Mavericks)

locale:
[1] en_US/en_US/en_US/C/en_US/en_US

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] ggplot2_1.0.1 raster_2.3-40 sp_1.1-0

loaded via a namespace (and not attached):
 [1] Rcpp_0.11.6  lattice_0.20-31  digest_0.6.8 MASS_7.3-40
 [5] grid_3.2.0   plyr_1.8.2   gtable_0.1.2 magrittr_1.5
 [9] scales_0.2.4 stringi_0.4-1reshape2_1.4.1   proto_0.3-10
[13] tools_3.2.0  stringr_1.0.0munsell_0.4.2colorspace_1.2-6

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Plotting boxplots

2015-06-10 Thread Dominik Schneider
Josh -
ggplot doesn't work with the spatial classes of data. Try doing
orf.df=as.data.frame(orf) first and then plotting orf.df.

Also, you have at least one typo - it should be geom_poin*t*.
I always get confused with the qplot syntax, so I'm not sure the rest is
right. I think ggplot()+ is much easier syntax.
ds
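A minimal sketch of both suggestions together (a plain data frame stands in for `as.data.frame(orf)`; the column names follow Josh's call and are assumptions):

```r
library(ggplot2)

## stand-in for as.data.frame(orf); column names taken from Josh's qplot call
set.seed(1)
orf.df <- data.frame(Region.Coding = rep(1:8, each = 10),
                     Nettles       = sample(c("yes", "no"), 80, replace = TRUE),
                     Cases2012     = rpois(80, 5))

## boxplots of cases per region, filled by the Nettles factor
p <- ggplot(orf.df, aes(x = factor(Region.Coding), y = Cases2012,
                        fill = factor(Nettles))) +
  geom_boxplot() +
  labs(title = "2012 Cases by Nettles and Regions",
       x = "Regions", y = "Number of cases")
p
```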




On Wed, Jun 10, 2015 at 9:35 AM, Joshua Onyango via R-sig-Geo 
r-sig-geo@r-project.org wrote:

 Hello
 I am trying to plot some box plots in ggplot2 for disease cases against a
 risk factor where this should appear in 8 coded regions. However when I try
 run this in R I keep getting this error message: ggplot2 doesn't know how
 to deal with data of class SpatialPointsDataFrame.

  Here is the script:
 qplot(factor(Region.Coding), Cases2012, data=orf, geom_poin=c("boxplot",
 "jitter"),  main="2012 Cases by Nettles and
 Regions", fill=factor(Nettles),  xlab="Regions", ylab="Number of cases")
 Any help will be highly appreciated
 Thanks
 Josh

 [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] inconsistent as.data.frame(SpatialPointsDF)

2015-03-20 Thread Dominik Schneider
Thanks for the update.
FWIW, I actually liked the behavior of changing the coordinate names to x and y
when converting from geographic to projected coordinates, and
vice versa would be convenient too, but I will make a habit of using
coordinates() and coordnames() instead.
ds
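The habit described above can be sketched like this (base sp only, so the result does not depend on whether raster is attached; object names follow Edzer's example from the thread):

```r
library(sp)

df <- data.frame(Longitude = 1:2, Latitude = 2:1, z = 3:4)
coordinates(df) <- ~Longitude + Latitude

## build the data.frame explicitly instead of relying on as.data.frame(),
## whose S4 dispatch can change once another package is attached
out <- data.frame(coordinates(df), df@data)
names(out)[1:2] <- coordnames(df)   # keeps "Longitude" "Latitude"
out
```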


Dominik Schneider
o 303.735.6296 | c 518.956.3978


On Fri, Mar 20, 2015 at 1:39 PM, Robert J. Hijmans r.hijm...@gmail.com
wrote:

 The raster package 'enhances' as.data.frame for SpatialLines* and
 SpatialPolygons*. I am not sure why there is also a function for
 SpatialPoints* as it does not add anything, except for setting the
 names to 'x' and 'y'. Perhaps that is what I did it for, I cannot
 remember. But clearly it is not desirable to get different results
 depending on when raster is loaded or not, so I have removed it. Sorry
 for causing trouble.
 Robert

 On Fri, Mar 20, 2015 at 11:39 AM, Frede Aakmann Tøgersen
 fr...@vestas.com wrote:
  Hi
 
  I can reproduce this using Edzer's example. See below for output from R.
 Also notice the naming of the coordinate columns to x and y in the S4
 method of the raster package's function as.data.frame! Isn't that quite
 unusual? I have maintainer on CC.
 
  library(sp)
  df = data.frame(Longitude=1:2, Latitude=2:1, z = 3:4)
  df1 = df
  coordinates(df1) = ~Longitude + Latitude
  as.data.frame(df1)
  Longitude Latitude z
  1 1 2 3
  2 2 1 4
  library(raster)
  search()
   [1] ".GlobalEnv"        "package:raster"    "package:sp"
   [4] "ESSR"              "package:stats"     "package:graphics"
   [7] "package:grDevices" "package:utils"     "package:datasets"
  [10] "package:methods"   "Autoloads"         "package:base"
  as.data.frame(df1)
x y z
  1 1 2 3
  2 2 1 4
  showMethods(as.data.frame)
  Function: as.data.frame (package base)
  x=ANY
  x=data.frame
  (inherited from: x=ANY)
  x=Raster
  x=SpatialLines
  x=SpatialPoints
  x=SpatialPointsDataFrame
  (inherited from: x=SpatialPoints)
  x=SpatialPolygons
 
  selectMethod(as.data.frame, SpatialPointsDataFrame)
  Method Definition:
 
  function (x, row.names = NULL, optional = FALSE, ...)
  {
  .local <- function (x, row.names = NULL, optional = FALSE,
  xy = TRUE, ...)
  {
  if (!xy) {
  if (.hasSlot(x, "data")) {
  return(x@data)
  }
  else {
  return(NULL)
  }
  }
  nobj <- length(x)
  d <- coordinates(x)
  if (.hasSlot(x, "data")) {
  d <- cbind(d, x@data)
  }
  colnames(d)[1:2] <- c("x", "y")
  rownames(d) <- NULL
  as.data.frame(d, row.names = row.names, optional = optional,
  ...)
  }
  .local(x, row.names, optional, ...)
  }
  bytecode: 0x0d037260
  environment: namespace:raster
 
  Signatures:
  x
  target  SpatialPointsDataFrame
  defined SpatialPoints
 
 
 
 
  Yours sincerely / Med venlig hilsen
 
 
  Frede Aakmann Tøgersen
  Specialist, M.Sc., Ph.D.
  Plant Performance  Modeling
 
  Technology  Service Solutions
  T +45 9730 5135
  M +45 2547 6050
  fr...@vestas.com
  http://www.vestas.com
 
  Company reg. name: Vestas Wind Systems A/S
  This e-mail is subject to our e-mail disclaimer statement.
  Please refer to www.vestas.com/legal/notice
  If you have received this e-mail in error please contact the sender.
 
  -Original Message-
  From: R-sig-Geo [mailto:r-sig-geo-boun...@r-project.org] On Behalf Of
  dschneiderch
  Sent: 20. marts 2015 18:19
  To: r-sig-geo@r-project.org
  Subject: Re: [R-sig-Geo] inconsistent as.data.frame(SpatialPointsDF)
 
  I just went through my packages, iteratively unloading each one with:
  detach('package:raster')
  and found that if I unload the raster package then the expected
 behavior of
  keeping the existing coordnames is achieved.
 
  now why that differs in my script I have no idea.
 
   coordnames(snotellocs)
  [1] "Longitude" "Latitude"
   detach('package:raster')
   head(as.data.frame(snotellocs))
network State Station_ID Site_ID   Site_Name Latitude Longitude
  1SNTLAZ 11R06S 308 BAKER BUTTE 34.45660 -111.4064
  2SNTLAZ 11R07S1140 BAKER BUTTE SMT 34.45547 -111.3827
  3SNTLAZ 09S01S 310   BALDY 33.97883 -109.5034
  4SNTLAZ 09S06S 902 BEAVER HEAD 33.69144 -109.2166
  5SNTLAZ 09N05S1143   BEAVER SPRING 36.32678 -109.0571
  6SNTLAZ 12P01S1139   CHALENDER 35.26247 -112.0623
Elevation_ft Elevation_m
  1 73002225
  2 77002347
  3 91252781
  4 79902435
  5 92002804
  6 71002164
  
 
  Dominik Schneider
  o 303.735.6296 | c 518.956.3978
 
 
  On Fri, Mar 20, 2015 at 11:02 AM, dschneiderch [via R-sig-geo] 
  ml-node+s2731867n758792...@n2.nabble.com wrote:
 
   Edzer - Looks like we posted at the same time.
   In my example my coordinates

[R-sig-Geo] stack many files without loading into memory

2015-02-04 Thread Dominik Schneider
Hi -
I have some data on a server but would like to bring them local in a
somewhat compressed format that is still easy to access.

/Volumes/hD/2012 - 100 geotiffs
~/project/data/ - store those geotiffs here without needing server access.

untested, I think I could do something like:
s=stack()
writeRaster(s,'2012stack')
fn=list.files('/Volumes/hD/2012',pattern='*.tif',full.names=T)
lapply(fn,function(f){
s=stack('2012stack')
r=raster(f)
names(r)=gsub(pattern='.tif',replacement='',basename(f))
s=addLayer(s,r)
writeRaster(s,'2012stack')
})
Or is it better to save to a .RData?
Is there a better way that doesn't require me to loop through each geotiff,
since I can't load it all into memory?
Thanks
Dominik
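An untested sketch of an alternative that avoids the loop entirely (assumes the raster package; `stack()` accepts a vector of filenames without reading the values, and one `writeRaster()` call then copies all layers into a single local file). Temp-dir demo rasters in raster's native .grd format stand in for the geotiffs on the server, so the sketch has no rgdal dependency; with GeoTIFFs the calls are identical:

```r
library(raster)

## demo inputs standing in for the 100 geotiffs in /Volumes/hD/2012
d <- file.path(tempdir(), "demo2012")
dir.create(d, showWarnings = FALSE)
for (i in 1:3)
  writeRaster(raster(matrix(runif(100), 10)),
              file.path(d, sprintf("layer%d.grd", i)), overwrite = TRUE)

fn <- list.files(d, pattern = "^layer.*\\.grd$", full.names = TRUE)
s  <- stack(fn)                       # layers still just point at the files
names(s) <- gsub("\\.grd$", "", basename(fn))
## one write copies all values into a single local multi-layer file,
## so the result no longer depends on the original location
local_copy <- writeRaster(s, file.path(d, "stack2012.grd"), overwrite = TRUE)
```

Writing the stack copies the cell values, which is what breaks the link back to the server.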

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] stack many files without loading into memory

2015-02-04 Thread Dominik Schneider
Wouldn't that keep the link to the server on which they are stored now?

Dominik Schneider
o 303.735.6296 | c 518.956.3978


On Wed, Feb 4, 2015 at 12:50 PM, Michael Sumner mdsum...@gmail.com wrote:

 Why not stack(fn)

 ?

 On Thu, 5 Feb 2015 06:41 Dominik Schneider dominik.schnei...@colorado.edu
 wrote:

 Hi -
 I have some data on a server but would like to bring them local in a
 somewhat compressed format that is still easy to access.

 /Volumes/hD/2012 - 100 geotiffs
 ~/project/data/ - store those geotiffs here without needing server
 access.

 untested, I think I could do something like:
 s=stack()
 writeRaster(s,'2012stack')
 fn=list.files('/Volumes/hD/2012',pattern='*.tif',full.names=T)
 lapply(fn,function(f){
 s=stack('2012stack')
 r=raster(f)
 names(r)=gsub(pattern='.tif',replacement='',basename(f))
 s=addLayer(s,r)
 writeRaster(s,'2012stack')
 })
 Or is it better to save to a .RData?
 Is there a better way that doesn't require me to loop through each geotiff,
 since I can't load it all into memory?
 Thanks
 Dominik

 [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo



[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] extracting raster data with point and polygons

2014-07-22 Thread Dominik Schneider
Brilliant! I don't really need mask() at the end actually, and instead use
extract(zoneVariation,obs) to populate a data.frame.  Could you clarify
where you would use mask() to make this more efficient for larger rasters?
It seems this will always use all the cellnumbers. I thought maybe
mask(zones,obs), but then zonal() threw an error.
Dominik

Dominik Schneider
o 303.735.6296 | c 518.956.3978



On Tue, Jul 22, 2014 at 5:17 AM, Oscar Perpiñan oscar.perpi...@upm.es
wrote:

 Hello,

 If I understood you right, I think you can solve it using `zonal` and
 `mask`. This is my proposal:

 library(raster)

 ## Your code
 #define DEM

 dem2m=raster(x=matrix(runif(22500,min=2000,max=2500),nrow=150),crs='+proj=utm
 +zone=13 +datum=NAD83')
 extent(dem2m)=c(0,300,0,3000)
 #
 #define satellite image
 fsca.dem=raster(x=matrix(runif(100,0,100),nrow=10),crs='+proj=utm
 +zone=13 +datum=NAD83')
 extent(fsca.dem)=c(0,300,0,3000)

 #define point data
 obs=data.frame(x=runif(5,0,300),y=runif(5,0,300),sd=runif(5,0,5))
 coordinates(obs)=~x+y
 proj4string(obs)='+proj=utm +zone=13 +datum=NAD83'

 ## My proposal
 ## Create a Raster with the same resolution as `dem2m`, whose values are
 ## the cell numbers of `fsca.dem`
 xyDEM2m - xyFromCell(dem2m, cell = seq_len(ncell(dem2m)))
 zones - setValues(dem2m, cellFromXY(fsca.dem, xyDEM2m))
 ## Compute the cv for each zone. `zoneVariation` has the same
 ## resolution as `fsca.dem`
 zoneVariation - setValues(fsca.dem, zonal(dem2m, zones, fun = cv)[,2])
 ## We are only interested in those cells with observations
 varObs - mask(zoneVariation, obs)

 If you work with large rasters maybe it is better to apply mask before.
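One way to read "apply mask before" (my reading, not Oscar's code): restrict the work to the coarse cells that actually contain an observation, so the cv is never computed everywhere. A sketch, reusing the objects from the example above and assuming no NA cells so that `rasterToPolygons()` keeps cell order:

```r
library(raster)

## objects as in the example above (seeds/values illustrative)
set.seed(1)
dem2m <- raster(matrix(runif(22500, 2000, 2500), nrow = 150))
extent(dem2m) <- c(0, 300, 0, 3000)
fsca.dem <- raster(matrix(runif(100, 0, 100), nrow = 10))
extent(fsca.dem) <- c(0, 300, 0, 3000)
obs <- data.frame(x = runif(5, 0, 300), y = runif(5, 0, 300))

## coarse cells that actually contain an observation
obsCells <- unique(cellFromXY(fsca.dem, obs))

## polygons for just those cells, then cv of the fine pixels inside each
cv <- function(x, ...) 100 * sd(x, ...) / mean(x, ...)
obsPolys <- rasterToPolygons(fsca.dem)[obsCells, ]
varObs   <- extract(dem2m, obsPolys, fun = cv)
```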

 Best,

 Oscar.
 -
 Oscar Perpiñán Lamigueiro
 Dpto. Ingeniería Eléctrica (ETSIDI-UPM)
 Grupo de Sistemas Fotovoltaicos (IES-UPM)
 URL: http://oscarperpinan.github.io
 Twitter: @oscarperpinan


 2014-07-22 2:26 GMT+02:00 Dominik Schneider 
 dominik.schnei...@colorado.edu:
  Hi -
  I have a 2 meter DEM, a 30m satellite image and some point data.
 Basically,
  I would like to extract the 2m dem pixels that correspond with the 30m
  satellite data, but only for the satellite pixels that correspond with a
  point observation. i've been trying to get at the answer from the sp
  vignette 'over' but am stuck.
  an example:
  #define DEM
 
 dem2m=raster(x=matrix(runif(22500,min=2000,max=2500),nrow=150),crs='+proj=utm
  +zone=13 +datum=NAD83')
  extent(dem2m)=c(0,300,0,3000)
  #
  #define satellite image
  fsca.dem=raster(x=matrix(runif(100,0,100),nrow=10),crs='+proj=utm
 +zone=13
  +datum=NAD83')
  extent(fsca.dem)=c(0,300,0,3000)
  #
  #define point data
  obs=data.frame(x=runif(5,0,300),y=runif(5,0,300),sd=runif(5,0,5))
  coordinates(obs)=~x+y
  proj4string(obs)='+proj=utm +zone=13 +datum=NAD83'
  #
  #--- in my case, I resampled the sat image to the dem grid. but thats
  already taken care of in the example.
  ##dem.pe=projectExtent(dem2m,crs=projection(dem2m))
  ##res(dem.pe) - 30
  ##fsca.dem=projectRaster(fsca,dem.pe,method='ngb')
  #
  # This is where I am lost
  # --- I thought I'd use polygons to extract the dem data. The
 catch
  is, rather than extracting every polygon of data (which takes longer than
  I've been willing to wait given 2e6 pixels) I would like to subset this
  spatialpolygondf to the 5 polygons that have observational data. if the
  same polygon has multiple obs in it, it should be replicated.
  fsca.poly=rasterToPolygons(fsca.dem)
 
  #first i tried this but this extracts the fsca values. I need this too
  eventually but was hoping for a subsetted spatialpolygon.dataframe
  fsca.pts=over(ss.depth,fsca.poly)
 
  #second i tried this but the list has rownames that I can't interpret.
  (fsca cellnumbers?)
  fsca.pts.poly=over(ss.depth,fsca.poly,returnList=T)
  polynames=unlist(lapply(fsca.pts.poly,rownames))
 
  #but what i'd like to do is apply the cv function on the 225 2m elev
 pixels
  that correspond to the 30 fsca pixel for which i have an observation
 point.
  elev.cv=extract(dem2m,*y=fsca.poly.w.pts*,fun=cv)
 
  # as a note, the buffer option in extract would not work currently
 because
  the point data isn't necessarily in the middle dem pixel relative to the
  fsca pixel. so one solution might be to 1. subset the polygon dataframe
 2.
  use the cell center from the fsca pixel to extract from the dem raster
 with
  a buffer of 7 cells.
 
  anyway, i appreciate any help you can give, in particular with
 subsetting
  the spatialpolygondf.
  Thanks
  Dominik
 
 
  sessionInfo()
  R version 3.0.2 (2013-09-25)
  Platform: x86_64-apple-darwin10.8.0 (64-bit)
 
  locale:
  [1] C
 
  attached base packages:
  [1] stats graphics  grDevices utils datasets  methods   base
 
  other attached packages:
  [1] maptools_0.8-30reshape2_1.2.2 plyr_1.8
  rgdal_0.8-16
  [5] ggplot2_0.9.3.1RColorBrewer_1.0-5 scales_0.2.3
  raster_2.2-31
  [9] sp_1.0-15
 
  loaded

[R-sig-Geo] too many raster files open?

2014-07-01 Thread Dominik Schneider
Hi - I'm trying to process a large number of raster files and it is causing
an error after 243 files. Below is a simplified code snippet and the error
message. I tried doing closeAllConnections() but it didn't help so I'm not
sure what else needs to be done. There should really only ever be 1 raster
read connection open, 1 write raster connection and 4 different readOGR
connections open.

*ERROR:*
Error in .local(.Object, ...) :
  TIFFOpen:/Users/dosc3612/Documents/wwa/boulder2002133.tif: Too many open
files

Error in .rasterObjectFromFile(x, band = band, objecttype = "RasterLayer",
(from subset.modscag.R#21) :
  Cannot create a RasterLayer object from this file.

*CODE:*
domain=readOGR(pn,bn) #reading in a polygon shape file with which to crop
#the big raster

proc=function(fn,bd,dname){
r=raster(fn)
projection(r)=CRS('+proj=longlat +datum=NAD83')
r <- mask(r,water.mask,maskvalue=1)
r=crop(r,bd)
f=unlist(strsplit(fn,'/'))
f=f[length(f)]
print(paste0(outfile,dname,f))

writeRaster(r,filename=paste0(outfile,dname,f),format='GTiff',overwrite=T)
}

ffns=list.files(path='.',pattern=glob2rx('*.tif'),full.names=T)
lapply(ffns,proc,domain,'basin1')
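Separately from anything on the R side, the error string "Too many open files" is the operating system's per-process file-descriptor limit. Raising that limit in the shell that launches R is a common workaround (generic macOS/Linux shell usage, not specific to raster; the value 4096 is illustrative):

```shell
# check the current per-process open-file limit, then raise it for this
# shell session before launching R
ulimit -n          # often 256 on older macOS
ulimit -n 4096     # allow more files to be open at once
# R --vanilla      # start R from this same shell so it inherits the limit
```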


*SESSIONINFO:* sessionInfo()
R version 3.0.3 (2014-03-06)
Platform: x86_64-apple-darwin10.8.0 (64-bit)

locale:
[1] C

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] rgdal_0.8-11  raster_2.1-49 sp_1.0-13 plyr_1.8

loaded via a namespace (and not attached):
[1] compiler_3.0.3  grid_3.0.3  lattice_0.20-27 tools_3.0.3


*TRACEBACK:* traceback()
7: stop("Cannot create a RasterLayer object from this file.")
6: .rasterObjectFromFile(x, band = band, objecttype = "RasterLayer",
   ...)
5: .local(x, ...)
4: raster(fn)
3: raster(fn) at subset.modscag.R#14
2: FUN(c("/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002014.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002021.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002032.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002035.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002037.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002044.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002046.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002062.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002090.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002092.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002094.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002103.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002107.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002112.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002114.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002119.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002121.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002126.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002129.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002133.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002137.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002150.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002151.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002156.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002160.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002163.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002165.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002167.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002174.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002179.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002181.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002183.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002190.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002213.tif",
   "/Volumes/hydroData/WestUS_Data/UCO_FSCA//2002_finished_20140423_1538/2002226.tif")[[1L]], ...)
1: lapply(ffns, proc, domain_bld, boulder)

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org

[R-sig-Geo] cressman analysis

2014-06-13 Thread Dominik Schneider
Are there any implementations in R of the Cressman interpolation?
https://nsidc.org/data/docs/daac/nsidc0033_arctic_water_vapor/cressman_equations.html
I haven't found anything via rseek or google.
Thanks Dominik
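I'm not aware of a packaged implementation either; a single-pass Cressman weighting is short enough to sketch in base R (weights w = (R² − d²)/(R² + d²) for d < R, following the equations at the NSIDC link; the function and argument names are mine):

```r
## Single Cressman pass; obs_xy and grid_xy are two-column coordinate matrices
cressman <- function(obs_xy, obs_z, grid_xy, R) {
  apply(grid_xy, 1, function(g) {
    d2 <- (obs_xy[, 1] - g[1])^2 + (obs_xy[, 2] - g[2])^2
    w  <- ifelse(d2 < R^2, (R^2 - d2) / (R^2 + d2), 0)
    if (sum(w) == 0) NA_real_ else sum(w * obs_z) / sum(w)
  })
}

obs_xy <- cbind(c(0, 10), c(0, 0))   # two observations on the x-axis
obs_z  <- c(5, 9)
grid   <- cbind(c(0, 5), c(0, 0))    # one grid point on obs 1, one midway
est    <- cressman(obs_xy, obs_z, grid, R = 6)
est                                   # 5 at the first point, 7 midway
```

The full scheme iterates this pass with a shrinking radius R, updating a background field each time; the sketch shows only one pass.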

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] coordinate system in ncdf4

2013-09-03 Thread Dominik Schneider
Thanks for the tip about GDAL. I was checking the outfile with gdalinfo 1.9
and couldn't figure out why it wouldn't show the coordinate system.

I'll check out raster as an alternative to ncdf4.
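For reference, a sketch of doing this in ncdf4 itself via CF conventions: a dummy grid-mapping variable plus attributes. The CF attribute names (`grid_mapping_name`, `grid_mapping`, `Conventions`) are standard; the dimensions, variable names, and values below are illustrative:

```r
library(ncdf4)

fnc <- file.path(tempdir(), "example.nc")
lon <- ncdim_def("lon", "degrees_east",  seq(-112, -104, by = 0.5))  # 17 values
lat <- ncdim_def("lat", "degrees_north", seq(33, 43, by = 0.5))      # 21 values
swe <- ncvar_def("swe", "mm", list(lon, lat), missval = -9999)
crs <- ncvar_def("crs", "", list(), prec = "integer")  # dummy grid-mapping var

nc <- nc_create(fnc, list(swe, crs))
ncvar_put(nc, swe, matrix(0, 17, 21))
## CF-style description of the coordinate system
ncatt_put(nc, "crs", "grid_mapping_name", "latitude_longitude")
ncatt_put(nc, "swe", "grid_mapping", "crs")
ncatt_put(nc, 0, "Conventions", "CF-1.6")
nc_close(nc)
```

As Michael notes, whether downstream software honors these attributes depends on the GDAL/netCDF reader in use.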

Dominik Schneider
o 303.735.6296 | c 518.956.3978



On Fri, Aug 30, 2013 at 10:54 PM, Michael Sumner mdsum...@gmail.com wrote:

 It is possible, and one way to see an example would be to look in the
 raster package source.

 But, whether it will help depends on what the target software is.
 (GDAL 1.9.2 doesn't seem to understand the metadata written by raster
 for example.)

 On Sat, Aug 31, 2013 at 4:52 AM, Dominik Schneider
 dominik.schnei...@colorado.edu wrote:
  Is it possible to specify the coordinate system when creating a netcdf
 file
  with the ncdf4 package? I have not been able to see anything in the
  documentation.
  Thanks
 
  [[alternative HTML version deleted]]
 
  ___
  R-sig-Geo mailing list
  R-sig-Geo@r-project.org
  https://stat.ethz.ch/mailman/listinfo/r-sig-geo



 --
 Michael Sumner
 Hobart, Australia
 e-mail: mdsum...@gmail.com


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] coordinate system in ncdf4

2013-08-30 Thread Dominik Schneider
Is it possible to specify the coordinate system when creating a netcdf file
with the ncdf4 package? I have not been able to see anything in the
documentation.
Thanks

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] calculate moving window MAE for point and raster data.

2013-08-26 Thread Dominik Schneider
I would like some help with programming a moving-window comparison of spatial
point data vs raster data. To account for georeferencing errors in the
remote sensing data r, I'd like to calculate error statistics such as MAE in
250 m increments given a 500 m product. So in the example below, a 1 km x 1 km
grid, I would like to calculate the MAE for the 9 different possible pixel
locations given a 250 m shift (4 corners + the 5 locations corresponding to
the 'center plus sign').

require(raster)
require(sp)
x=runif(100,0,1000)
y=runif(100,0,1000)
z=runif(100)
df=data.frame(x,y,z)
coordinates(df)-~x+y
projection(df)-'+proj=utm +zone=13 +datum=WGS84'

r=raster(matrix(runif(4),nrow=2))
extent(r)=c(0,1000,0,1000)
projection(r)-'+proj=utm +zone=13 +datum=WGS84'

I was looking at focal in the raster package but that is for a window on
raster data. Googling also brought up caTools and runmean in particular but
I wasn't sure how to make that work since it works on vectors and isn't
inherently spatial. Thanks in advance


P.S. This is cross posted at
http://stackoverflow.com/questions/18424103/calculate-moving-window-mae-for-point-and-raster-data
but I haven't gotten any responses there.
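One brute-force sketch for the nine offsets described above (assumes raster's `shift()` and `extract()`; the point and raster objects mirror the snippet in the post, with illustrative random values):

```r
library(raster)

set.seed(42)
## point observations and a 500 m product on a 1 km x 1 km domain, as above
df <- data.frame(x = runif(100, 0, 1000), y = runif(100, 0, 1000),
                 z = runif(100))
r <- raster(matrix(runif(4), nrow = 2))
extent(r) <- c(0, 1000, 0, 1000)

## MAE of raster vs points for each of the 9 quarter-pixel offsets
offsets <- expand.grid(dx = c(-250, 0, 250), dy = c(-250, 0, 250))
mae <- apply(offsets, 1, function(o) {
  rs   <- shift(r, o[["dx"]], o[["dy"]])           # shifted copy of the grid
  pred <- extract(rs, df[, c("x", "y")])           # raster value at each point
  mean(abs(pred - df$z), na.rm = TRUE)             # points shifted off-grid -> NA
})
offsets[which.min(mae), ]   # the offset with the best agreement
```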

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] optimize idw power

2013-08-18 Thread Dominik Schneider
Jon -
Thank you for your response. I have 200 observations and 5 million points
onto which I would like to interpolate.  I've looked at the variograms for
each of my 39 timesteps and the range is consistently between 100 and 200
km (domain dimensions are 800x1200 km).
I originally wanted to use kriging because it takes into account clustering
of observations but found kriging to take too long computationally. I
settled on IDW for this reason. For the IDW power I was hoping for a precision of 0.1
(similar research has found 0.5-1.3 to be an optimal power). I also found
very little difference in my results after applying idw (idp=2) and kriging.
It's worth keeping in mind that I'm interpolating residuals from a
multiple linear regression, so I think the overall spatial pattern has
already been established - this could be a reason for the small difference
between idw and kriging. The patterns are quite different when plotted (idw
surface vs kriging surface), but I just haven't seen a big difference compared
with ground measurements.

I will work on this some more and try idw/kr with a 100km radius and if it
speeds up enough I'll give the optimization a try

Dominik Schneider
o 303.735.6296 | c 518.956.3978



On Fri, Aug 16, 2013 at 7:22 AM, Jon Olav Skoien 
jon.sko...@jrc.ec.europa.eu wrote:

  Hi,

 Do you have 5 million observations or do you want to interpolate to 5
 million points? The default kriging approach could probably be a lot faster
 if you compute the variogram based on a subset of the points if the number
 of observations is high or if you use a local kriging neighbourhood if the
 number of prediction locations is high.
 The question is also with what precision you want to find idp. Any
 optimization procedure will need a few iterations to find it, similar to
 the brute force approach. If you would rather go back to your optimization
 attempts, you should use krige.cv rather than idw. Slightly modified from
 estimateParameters for inverse distance in intamap, you probably want to
 minimize:

 sum(krige.cv(formula, observations, nmax = nmax, set = list(idp =
 idp))$residual^2)
 where observations is a Spatial*DataFrame with observations.

 You can also have a look at the following link suggesting alternative
 methods for one-variable optimization:
 http://www.r-bloggers.com/single-variable-optimization/
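 Jon's krige.cv expression, wrapped in optimize() as a sketch (gstat's built-in
 meuse data stands in for the observations; the interval, tol, and nmax values
 are illustrative):

```r
library(sp)
library(gstat)

data(meuse)
coordinates(meuse) <- ~x + y

## leave-one-out CV sum of squared residuals as a function of the IDW power
f <- function(idp, formula, obs, nmax = 20) {
  sum(krige.cv(formula, obs, nmax = nmax,
               set = list(idp = idp, debug = 0))$residual^2)
}

opt <- optimize(f, interval = c(0.5, 4), tol = 0.1,
                formula = log(zinc) ~ 1, obs = meuse)
opt$minimum   # cross-validated optimal power
```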

 Cheers,
 Jon


 On 13-Aug-13 18:59, Dominik Schneider wrote:

 Hi - Thanks for the tips on automap and intamap. I've compared kriging and
 idw and see very little benefit from kriging and kriging takes about 3x as
 long to run (15min vs 5 min or so) so I've decided to stick with IDW. I
 have a large domain with over 5 million points, so are there any methods
 implemented that are not brute force?

  Dominik Schneider
  o 303.735.6296 | c 518.956.3978



 On Tue, Aug 13, 2013 at 7:38 AM, Jon Olav Skoien 
 jon.sko...@jrc.ec.europa.eu wrote:

 Dominik,

 The brute force approach shown in the link from Tom has in the mean time
 been implemented in the intamap package. See the example for the function
 estimateParameters.

 Cheers,
 Jon


 On 13-Aug-13 1:54, Thomas Adams wrote:

 Dominik,

 Look here:

 http://r-sig-geo.2731867.n2.nabble.com/GSTAT-Optimize-power-value-for-IDW-td2765918.html

 Tom


 On Mon, Aug 12, 2013 at 5:17 PM, Dominik Schneider 
 dominik.schnei...@colorado.edu wrote:

  Hi,
 I'd like to optimize the IDW power, similar to what ArcGIS does. I found
 this old thread
 http://www.mail-archive.com/r-sig-geo@stat.math.ethz.ch/msg04011.html but
 was hoping someone could help adapt the solution. I am using



 idw(residuals(mdl.stepaic[[j]])~1,locations=swe.nozeros[swe.nozeros$mth==mth
  & swe.nozeros$yr==yr,],newdata=swe.grid)

 where mdl.stepaic[[j]] is a list of 39 regression equations.
 swe.nozeros is
 a spatialpointsdataframe of observations. swe.grid is a grid where I
 want
 new values predicted. I currently run this idw command through a loop
 of j
 with no problems but I'd like to optimize both within each equation
 represented by j and for the whole data set (so across the 39
 timesteps).

 Below is what I'm starting with but I don't really understand how to
 get it
 working. I appreciate your help.

 f= function(idp,formula,locations,newdata,...) {


 sum(idw(formula,locations,newdata),set=list(debug=0,idp=idp),...)$residuals**2
 }


 optimize(f,interval=seq(1.4,1.7,by=0.1),formula=residuals(mdl.stepaic[[j]])~1,locations=swe.nozeros[swe.nozeros$mth==mth
  & swe.nozeros$yr==yr,],newdata=swe.grid)

  [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo

  [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo



   --
 Jon Olav Skøien
 Joint Research Centre - European Commission
 Institute for Environment and Sustainability

Re: [R-sig-Geo] optimize idw power

2013-08-13 Thread Dominik Schneider
Hi - Thanks for the tips on automap and intamap. I've compared kriging and
idw and see very little benefit from kriging and kriging takes about 3x as
long to run (15min vs 5 min or so) so I've decided to stick with IDW. I
have a large domain with over 5 million points, so are there any methods
implemented that are not brute force?

Dominik Schneider
o 303.735.6296 | c 518.956.3978



On Tue, Aug 13, 2013 at 7:38 AM, Jon Olav Skoien 
jon.sko...@jrc.ec.europa.eu wrote:

 Dominik,

 The brute force approach shown in the link from Tom has in the mean time
 been implemented in the intamap package. See the example for the function
 estimateParameters.

 Cheers,
 Jon


 On 13-Aug-13 1:54, Thomas Adams wrote:

 Dominik,

 Look here:
 http://r-sig-geo.2731867.n2.nabble.com/GSTAT-Optimize-power-value-for-IDW-td2765918.html

 Tom


 On Mon, Aug 12, 2013 at 5:17 PM, Dominik Schneider 
 dominik.schnei...@colorado.edu wrote:

  Hi,
 I'd like to optimize the IDW power, similar to what ArcGIS does. I found
 this old thread
 http://www.mail-archive.com/r-sig-geo@stat.math.ethz.ch/msg04011.html but
 was hoping someone could help adapt the solution. I am using


 idw(residuals(mdl.stepaic[[j]])~1,locations=swe.nozeros[swe.nozeros$mth==mth
  & swe.nozeros$yr==yr,],newdata=swe.grid)

 where mdl.stepaic[[j]] is a list of 39 regression equations. swe.nozeros
 is
 a spatialpointsdataframe of observations. swe.grid is a grid where I want
 new values predicted. I currently run this idw command through a loop of
 j
 with no problems but I'd like to optimize both within each equation
 represented by j and for the whole data set (so across the 39 timesteps).

 Below is what I'm starting with but I don't really understand how to get
 it
 working. I appreciate your help.

 f= function(idp,formula,locations,newdata,...) {

 sum(idw(formula,locations,newdata),set=list(debug=0,idp=idp),...)$residuals**2
 }

 optimize(f,interval=seq(1.4,1.7,by=0.1),formula=residuals(mdl.stepaic[[j]])~1,locations=swe.nozeros[swe.nozeros$mth==mth
  & swe.nozeros$yr==yr,],newdata=swe.grid)

  [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo

  [[alternative HTML version deleted]]

 ___
 R-sig-Geo mailing list
 R-sig-Geo@r-project.org
 https://stat.ethz.ch/mailman/listinfo/r-sig-geo



 --
 Jon Olav Skøien
 Joint Research Centre - European Commission
 Institute for Environment and Sustainability (IES)
 Land Resource Management Unit

 Via Fermi 2749, TP 440,  I-21027 Ispra (VA), ITALY

 jon.sko...@jrc.ec.europa.eu
 Tel:  +39 0332 789206

 Disclaimer: Views expressed in this email are those of the individual and
 do not necessarily represent official views of the European Commission.




[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] optimize idw power

2013-08-12 Thread Dominik Schneider
Hi,
I'd like to optimize the IDW power, similar to what ArcGIS does. I found
this old thread
http://www.mail-archive.com/r-sig-geo@stat.math.ethz.ch/msg04011.html but
was hoping someone could help adapt the solution. I am using

idw(residuals(mdl.stepaic[[j]])~1,locations=swe.nozeros[swe.nozeros$mth==mth
 & swe.nozeros$yr==yr,],newdata=swe.grid)

where mdl.stepaic[[j]] is a list of 39 regression equations. swe.nozeros is
a spatialpointsdataframe of observations. swe.grid is a grid where I want
new values predicted. I currently run this idw command through a loop of j
with no problems but I'd like to optimize both within each equation
represented by j and for the whole data set (so across the 39 timesteps).

Below is what I'm starting with but I don't really understand how to get it
working. I appreciate your help.

f= function(idp,formula,locations,newdata,...) {
sum(idw(formula,locations,newdata),set=list(debug=0,idp=idp),...)$residuals**2
}
optimize(f,interval=seq(1.4,1.7,by=0.1),formula=residuals(mdl.stepaic[[j]])~1,locations=swe.nozeros[swe.nozeros$mth==mth
 & swe.nozeros$yr==yr,],newdata=swe.grid)

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] Trouble with universal kriging

2013-06-20 Thread Dominik Schneider
Hi, I am trying to follow the examples in  Bivand et al 'Applied Spatial
Data Analysis' and the examples at
http://casoilresource.lawr.ucdavis.edu/drupal/node/442 but I am frustrated.
I finally managed to get an ordinary kriging example to work, but with little
understanding of why/when it should work.
I posted some data: https://www.dropbox.com/s/rtg7gh67ziai3ec/mar2011.RData
Another item I've noticed: if I define the projection with
projection(object)='+proj=lonlat +datum=NAD83' for any of my spatial
objects, it won't work. It only works with an undefined projection. Any help
would be appreciated. Thanks.

load('mar2011.RData')

vt=variogram(snotel~recon,data=mar2011)

vt.fit=fit.variogram(vt,vgm(psill=0.025,nugget=0.015,model='Sph',range=2))

#plot(vt,vt.fit)

ok <- krige(snotel~1,locations=mar2011,newdata=swe.grid,model=vt.fit)

uk <- krige(snotel~mar2011$recon,locations=mar2011,newdata=swe.grid,model=
vt.fit)

#spplot(ok)
Dominik
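For reference, a sketch of the usual universal-kriging call pattern on gstat's built-in meuse data (standing in for mar2011/swe.grid; the variogram parameters are illustrative). The key point is that the covariate appears by bare name in the formula and must be a column of both the observations and newdata, i.e. `snotel ~ recon` with `recon` present in both objects, rather than `snotel ~ mar2011$recon`:

```r
library(sp)
library(gstat)

data(meuse); data(meuse.grid)
coordinates(meuse) <- ~x + y
coordinates(meuse.grid) <- ~x + y
gridded(meuse.grid) <- TRUE

## covariate (dist) referenced by bare name; it exists in BOTH the
## observations (meuse) and the prediction grid (meuse.grid)
v    <- variogram(log(zinc) ~ dist, data = meuse)
vfit <- fit.variogram(v, vgm(psill = 0.5, model = "Sph",
                             range = 900, nugget = 0.1))
uk   <- krige(log(zinc) ~ dist, locations = meuse,
              newdata = meuse.grid, model = vfit)
```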

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo