Re: [R-sig-Geo] How to create inward (shrinking) buffer zones with st_buffer()

2024-02-19 Thread Bede-Fazekas Ákos

Dear Xiang,

Sure, it is possible. I suggest using a projected CRS, e.g. EPSG:3857 
(Web Mercator), or turning off s2 geometry. If you turn off s2, then 
st_buffer() will interpret the dist parameter in degrees. Either way, if you 
choose a small enough (absolute) value, you'll get a non-empty geometry.


> sf_use_s2(FALSE)
Spherical geometry (s2) switched off
> st_buffer(a, -100)
dist is assumed to be in decimal degrees (arc_degrees).
Geometry set for 1 feature  (with 1 geometry empty)
Geometry type: POLYGON
Dimension: XY
Bounding box:  xmin: NA ymin: NA xmax: NA ymax: NA
Geodetic CRS:  WGS 84
POLYGON EMPTY
Warning message:
In st_buffer.sfc(a, -100) :
  st_buffer does not correctly buffer longitude/latitude data
> st_buffer(a, -0.001)
dist is assumed to be in decimal degrees (arc_degrees).
Geometry set for 1 feature
Geometry type: POLYGON
Dimension: XY
Bounding box:  xmin: -2.533411 ymin: 51.40595 xmax: -2.489763 ymax: 51.4315
Geodetic CRS:  WGS 84
POLYGON ((-2.533083 51.41747, -2.528692 51.4204...
Warning message:
In st_buffer.sfc(a, -0.001) :
  st_buffer does not correctly buffer longitude/latitude data

(I don't know why s2 can't deal with negative distance if the CRS is 
WGS-84.)
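For completeness, the projected-CRS route would look something like this 
(an untested sketch):
a_projected <- st_transform(a, 3857)
st_buffer(a_projected, -100)  # dist is now interpreted in metres
st_transform(st_buffer(a_projected, -100), 4326)  # back to WGS-84 if needed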


HTH,
Ákos

Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2024. 02. 19. 11:04, Xiang Ye via R-sig-Geo wrote:

Dear community,

I am learning some basic geometry operation functions of sf package including 
st_buffer().

It seems natural that if I provide a negative value to the dist argument in 
st_buffer(), I should expect an inward/shrinking buffer zone (I also followed 
this: 
https://gis.stackexchange.com/questions/392505/can-i-use-r-to-do-a-buffer-inside-polygons-shrink-polygons-negative-buffer).
However, it turns out that as soon as I provide a negative value, the output 
is an empty geometry:

library(sf)
library(spDataLarge)
st_geometry(bristol_zones[1, ]) -> a  # a is the exemplary data set


a

Geometry set for 1 feature
Geometry type: MULTIPOLYGON
Dimension: XY
Bounding box:  xmin: -2.534502 ymin: 51.40487 xmax: -2.488435 ymax: 51.43478
Geodetic CRS:  WGS 84
MULTIPOLYGON (((-2.510462 51.42878, -2.507985 5...

st_buffer(a, 100)

Geometry set for 1 feature
Geometry type: POLYGON
Dimension: XY
Bounding box:  xmin: -2.536248 ymin: 51.40393 xmax: -2.486907 ymax: 51.43598
Geodetic CRS:  WGS 84
POLYGON ((-2.517834 51.43188, -2.518218 51.4318...

st_buffer(a, -100)

Geometry set for 1 feature  (with 1 geometry empty)
Geometry type: POLYGON
Dimension: XY
Bounding box:  xmin: NA ymin: NA xmax: NA ymax: NA
Geodetic CRS:  WGS 84
POLYGON EMPTY

So I would like to know: is it possible to create inward buffer zones with 
st_buffer()? If st_buffer() is not designed to perform this, what is the best 
alternative?

Thank you, and have a great start of the week!

YE, Xiang
THINKING SPATIALLY.
Ph.D. in Spatial Statistics

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo




___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Plotting probability exceedance

2023-02-15 Thread Bede-Fazekas Ákos
Hello,
As far as I understand, for one point (i.e., one row of the 
sf/data.frame) you have several precipitation values, but you can calculate 
only one exceedance probability from these values. So an xy-plot is not the 
best choice for visualizing this kind of data: you'll have the 
same y value (probability) for several x values (precipitation). If this 
is not a problem for you, then I suggest using tidyr::pivot_longer() to 
transform the precipitation values from the several columns into one. Then 
you can draw the xy-plot with ggplot, base plot, or even lattice.
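A rough sketch with made-up object and column names (adjust them to your data): 
here the data.frame is called 'dat', the measurement columns are prec_1, prec_2, ... 
and 'exceedance_probability' is the column you computed with pnorm().
library(tidyr)
library(ggplot2)
dat_long <- pivot_longer(dat, cols = starts_with("prec_"),
                         names_to = "measurement", values_to = "precipitation")
ggplot(dat_long, aes(x = precipitation, y = exceedance_probability)) +
  geom_point()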
HTH,
Ákos
__
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2023.02.16. 2:52, rain1...@aim.com wrote:
> Hi Akos,
>
> Thank you so much for this suggestion! Indeed, I have 25 data points 
> in each column, and yes, the data are normally distributed. Using the 
> pnorm function is actually quite useful, as well. To that end, could 
> the values of exceedance probability from pnorm be somehow plotted 
> against their associated precipitation thresholds on an xy plot, for 
> example?
>
> Another idea that came to mind is the use of Probability Density 
> Functions, but can these really be used to graphically show exceedance 
> probabilities?
>
> Thanks, again!
>
> -Original Message-
> From: Bede-Fazekas Ákos 
> To: r-sig-geo@r-project.org
> Sent: Wed, Feb 15, 2023 1:59 am
> Subject: Re: [R-sig-Geo] Plotting probability exceedance
>
> Hello,
>
> You should know or make assumptions on the distribution of the
> precipitation. Let's say it is normally distributed (i.e. bell-shaped).
> Then you can calculate the probability of exceeding the quantile q by
> pnorm(q, mean, sd, lower.tail = FALSE).
> If you have several spatial points and a lot of measurements (stored in
> columns of the sf/data.frame) for each of the points, then use
> apply(X, MARGIN = 1, FUN = function(measurements) {return(pnorm(q, mean,
> sd, lower.tail = FALSE))})
> and you can display the probabilities in a map.
>
> HTH,
> Ákos
> __
> Ákos Bede-Fazekas
> Centre for Ecological Research, Hungary
>
> On 2023.02.15. 1:28, rain1290--- via R-sig-Geo wrote:
> > Hi there,
> > I have climate data pertaining to extreme precipitation, as well as 
> carbon emissions associated with those precipitation values in a 
> dataframe.
> > The goal of my analysis would be to determine the probability of 
> exceeding specific thresholds of precipitation extremes, as well as 
> showing this graphically (I am imagining this by placing extreme 
> precipitation on the the x-axis and exceedance probabilities on the 
> y-axis).
> > My question is if anyone has an idea how to approach this, or a good 
> starting place? I have looked online, but there is nothing specific to 
> really draw on.
> > Thank you for your time, and I look forward to your response!
> >     [[alternative HTML version deleted]]
> >
> > ___
> > R-sig-Geo mailing list
> > R-sig-Geo@r-project.org
> > https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
>     [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Plotting probability exceedance

2023-02-14 Thread Bede-Fazekas Ákos
Hello,

You should know or make assumptions on the distribution of the 
precipitation. Let's say it is normally distributed (i.e. bell-shaped). 
Then you can calculate the probability of exceeding the quantile q by
pnorm(q, mean, sd, lower.tail = FALSE).
If you have several spatial points and a lot of measurements (stored in 
columns of the sf/data.frame) for each of the points, then use
apply(X, MARGIN = 1, FUN = function(measurements) {return(pnorm(q, mean, 
sd, lower.tail = FALSE))})
and you can display the probabilities in a map.
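A self-contained toy example of the idea (made-up numbers; here the mean and sd 
are estimated from each point's own measurements):
set.seed(1)
X <- data.frame(m1 = rnorm(5, mean = 100, sd = 20),
                m2 = rnorm(5, mean = 100, sd = 20),
                m3 = rnorm(5, mean = 100, sd = 20))  # 5 points, 3 measurements each
q <- 120  # precipitation threshold of interest
apply(X, MARGIN = 1, FUN = function(measurements) {
  pnorm(q, mean = mean(measurements), sd = sd(measurements), lower.tail = FALSE)
})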

HTH,
Ákos
__
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2023.02.15. 1:28, rain1290--- via R-sig-Geo wrote:
> Hi there,
> I have climate data pertaining to extreme precipitation, as well as carbon 
> emissions associated with those precipitation values in a dataframe.
> The goal of my analysis would be to determine the probability of exceeding 
> specific thresholds of precipitation extremes, as well as showing this 
> graphically (I am imagining this by placing extreme precipitation on the the 
> x-axis and exceedance probabilities on the y-axis).
> My question is if anyone has an idea how to approach this, or a good starting 
> place? I have looked online, but there is nothing specific to really draw on.
> Thank you for your time, and I look forward to your response!
>   [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] st_intersection produce Geometry type: GEOMETRY instead of Geometry type: POLYGON

2023-02-10 Thread Bede-Fazekas Ákos

Dear Manuel,

technically, the result of st_intersection(x, y), where both x and y are 
POLYGONs, can be POINT, LINESTRING, POLYGON or GEOMETRY. The 
result is GEOMETRY if the geometry type is not the same for all the 
resulting features (e.g. POLYGON + POINT).

You can subset the result in this way:
g_25000_c_polygons_only <- g_25000_c[st_is(x = g_25000_c, type = "POLYGON"), ]
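Alternatively (a sketch I have not run on your data), st_collection_extract() 
keeps only the geometries of a given type:
g_25000_c_polygons_only <- st_collection_extract(g_25000_c, type = "POLYGON")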

HTH,
Ákos
___
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2023.02.10. 18:32, Manuel Spínola wrote:

Dear list members,

I am trying to "crop" a polygon (grid) with a polygon, but the result is an
sf object Geometry type: GEOMETRY, instead of an sf object Geometry type:
POLYGON.

How can I obtain an sf POLYGON?

nc = st_read(system.file("shape/nc.shp", package="sf"))



nc <- st_transform(nc, 3857)



g_5 <- st_make_grid(nc, cellsize = 5) |> st_as_sf()



g_5 <- g_5[nc, ]



g_5_d <- st_union(g_5)



g_25000 = st_make_grid(g_5_d, cellsize = 25000) |> st_as_sf()



g_25000 # Geometry type: POLYGON



g_25000_c <- st_intersection(g_25000, g_5_d)



g_25000_c # Geometry type: GEOMETRY






___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] geodata package: CMIP6 climate model data downloading error

2023-01-25 Thread Bede-Fazekas Ákos
Dear Manuel,

for me this link 
(https://geodata.ucdavis.edu/cmip6/30s/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040.tif)
works perfectly in the Firefox browser: it starts downloading the GeoTIFF file.
I think it should work with download.file() from R as well.
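Something like this (untested; adjust destfile as you like):
url <- "https://geodata.ucdavis.edu/cmip6/30s/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040.tif"
download.file(url, destfile = "wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040.tif", mode = "wb")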

Alternatively, you can try the package "easyclimate". I have no experience 
with this package; I have only read the scientific paper about it: 
https://www.sciencedirect.com/science/article/pii/S1364815223000130?dgcid=rss_sd_all
Please note that it is about daily climate instead of long-term averages.

HTH,
Ákos
_
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2023.01.25. 15:48, Manuel Spínola wrote:
> Thank you very much Ákos.
>
> It didn't work either.
>
> Is there any other package that allows to download future climate 
> change scenarios?
>
> Manuel
>
>
>
> On Wed, 25 Jan 2023 at 1:27, Bede-Fazekas Ákos 
> () wrote:
>
> Dear Manuel,
>
> There are some broken links in the WorldClim website. If you find
> one,
> you can write to i...@worldclim.org. Although I have negative
> experiences, you should try this way.
> Anyway, the global, non-tiled raster can be downloaded from this link:
> 
> https://geodata.ucdavis.edu/cmip6/30s/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040.tif
>
> Have a nice week,
> Ákos
>
> _
> Ákos Bede-Fazekas
> Centre for Ecological Research, Hungary
>
> On 2023.01.24. 23:29, Manuel Spínola wrote:
> > Dear list members,
> >
> > I am trying to download CMIP6 climate model data using the R package
> > geodata but I got this error message:
> >
> > bio <- cmip6_tile(lon = -84, lat = 10,"ACCESS-CM2", "126",
> "2021-2040",
> > var="bioc", path = "climate_ckange")
> > trying URL '
> >
> 
> https://geodata.ucdavis.edu/cmip6/tiles/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040_tile-28.tif
> > '
> > Error in utils::download.file(url = url, destfile = filename,
> quiet =
> > quiet,  :
> >    cannot open URL '
> >
> 
> https://geodata.ucdavis.edu/cmip6/tiles/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040_tile-28.tif
> > '
> > download failed
> >
> > I cannot even download the data using the worldclim website.
> >
> >
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
>
>
> -- 
> *Manuel Spínola, Ph.D.*
> Instituto Internacional en Conservación y Manejo de Vida Silvestre
> Universidad Nacional
> Apartado 1350-3000
> Heredia
> COSTA RICA
> mspin...@una.cr <mailto:mspin...@una.ac.cr>
> mspinol...@gmail.com
> Teléfono: (506) 8706 - 4662
> Institutional website: ICOMVIS 
> <http://www.icomvis.una.ac.cr/index.php/manuel>
> Blog sobre Ciencia de Datos: https://mspinola-ciencia-de-datos.netlify.app

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] geodata package: CMIP6 climate model data downloading error

2023-01-24 Thread Bede-Fazekas Ákos

Dear Manuel,

There are some broken links in the WorldClim website. If you find one, 
you can write to i...@worldclim.org. Although I have negative 
experiences, you should try this way.

Anyway, the global, non-tiled raster can be downloaded from this link:
https://geodata.ucdavis.edu/cmip6/30s/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040.tif

Have a nice week,
Ákos

_
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2023.01.24. 23:29, Manuel Spínola wrote:

Dear list members,

I am trying to download CMIP6 climate model data using the R package
geodata but I got this error message:

bio <- cmip6_tile(lon = -84, lat = 10,"ACCESS-CM2", "126", "2021-2040",
var="bioc", path = "climate_ckange")
trying URL '
https://geodata.ucdavis.edu/cmip6/tiles/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040_tile-28.tif
'
Error in utils::download.file(url = url, destfile = filename, quiet =
quiet,  :
   cannot open URL '
https://geodata.ucdavis.edu/cmip6/tiles/ACCESS-CM2/ssp126/wc2.1_30s_bioc_ACCESS-CM2_ssp126_2021-2040_tile-28.tif
'
download failed

I cannot even download the data using the worldclim website.




___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Selecting a range of values in a specific column for R ggplot

2022-10-13 Thread Bede-Fazekas Ákos

Hello,
the aesthetic must be either length 1 or the same length as the data, so 
subset the data itself instead of the aesthetic. Try
newplot <- ggplot(data = GEV[1:4, ], aes(x = RL45)) + geom_density(color 
= "midnightblue") + xlab("Location (mm/day") + ggtitle("Global Location 
under RCP4.5") + xlim(300, 350)

HTH,
Ákos
---
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2022.10.13. 18:27, rain1290--- via R-sig-Geo wrote:

I am trying to select a range of values (i.e. the first 4 values) in a specific column 
from a table. The column is called "RL45". I tried the following code to create 
a plot in ggplot:
newplot <- ggplot(data = GEV, aes(x = RL45[1:4])) + geom_density(color = "midnightblue") + 
xlab("Location (mm/day") + ggtitle("Global Location under RCP4.5") + xlim(300, 350)

This results in this strange error:

Error: Aesthetics must be either length 1 or the same as the data (34): x

This is odd, as I selected the first 4 values in that "RL45" column.
Any thoughts would be greatly appreciated!
Thank you,
[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] terra::app

2022-09-29 Thread Bede-Fazekas Ákos

Dear Jonathan,

Function get.snow() is not vectorized, i.e. it does not return a vector 
of the same length as its input. According to the help of app(), "The 
function should return a vector or matrix that is divisible by ncell(x).".
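A possible fix (a sketch, not thoroughly tested) is to vectorize get.snow(), 
e.g. with sapply(), so that it returns one value per input cell:
get.snow.vectorized <- function(j) {
  sapply(j, function(cell_value) {
    k <- as.numeric(intToBits(cell_value)[10:11])
    as.numeric(k %*% 1:2)
  })
}
new.raster <- app(the.raster, get.snow.vectorized)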


HTH,
Ákos Bede-Fazekas
Centre for Ecological Research
Hungary

On 2022.09.29. 5:27, Thayn, Jonathan wrote:

I keep getting the following message when using terra::app: "Error: [app] the 
number of values returned by 'fun' is not appropriate". I can't figure out what 
I'm doing wrong. A sample code is below:




library(terra)
the.raster <- rast(ncol=4,nrow=4)
the.raster[] <- sample(0:65535,16)

get.snow <- function(j){
k <- as.numeric(intToBits(j)[10:11])
m <- k%*%1:2
return(m)
}

new.raster <- app(the.raster,get.snow)


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] How to separate a set of geographical regions to multiple groups that are similar to each other but also maintains geographic contiguity?

2022-07-08 Thread Bede-Fazekas Ákos

Dear Danlin,

I think you need SKATER (Spatial 'K'luster Analysis by Tree Edge 
Removal) algorithm that is implemented in package 'spdep' (function 
skater()) and does spatially constrained clustering. The results of 
SKATER are contiguous regions formed by more or less similar neighboring 
polygons. It was published by Assuncao et al. (2006). Here you can find 
a tutorial:

https://geodacenter.github.io/tutorials/spatial_cluster/skater.html
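The usual spdep workflow is roughly the following (a rough, untested sketch based 
on that tutorial; 'geo_sf' stands for your polygon layer kept as an sf object, 
i.e. read_sf() without as.data.frame(), and A1...A3 stand for your attribute columns):
library(sf)
library(spdep)
nb <- poly2nb(geo_sf)                                   # contiguity neighbours
costs <- nbcosts(nb, data = st_drop_geometry(geo_sf)[, c("A1", "A2", "A3")])
lw <- nb2listw(nb, glist = costs, style = "B")
mst <- mstree(lw)                                       # minimum spanning tree
cl <- skater(edges = mst[, 1:2],
             data = st_drop_geometry(geo_sf)[, c("A1", "A2", "A3")],
             ncuts = 4)                                 # 4 cuts -> 5 regions
geo_sf$region <- cl$groups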

HTH,
Ákos
_
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary


On 2022.07.08. 6:05, Danlin Yu wrote:

Dear List members:


I have recently attempted to do a regionalization analysis with a group
of geographic regions, each contains multiple attributes (A1, A2, A3,
...). The goal is not like a regular regionalization problem (such as
K-means) in which you define groups with minimal within group
dissimilarity but maximal between group dissimilarity.


My regionalization is the opposite, I want the groups to be as similar
as possible (although within group does not have to be as dissimilar as
possible, but that is of less concern) in terms of means, variance, and
other statistics. I ran into the minDiff package and its successor
anticlust package in R, and it is able to do the job wonderfully except
for one problem: since this is a regionalization problem, I would really
want the final groups to be geographically connected (spatially
constrained). Results from minDiff/anticlust, however, show the
different groups are mixed with one another all over the map. Here is a
sample code:


A dataframe contains the geographic units and attributes is read from a
shapefile and stored in geo.df.


geo.df <- as.data.frame(read_sf(dsn = getwd(), layer = "geolayer",
stringsAsFactors = FALSE))
geo.df$class <- anticlustering(geo.df[, c("A1", "A2", "A3", "A4", ..., "An")],
K = 5, objective = "variance", standardize = TRUE)

I've tried to include coordinates in the list of attributes (A1, A2,
..., An), pairwise distances, but none worked. I always ended up with
well separated groups, but all mixed with one another in the geographic
space.


Any pointers on how to proceed from here? Any hints will be greatly
appreciated.


Thank you all in advance.


Best,

Danlin Yu



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] leaflet in shiny error message

2022-01-03 Thread Bede-Fazekas Ákos

Dear Basile,

the %>% operator is accidentally commented out! Change
fillOpacity = 0.7)#%>%
to
fillOpacity = 0.7) %>%
and then setView() will get its first parameter called "map".
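That is, the end of the renderLeaflet() pipeline becomes:
fillOpacity = 0.7) %>%
setView(lat = 44.2, lng = 5.9, zoom = 5)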

HTH,
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2022.01.03. 1:49, basile.lenormand via R-sig-Geo wrote:

Hello everyone!

I am trying to create a shiny app with leaflet stuff in it. I have a problem 
when I am displaying the application, I got this error message :

"Error in dispatch: argument "map" is missing, with no default".

I searched for clues on the net and found some working examples, but I do not 
see where I am going wrong. I double-checked the brackets, tried to change 
the name of the map, put in the name of the spatial data frame that I am using, 
etc.

If someone can give me some clues I will be really glad.
Have a great day,
thank you for your time,
Basile.

Here is my code:

ui <- fluidPage(
titlePanel(""),

sidebarLayout(
sidebarPanel(
helpText(""),

selectInput("var",
label = "Choose a variable to display",
choices = spdf$koppen,
selected = ""
),
sliderInput("range", "surface brûlée en hectare", min(spdf$V11), max(spdf$V11),
value = range(df$V11), step = 2500
),
sliderInput("range", "Nombre d'incendie", min(spdf$n), max(spdf$n),
value = range(df$V11), step = 2
),
checkboxInput("legend", "Show legend", TRUE)),
mainPanel(leafletOutput("map"))
)
)

server <- function(input, output, session) {
output$map <- renderLeaflet({
leaflet(df) %>%
addTiles(group = "OSM") %>%
addProviderTiles("Esri.NatGeoWorldMap", group="ESRI") %>%
addProviderTiles("CartoDB.DarkMatter", group= "CartoDB") %>%
#fitBounds(~min(long), ~min(lat), ~max(long), ~max(lat))%>%
addPolygons(
data=spdf,
fillColor = "viridis",
weight = 0,
opacity = 1,
color = "white",
dashArray = "3",
fillOpacity = 0.7)#%>%
setView(lat= 44.2, lng=5.9, zoom=5)
})
}
shinyApp(ui, server)

Sent with [ProtonMail](https://protonmail.com) Secure Email.
[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] create projection for a .asc file in Rstudio

2021-12-06 Thread Bede-Fazekas Ákos
Dear Basile,
the PROJ4 string has a really strict specification. You cannot use
crs(rasterClimat)<-"proj=longlat +datum=WGS84 +no_defs 
+ellps=WGS84+towgs84= 0,0,0"
but can use
crs(rasterClimat)<-"+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 
+towgs84=0,0,0"
(see the differences in the plus sign and the spaces)
HTH,
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2021.12.06. 17:53, basile.lenormand via R-sig-Geo wrote:
> Good evening every ones.
>
> I am trying to set a projection system for an ascii file in Rstudio.
>
> I managed to import the .asc into R and to plot it. Now I need to set a 
> projection.
>
> In my matrix I have CRS : NA
>
> I tried two ways :
>
> *First way:*
>
> crs(rasterClimat, asText=F)
> crs(rasterClimat) <- "+proj=lcc +lat_1=48 +lat_2=33 +lon_0=-100 +datum=WSG84"
> crs(rasterClimat)
> w <- wkt(rasterClimat)
> w
> cat(w, "\n")
>
> (taken from here: https://rdrr.io/cran/raster/man/projection.html)
>
> *Second way:*
>
> crs(rasterClimat) <- "proj=longlat +datum=WGS84 +no_defs +ellps=WGS84+towgs84= 0,0,0"
>
> (taken from here: https://www.youtube.com/watch?v=iqbVyt8mjIk, a 
> tutorial that I followed)
>
> For the two methods, the error message is the same :
>
> Error in sp::CRS(SRS_string = x) : NA
>
> I followed the YouTube tutorial; our two cases seem similar, but she 
> managed to project her map.
>
> May someone gives me some clues about what I do not understand in this 
> process and where I have to look for and/or what can I do to fix it?
> I want to do it in R; I do not want to use GIS software to set the 
> coordinate system and then import the file into R.
>
> I put in attachment the code I used if you want to have a close look, 
> and the .asc files.
>
> Thank you very much for your time,
> best regards,
>
> Basile.
>
>
>
>
>
>
> Sent with ProtonMail  Secure Email.
>
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] simple topography in r

2021-11-16 Thread Bede-Fazekas Ákos

Dear Ani,
Package 'raster' has a function called getData(), which can download 
SRTM tiles. Then you can simply plot() them.

Or you can use package 'elevatr'.
But I'm sure there are many other alternative solutions.
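For example (a sketch; the coordinates are made up, and getData() picks the SRTM 
tile that contains the lon/lat you pass):
library(raster)
dem <- getData("SRTM", lon = 135, lat = -25)
plot(dem)
# or, with package 'elevatr' (expects an sf/sp/raster object or a data.frame of locations):
# library(elevatr)
# dem2 <- get_elev_raster(locations = your_sf_object, z = 5)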
HTH,
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary


On 2021.11.16. 2:44, ani jaya wrote:

Dear r-sig-geo,

I need to plot a spatial map that shows topography/elevation. It is
just to show basic information, so I do not need a high-resolution
map. I have already googled, and most of the examples use ggplot. Is there a
straightforward way to do this in base R? Any lead is appreciated.
Thank you so much.

Best regards,
Ani

#area of interest
library("maps")
map('world', xlim = c(100, 160), ylim = c(-45, -5),
 lwd=0.5, col = "grey95", fill = T, interior = T)
map.axes()

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Cropping Large RasterStack

2021-07-15 Thread Bede-Fazekas Ákos

Hello Jesús,
Also, you can try terra::crop(); it might be faster and/or more 
memory-efficient.
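A sketch of what I mean (untested):
library(terra)
tmax <- rast(tmax)                      # convert the RasterStack to a SpatRaster
tmax <- crop(tmax, ext(0, 20, 40, 55))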

Have a nice week,
Ákos
---
Ákos Bede-Fazekas
Centre for Ecological Research, Hungary

On 2021.07.15. 10:04, Khan, Saeed Akhtar wrote:

Hi Rojo,

I think you should crop individual rasters through a loop and stack the results 
at the end.

Best,
Saeed

-Original Message-
From: R-sig-Geo  On Behalf Of Jesús Rojo
Sent: Thursday, July 15, 2021 9:55 AM
To: r-sig-geo@r-project.org
Subject: [R-sig-Geo] Cropping Large RasterStack

Hello everyone
I have a large RasterStack (8501485725 elements, 336.8 Mb) composed of 23750 
raster layers (0.1 degree resolution). I would like to do a crop based on an 
extent:

'tmax <- crop(tmax, extent(0, 20, 40, 55))’

It seems to be very time-consuming, and it even seems that my PC is not able to 
finish the crop after a long time. Do you know another efficient way to do this 
task without parallelizing?
I think that I did this task with large stacks in the past with ‘crop’ and it 
ran well, but now I do not know if it is possible. Thanks for your help. Kind 
regards

Jesús Rojo
___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] reading geopackage gpkg with readOGR

2021-04-13 Thread Bede-Fazekas Ákos

Dear Patrick,
please take a look at sf::st_layers(dsn), this gives you the name of, 
and basic information on, the layers found in the specific file/folder 
(dsn).
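For example (a sketch; replace the layer name with whatever st_layers() reports):
library(sf)
st_layers("./Shapes/trspixNarati_alt_UTM44.gpkg")
pts <- st_read("./Shapes/trspixNarati_alt_UTM44.gpkg",
               layer = "layer_name_reported_by_st_layers")  # hypothetical layer name
pts_sp <- as(pts, "Spatial")  # if you really need a SpatialPointsDataFrame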

HTH,
Ákos Bede-Fazekas

On 2021.04.13. 8:35, Patrick Giraudoux wrote:

Dear listers,

I did not find the way to open a geopackage using rgdal::readOGR.

The package name is 'trspixNarati_alt_UTM44.gpkg' (with two companion
files: 'trspixNarati_alt_UTM44.gpkg-shm' and
'trspixNarati_alt_UTM44.gpkg-wal') stored in a folder 'Shapes'

The corresponding object should be a SpatialPointsDataFrame.

I have tried several combinations (a bit blind) with

readOGR("./Shapes/trspixNaratiUTM44_alt_UTM44.gpkg")

readOGR("./Shapes/","trspixNaratiUTM44_alt_UTM44.gpkg")

readOGR("./Shapes/","trspixNaratiUTM44_alt_UTM44")

with no success...

This link https://jsta.github.io/2016/07/14/geopackage-r.html gives
indications on how to do it, but supposes that one knows the layer
name (which is not my case).

Any idea about the syntax of this driver ?

Best,

Patrick


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Coordinates tolerance ERROR

2021-02-10 Thread Bede-Fazekas Ákos

Hello Gafarou,

(as I answered your question previously in max...@googlegroups.com...) 
the error message means that the points stored in Locations.csv are not 
in a regular grid, i.e., the distances between them are not the same. 
Although you can increase the tolerance parameter to eliminate the error 
message, this error usually indicates a deeper problem with the data 
that can not be solved (just hidden) by the increased tolerance. Maybe 
the points form a regular grid in another projection, so before trying 
to cast your SpatialPointsDataFrame to SpatialGridDataFrame with 
gridded<-(), you should spTransform() it to the adequate CRS. Also you 
should check whether the data.frame 'ForMapping' contains all the needed 
X-Y pairs. If the following gives you FALSE, it may indicate the problem:
nrow(ForMapping) == length(unique(ForMapping$X)) * 
length(unique(ForMapping$Y))

I would also give a chance to raster::rasterFromXYZ().
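E.g., starting from the plain data.frame (before coordinates()<- is applied); an 
untested sketch that assumes the points really form a (nearly) regular grid:
library(raster)
r <- rasterFromXYZ(ForMapping[, c("X", "Y", "PredictedAbundance")],
                   crs = "+proj=longlat +datum=WGS84")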
Please send us a reproducible example (e.g. the original Locations.csv) 
to let us reproduce the error and understand its reason.


HTH,
Ákos Bede-Fazekas

On 2021.02.10. 14:38, Gafarou AGOUNDE wrote:

Hi,
I am facing this error when running the code below:

suggested tolerance minimum: 0.662921
Error in points2grid(points, tolerance, round) :
   dimension 1: coordinate intervals are not constant

What does this error mean and how to fix it?
In fact, the Locations.csv file contains just two columns respectively X and Y 
in decimal degrees (e.g. X=2.07656   Y=7.08511).
The error occurs when executing the last row.

Here is the code:
ForMapping <- read.csv("Locations.csv")
str(ForMapping)
length(PredictedAbundance) == length(ForMapping$X)
ForMapping$PredictedAbundance <- PredictedAbundance
coordinates(ForMapping) <- ~ X + Y
gridded(ForMapping) <- TRUE

Your help will be very appreciated..!

Sent from Mail for Windows 10


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Consolidate data

2021-02-03 Thread Bede-Fazekas Ákos

Hello Oulhaci,

How is your question related to the R statistical software? This mailing 
list is for R users/developers. If you do not want to solve the problem 
with/from R (e.g. RPyGeo) but with Python integrated into ArcGIS, you 
should ask your question on gis.stackexchange.com or on a Python- or 
ArcGIS-specific mailing list.


Have a nice week,
Ákos Bede-Fazekas

On 2021.02.03. 9:34, Oulhaci Yasmine via R-sig-Geo wrote:

Dear list,
I want to build two scripts with Python in ArcGIS Pro. The first one will 
consolidate (roll up) data, just like aggregation of spatial data. The second 
one is the opposite of consolidating (drilling down into data). For example, I 
have three hierarchy levels: country, state and district. These fields are 
geographic fields and I am able to plot them on three separate maps. The 
requirement is to create a top-down analysis such that when the user clicks on 
the country, the top-down analysis should take the user to the states and show 
all the states. Then when the user clicks on a state, the drilldown should 
bring the user to see the districts and show all the sub-districts.

Any suggestions would be very welcome.



[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Doubt about extract climate variables from raster.

2021-01-07 Thread Bede-Fazekas Ákos

Dear Pietro,

I think that, technically, your code is correct (although using a for 
loop is unnecessary, function extract() can deal with 
RasterStacks/RasterBricks and returns a matrix in this case). And, 
technically, also the following solutions would be correct:

- mean with na.rm,
- weighted mean without na.rm,
- weighted mean with na.rm.
Also, you should consider using the package 'exactextractr', which can quickly 
and accurately extract values over polygons while properly handling 
grid cells that are only partially covered by a polygon.
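A sketch of its use with your objects (untested; it expects an sf polygon object):
library(sf)
library(exactextractr)
shape_sf <- st_as_sf(shape2)                                  # convert the sp object to sf
weighted_means <- exact_extract(rasbrick, shape_sf, "mean")   # one column per layer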

The "best" solution depends on your data and your aim.

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

On 2021.01.07. 23:51, Pietro Andre Telatin Paschoalino wrote:

Hello everyone, I'm extracting climate variables (precipitation and 
temperature) from a grid (CRU data) to my shapefile.

To do this I'm using the function extract from the raster package.

In the extraction, I saw if the CRS are the same in the raster and in the shape 
(I change the crs of shape for the same as raster).

Initially, I tried to extract considering the weighted average of pixels with 
the centroid inside the polygon; as I got more missing values, I ended up opting 
for the simple average. Still, I got less than 1% of the values as missing 
(5 of 557 regions), which I completed using the value of the nearest neighbor.

I decided this because as I am not from this area I am trying to keep it as 
simple as possible so as not to make mistakes. I believe that to get the 
weighted average of some form I would have to change the resolution of the 
raster, or something like that.

I asked the opinion of a geographer and he said that the way I did it is 
correct.

In this way I am putting my code for extraction if someone can verify that it 
is correct, I would also like an opinion if the way that I extract is correct. 
This is very important for me and I'll appreciate if anyone can take a look.

If anyone can help, I can make the data available directly.

In general my code is:

setwd("mypath")
shape <- readOGR(dsn = ".", layer = "microfinal")

fn<-file.path("mypath\\cru_ts4.04.2011.2019.tmp.dat.nc")

ncfile <- nc_open(fn)
rasbrick <- stack(fn)

To get only the layers that I want:
rasbrick <- rasbrick[[70:81]]

changing to the same crs

shape2<- spTransform(shape, crs(rasbrick))

weath_dt = as.data.frame(matrix(NA,nrow(shape2@data),12))

First I got the ID regions of shape:

weath_dt[,1] = shape2@data[,1]

and extract:

for (i in 1:length(rasbrick@layers)) {
   weath_dt[,1+i] = raster::extract(rasbrick[[i]], shape2, mean)
}

After that I only pick the values for the missings by near neighbour.

Thank you.

Pietro Andre Telatin Paschoalino
Doctoral candidate in Economic Sciences, Universidade Estadual de Maringá - PCE.

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] st_segmentize across east and west hemispheres

2020-12-29 Thread Bede-Fazekas Ákos

Dear Amanda,

I think this unwanted, nearly horizontal segment is unavoidable if you 
simply plot() the results and you use a CRS such as 4326. Of course, you 
can st_transform() your sf object to e.g. Robinson or Miller projection 
before st_segmentize()-ing.
But this segment is correct in WGS-84 as well: its start and end points 
are in the correct place. It is only a visualization decision whether 
the start and end points are

1) connected with this unwanted line;
2) their connection is not displayed; or
3) 'connected' with two segments: one heading to the -180° longitude, 
the other one heading to the +180° longitude.
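For option 3), sf::st_wrap_dateline() can split the segmentized geometry at the 
±180° meridian; a sketch based on your example (untested):
seg_wrapped <- st_wrap_dateline(seg, options = "WRAPDATELINE=YES")
plot(st_geometry(seg_wrapped), axes = TRUE, col = "red")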


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

On 2020.12.29. 0:24, Amanda Rehbein wrote:

Dear r-sig-geo list,

I have a package called raytracing for calculating atmospheric Rossby wave
paths.
I need to get segments of the great circle or routes from some geographical
coordinates. st_segmentize is calculating them correctly. However, when I
need to connect two points in different hemispheres, east and west, it
creates an unwanted horizontal line, as shown in the following example. Is
it possible (and correct) to avoid or remove this horizontal line?


library(sf)
m <- rbind(c(100,-50),
c(-100,50))
sf <- st_sf(a=1,
geom=st_sfc(st_linestring(m)),
crs = 4326)
seg <- st_segmentize(sf, units::set_units(1000, km))
plot(seg, axes = TRUE, reset = FALSE, type = "p", pch = 16)
plot(seg$geom, add = TRUE, col = "red")
text(x = m[, 1], y = m[, 2] - 7, label = 1:2, col = "blue")

Many thanks.

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] sp:gIntersection warning message about projection

2020-10-21 Thread Bede-Fazekas Ákos

Hello Monica,

The format of CRS definitions has changed a lot in recent years. I have seen 
similar errors thrown by package sf, and transforming or modifying the CRS 
solved the problem. Does rgeos::gIntersection() work if you 
run one of the following lines before the intersection?

proj4string(pln) <- proj4string(buff)
pln <- spTransform(pln, proj4string(buff))
pln <- spTransform(x = pln, CRSobj = "+proj=utm +zone=16 +datum=WGS84 
+units=m +no_defs"); buff<- spTransform(x = buff, CRSobj = "+proj=utm 
+zone=16 +datum=WGS84 +units=m +no_defs")


Have a nice week,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

On 2020.10.21. 16:29, Monica Palaseanu-Lovejoy wrote:

  Hi,

I am using in my workflow gIntersection from sp package. Part of my
relevant sessionInfo is:
R version 4.0.3 (2020-10-10)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 17763)

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] CircStats_0.2-6 boot_1.3-25 MASS_7.3-53 stringr_1.4.0
[5] rgeos_0.5-5 maptools_1.0-2  rgdal_1.5-18raster_3.3-13
[9] sp_1.4-4


So I am using 2 SpatialLinesDataFrame objects that are results of previous
computation. I am not sure I can give a viable example, but maybe this can
be solved without one. If not, I will figure out how I can give you a workable
example. Or, if you know of 2 different line shapefiles with a projection that
are already available in R, then I will see if I can replicate this problem.

But the main point is that my workflow was fine in R 4.0.0 and the older sp
version, and after i upgraded to the newest R and updated all packages i am
getting the projection warning.

So, first file:

  buff
class   : SpatialLinesDataFrame
features: 1
extent  : 560525.5, 561302.4, 4698908, 4701110  (xmin, xmax, ymin, ymax)
crs : +proj=utm +zone=16 +datum=WGS84 +units=m +no_defs
variables   : 2
names   : ID, buff_dist
value   :  2,   135

second file:

pln
class   : SpatialLinesDataFrame
features: 1
extent  : 560615.6, 560705.4, 4698905, 4699180  (xmin, xmax, ymin, ymax)
crs : +proj=utm +zone=16 +datum=WGS84 +units=m +no_defs
variables   : 2
names   :x,y
value   : 560615.564407584, 4698904.71208346

Please observe that both files have identical projections.

p1 <- gIntersection(buff, pln)
Warning message:
In RGEOSBinTopoFunc(spgeom1, spgeom2, byid, id, drop_lower_td,
unaryUnion_if_byid_false,  :
   spgeom1 and spgeom2 have different proj4 strings

This warning is baffling since both files have same projection:
crs(buff)
CRS arguments:
  +proj=utm +zone=16 +datum=WGS84 +units=m +no_defs

crs(pln)
CRS arguments:
  +proj=utm +zone=16 +datum=WGS84 +units=m +no_defs

Probably i am doing something wrong but for the life of me i cannot see it
as yet.

Thanks so much for any help, take care,
Monica

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Execute Extract function on several Raster with points layer

2020-09-17 Thread Bede-Fazekas Ákos
Dear Gaëtan,
(cc r-sig-geo)
please post your mails in this topic to the mailing list.
I don't really know what 'my "tmax_filesnames" object is a "large 
character"' means. tmax_filenames is a typical character vector of 3660 
elements, so it should not cause any problem.
Anyway, the error message indicates that one or more of the file names 
are not correct. You should carefully check whether tmax_filenames was 
generated appropriately.
Best wishes,
Ákos

On 2020.09.17. 21:00, Gaetan Martinelli wrote:
> Hello again,
>
> The Stack function doesn't work because my "tmax_filesnames" object is 
> a "large character".
> Here is the error message I received after this line on my script :
> > tmax_raster <- raster::stack(tmax_filenames)
> Error in .local(.Object, ...) :
> Error in .rasterObjectFromFile(x, band = band, objecttype = 
> "RasterLayer",  :
>   Cannot create a RasterLayer object from this file. (file does not exist)
>
> How do I rectify this error?
> I transform my object Large Character?
>
> Thanks again Àkos.
>
> Gaëtan
>
> *Sent:* Thursday, 17 September 2020, 13:57
> *From:* "Bede-Fazekas Ákos" 
> *To:* r-sig-geo@r-project.org
> *Subject:* Re: [R-sig-Geo] Execute Extract function on several Raster 
> with points layer
> Hello Gaëtan,
>
> so as far as I understand, you have 3 main folders:
> "Max_T", ? and ?
> and in alll the three folders, there are subfolders
> "1961", "1962", ... "1970"
> In each folder, there are 366 raster files, for which the file naming
> conventions are not known by us, but some of the files are called
> "max1961_1.asc", "max1961_2.asc", ... "max1961_366.asc" (in case of
> T_max and year 1961)
>
> In this case, the 10980 layer that belongs to T_max can be read to one
> large RasterStack in this way:
> tmax_filenames <- c(outer(X = as.character(1:366), Y =
> as.character(1961:1970), FUN = function(doy, year) paste0("N:/400074
> Conservation des sols et CC/Data/Climate data/Climate-10km/Max_T/",
> year, "/max", year, "_", doy, ".asc")))
> tmax_raster <- stack(tmax_filenames)
>
> You can give self-explanatory names to the raster layers:
> names(tmax_raster) <- c(outer(X = as.character(1:366), Y =
> as.character(1961:1970), FUN = function(doy, year) paste0(year, "_", 
> doy)))
>
> But if the structure of the rasters are the same (i.e. the cell size,
> extent, projection), then I recommend you to do the raster-vector
> overlay once, save the cell numbers that you are interested in, and then
> in nested for loops (one loop for the climate variable, one for the year
> and one for the day) read the rasters one-by-one, extract the values
> according to the cell numbers, and save the result in a previously
> created data.frame. In this way, you may not encounter memory issues.
> Although, it will take a lot of time...
>
> HTH,
> Ákos Bede-Fazekas
> Hungarian Academy of Sciences
>
> On 2020.09.17. 19:28, Gaetan Martinelli wrote:
> > Hello everyone, R team,
> >
> > Sorry in advance for this long message. Your help will be invaluable.
> >
> > For a few days now i have been blocked to execute a task on R. I will
> > try to synthesize my problem.
> >
> > I have several raster. I have an ASCII file for each day of a year
> > with a single band. For 30 years, and for three climatic variables on
> > grid 10km/10km (T_min, T_max, Precipitation). So i have a total around
> > of 32 940 raster files (366days*30years*3variables).
> >
> > Also, i have a layer of aroud 1000 points.
> >
> > I tried to use the Stack function and then make the intersection for
> > each raster files with my 1000 points.
> > I cannot create an independent matrix for all my files where i applied
> > the "extract" function, to then concatenate all my matrices in order
> > to have a single table.
> >
> > I tried this, exemple for 10 years et only T_Max (my files are
> > organized the same for my two other variables)  :
> > *#Datapoints*
> > Datapoints<-readOGR(dsn="H:/Inventaire/R/final",
> >                layer="Centroid_champs")
> > Datapoints<- spTransform (Datapoints, CRS ("+init=epsg:4326") ) # 1022
> > points in the data
> > st_crs(Datapoints)
> > *#Rasters files*
> > folders = list(
> >   file.path('N:','Data','Climate data','Climate-10km','Max_T','1961'),
> > #Each year includes daily data, the names of my several raster is
> > "max1961_1", "max1961_10", "max1961_10

Re: [R-sig-Geo] Execute Extract function on several Raster with points layer

2020-09-17 Thread Bede-Fazekas Ákos
Hello Gaëtan,

so as far as I understand, you have 3 main folders:
"Max_T", ? and ?
and in all three folders, there are subfolders 
"1961", "1962", ... "1970"
In each folder, there are 366 raster files, for which the file naming 
conventions are not known by us, but some of the files are called
"max1961_1.asc", "max1961_2.asc", ... "max1961_366.asc" (in case of 
T_max and year 1961)

In this case, the 10980 layer that belongs to T_max can be read to one 
large RasterStack in this way:
tmax_filenames <- c(outer(X = as.character(1:366), Y = 
as.character(1961:1970), FUN = function(doy, year) paste0("N:/400074 
Conservation des sols et CC/Data/Climate data/Climate-10km/Max_T/", 
year, "/max", year, "_", doy, ".asc")))
tmax_raster <- stack(tmax_filenames)

You can give self-explanatory names to the raster layers:
names(tmax_raster) <- c(outer(X = as.character(1:366), Y = 
as.character(1961:1970), FUN = function(doy, year) paste0(year, "_", doy)))

But if the structure of the rasters is the same (i.e. the cell size, 
extent, projection), then I recommend doing the raster-vector 
overlay once, saving the cell numbers that you are interested in, and then, 
in nested for loops (one loop for the climate variable, one for the year 
and one for the day), reading the rasters one by one, extracting the values 
according to the cell numbers, and saving the result in a previously 
created data.frame. In this way, you should not encounter memory issues, 
although it will take a lot of time... A sketch of this loop is below.
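A rough sketch of that loop for T_max only (untested; it assumes that all rasters 
share the same grid and that the points are in the rasters' CRS; the rbind() is 
slow but keeps the sketch simple):
library(raster)
template <- raster(tmax_filenames[1])
cells <- cellFromXY(template, Datapoints)   # cell numbers of your ~1000 points
tmax_table <- data.frame()
for (year in 1961:1970) {
  for (doy in 1:366) {
    f <- paste0("N:/400074 Conservation des sols et CC/Data/Climate data/Climate-10km/Max_T/",
                year, "/max", year, "_", doy, ".asc")
    r <- raster(f)
    tmax_table <- rbind(tmax_table,
                        data.frame(point = seq_along(cells), Year = year, Day = doy,
                                   T_Max = r[cells]))
  }
}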

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

On 2020.09.17. 19:28, Gaetan Martinelli wrote:
> Hello everyone, R team,
>
> Sorry in advance for this long message. Your help will be invaluable.
>
> For a few days now i have been blocked to execute a task on R. I will 
> try to synthesize my problem.
>
> I have several raster. I have an ASCII file for each day of a year 
> with a single band. For 30 years, and for three climatic variables on 
> grid 10km/10km (T_min, T_max, Precipitation). So i have a total around 
> of 32 940 raster files (366days*30years*3variables).
>
> Also, i have a layer of aroud 1000 points.
>
> I tried to use the Stack function and then make the intersection for 
> each raster files with my 1000 points.
> I cannot create an independent matrix for all my files where i applied 
> the "extract" function, to then concatenate all my matrices in order 
> to have a single table.
>
> I tried this, exemple for 10 years et only T_Max (my files are 
> organized the same for my two other variables)  :
> *#Datapoints*
> Datapoints<-readOGR(dsn="H:/Inventaire/R/final",
>                layer="Centroid_champs")
> Datapoints<- spTransform (Datapoints, CRS ("+init=epsg:4326") ) # 1022 
> points in the data
> st_crs(Datapoints)
> *#Rasters files*
> folders = list(
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1961'), 
> #Each year includes daily data, the names of my several raster is 
> "max1961_1", "max1961_10", "max1961_100", etc...
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1962'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1963'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1964'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1965'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1966'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1967'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1968'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1969'),
>   file.path('N:','Data','Climate data','Climate-10km','Max_T','1970')
> )
> files = unlist(sapply(folders, function(folder) {
>   list.files(folder, full.names=TRUE)
> }))
> files
>
> MET <- lapply(files, raster)
> s <- raster::stack(MET)
>
> output <- list()
> for(i in 1:length(MET)){
>   output[[i]] <- extract(s, Datapoints)
>   names(output)[[i]] <- paste("Année", MET[i], sep = "_")
> }
> Also, i tried that :
> p1 <- 1022 (ID of my DataPoints) ; p2 <- 1 (column where there are the 
> values ​​extracted from my raster) ; p3 <- 3660      # 3660matrix (366 
> day* 10 years)
> matlist <- list(array(NA,c(p1,p2,p3)))  # doing a list of independant 
> matrix
>
> for(i in seq_along(MET)){
>
>   matlist[[i]] <- extract(s, Datapoints)
> }
> But, nothing works...
> I would like my script to perform these actions :
> - For each Raster in my Rasterstack, extract the climatic data values 
> ​​and link them to my "Datapoints",
> - Take the name of my file, take the first three characters of the 
> name to get a column of my weather variable, here, "T_Max" (column 
> with my raster values) ; Take the following four characters then 
> report this information in a new column "Year", and finally, take the 
> last characters of the file name to create a new column "Day".
> - Concatenate all the independent output matrices corresponding to 
> each intersection made with my different raster files
> In the end, 

Re: [R-sig-Geo] Stack of several Raster

2020-09-14 Thread Bede-Fazekas Ákos
Dear Gaëtan,

"order of appearance of my different columns corresponding to my 366 
files (1 year) does not make any sense": do you mean that the order of 
the columns in TMAX60 is not what you want? The column order should 
match the order of the layers in your RasterStack. You can check it with:
names(s)
The order of the layers will correspond to the list of file names stored 
in fs. So you should arrange the file names in the appropriate order, 
and then you'll get the columns in TMAX60 in the desired order. Maybe 
list.files() is not the best function to use. If the file names follow a 
regular pattern, then you can simply create the character vector of the 
file names with paste() or paste0(), for example as sketched below.
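Something like this (a sketch; I am guessing the exact file name pattern from 
your mail):
fs <- paste0("N:/Climate-10km/Max_T/1960/max1960_", 1:366, ".asc")
s <- raster::stack(fs)
TMAX60 <- extract(s, DataPoints)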

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


On 2020.09.14. 18:40, Gaetan Martinelli wrote:
> Good morning all,
>
>
> I would need some advice to move forward.
> I'm starting to use Rstudio to manipulate a climate data and intersect 
> with a differents points on my map (table of 1000 points with 6 
> attributes).
> Also, I have several raster. I have an ASCII file for each day of a 
> year. For 30 years.
>
>
> I tried to use the Stack function and then make the intersection with 
> my 1000 points for one year :
> fs <- list.files(path="N:/Climate-10km/Max_T/1960",
>                  pattern = "asc$", full.names = TRUE)
> s <- raster::stack(fs)
> TMAX60 <- extract(s, DataPoints)
> I'm stuck just at this part ...
> My intersection is good, but the order of appearance of my different 
> columns corresponding to my 366 files (1 year) does not make any 
> sense. They are out of order.
>
> My goal is to create a table with my 6 points attributes, and add a 
> "Year" column and a "Days" column and finally a "Pixel values" column. 
> For a year at first time and if possible doing this intersection for 
> all my years.
> I had the idea to do this for each year and then join my different 
> produced tables.
> In the end, I would have a huge table, but one that will allow me to 
> do my analysis. By calculating this, I will get:
> - 9 attributes (6 point attributes, Year, Day, TMax) and 10 980 000 
> lines (30 years (past) * 366 days * 1000 points).
> Does this seem like a good approach to you ?
> Do you have an idea that could allow me to do this task without going 
> through multiple table operations with "Dplyr" ?
>
> Do you have any advice for a better approach I could take?
>
> Sorry if my English is not always appropriate. I'm not very used to 
> talking in this language yet. I'm working on it. :)
>
> Thank you very much in advance to those who can help me.
>
> Have a good day.
>
> Gaëtan.
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Spatial correlation between two 'sf' kriging objects

2020-08-17 Thread Bede-Fazekas Ákos

Dear Nicola,
you are right, the class of the result of st_distance is not numeric but 
units (if CRS != NA). You may convert it to numeric with as.numeric().
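I.e., something like this (a sketch, assuming 'grid' and 'distance_threshold' are 
the objects from my previous mail):
distance_matrix <- st_distance(grid)
distance_matrix <- matrix(as.numeric(distance_matrix),
                          nrow = nrow(distance_matrix))  # strip the units, keep the matrix shape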

HTH,
Ákos

On 2020.08.15. 23:15, Nicola Gambaro wrote:

Dear Ákos,

Thank you very much for your help.

I ran your code but unfortunately it returns the error:

Error in Ops.units(distance_matrix[as_units(point_number), , drop = TRUE],  :
both operands of the expression should be "units" objects

Any ideas how to fix this?

Cheers!

Nicola Gambaro
BSc Environmental Geoscience, First Class
Durham University


Message: 1
Date: Fri, 14 Aug 2020 14:01:43 +0200
From: =?UTF-8?Q?Bede-Fazekas_=c3=81kos?= 
To: r-sig-geo@r-project.org
Subject: Re: [R-sig-Geo]  Spatial correlation between two 'sf' kriging
objects
Message-ID: <043f13cc-20e1-a768-3419-03301f8d8...@gmail.com>
Content-Type: text/plain; charset="utf-8"; Format="flowed"

Dear Nicola,

Instead of raster::focal(), you can apply the cor() function in
combination with st_distance() (or st_buffer() and st_within()). E.g.
let's say that 'grid' is a POINT type sf object containing columns
'temperature' and 'yield':
distance_threshold <- 40
distance_matrix <- st_distance(grid)
grid$correlation <- vapply(X = 1:nrow(grid), FUN.VALUE = numeric(1), FUN
= function (point_number) {cor(x = st_set_geometry(grid,
NULL)[distance_matrix[point_number, distance_matrix[point_number, , drop
= TRUE] < distance_threshold , drop = TRUE], "temperature"], y =
st_set_geometry(grid, NULL)[distance_matrix[point_number,
distance_matrix[point_number, , drop = TRUE] < distance_threshold , drop
= TRUE], "yield"])})

Or something like this... I have not tested this code, and am sure that
it is not the most efficient solution.

For large, square grids, raster might be faster than sf. You can convert
your grid to RasterLayer with function rasterFromXYZ() combined with
st_coordinates(), st_set_geometry(, NULL) and cbind.data.frame(). There
might be more straightforward solutions for the conversion...

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


On 2020.08.13. 20:11, Nicola Gambaro wrote:

I have created two ‘sf’ kriging objects (point vectors), one for temperature 
and another for agricultural yields. To make the grid and carry out the point 
interpolation, I have remained within the ‘sf’ package.

I would now like to create a spatial local correlation ‘raster’ between these two 
variables, as shown on this webpage 
https://statnmap.com/2018-01-27-spatial-correlation-between-rasters/ 
. 
However, in that example, they use the ‘raster’ package and the ‘focal’ function. I 
was wondering if there was a way of doing this within ‘sf’, i.e. without having to 
change classes? If not, what is the best way to convert those objects into raster 
classes?

Here is an excerpt of my kriging code for reference:
library(sf)
sf_data <- st_as_sf(x = data, coords = c("longitude", "latitude"), crs = 4326)
library(gstat)
vgm_utci <- variogram(UTCI~1, sf_data)
utci_fit <- fit.variogram(vgm_utci, vgm("Gau"), fit.kappa = TRUE)
plot(vgm_utci, utci_fit)
istria <- read_sf(“./Istria_Boundary.shp")
istria <- istria$geometry
istria.grid <- istria %>%
   st_make_grid(cellsize = 0.05, what = "centers") %>%
   st_intersection(istria)
library(ggplot2)
ggplot() + geom_sf(data = istria) + geom_sf(data = istria.grid)
library(stars)
utci_krig <- krige(formula = sf_data$UTCI ~ 1, locations = sf_data,
newdata = istria.grid, model = utci_fit)


Thank you very much in advance,

Nicola
[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo




Re: [R-sig-Geo] Image reprojection and resampling

2020-07-10 Thread Bede-Fazekas Ákos

Dear Enoch,

Without a reproducible example it is hard to guess what your data looks 
like and what causes the unexpected results.
Anyway, if you are sure that the raster object called "Trying" has the 
correct CRS and resolution, then this may solve your problem:

Sentinel <- projectRaster(from = Sentinel, to = Trying)
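
If your final goal is a 50 m grid, a template raster may help (untested
sketch; 'template' and 'Sentinel50' are my own names):
template <- raster(Sentinel)   # empty raster with the extent and CRS of Sentinel
res(template) <- 50
Sentinel50 <- resample(Sentinel, template, method = "bilinear")
# aggregate() expects an integer factor, so going from 20 m to 50 m is
# easier with resample()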

Have a nice weekend,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2020.07.10. 12:46 keltezéssel, Enoch Gyamfi Ampadu írta:

Dear list,

Please I am working with a Sentinel 2 image of 20 m resolution. I have
tried to reproject and aggregate the resolution to 50 m. But after doing
the reprojection, I get the resolution in decimals and I am not sure what
factor to use for the escalation.

I tried another means to reproject with an existing image of similar
resolution 20 m using this code; Sentinel <- projectRaster(Sentinel, crs =
crs(Trying), res = 20) and
Sentinel <- projectRaster(Sentinel,crs = crs(Trying), keepres= TRUE)
After the process, I get an empty image when I plot.

I will be glad if I could have some insights on how to deal with these
issues of the reprojection and aggregation/resampling.

Best regards,

Enoch



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Error when saving an sf (data) object to file as a shapefile

2020-06-20 Thread Bede-Fazekas Ákos

Dear Lom,

Since you did not provide a reproducible example, I can only guess...
You used merge(, all.y = TRUE). This might produce an sf object of type 
GEOMETRY and some "GEOMETRYCOLLECTION EMPTY" in the geometry column, if 
some of the key values of y (balance5) are not found in the key values 
of x (parcel1). The ESRI shapefile format cannot accept mixed geometries, 
which might be the reason that you cannot write the dataset to shp format.
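
You could check and work around this as follows (untested sketch, using
your object name 'mergedparcel'):
table(st_geometry_type(mergedparcel))  # any EMPTY or mixed geometry types?
mergedparcel2 <- mergedparcel[!st_is_empty(mergedparcel), ]
st_write(mergedparcel2, "mergedparcel.shp")
# or write to a format that accepts mixed and empty geometries:
st_write(mergedparcel, "mergedparcel.gpkg")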


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2020.06.20. 10:43 keltezéssel, Lom Navanyo írta:

Hello,

I have had to merge a shapefile that I read into R as an sf object with a
.csv data containing  some variables. Now I want to save the merged data to
a file (a folder on my pc). I am however getting following error:

Error in CPL_write_ogr(obj, dsn, layer, driver,
as.character(dataset_options),  :
   Write error

Below is a snippet of code used:
library(sf)
library(dplyr)
library(ggplot2)
library(stringr)
library(rgdal)
library(sp)

parcel1 <- st_read("parcels_all.shp")
balance5 <- read.csv("Balanced_5.csv")

mergedparcel <- merge(parcel1, balance5, by=c('PARCEL_ID','CAL_YEAR'),
all.x = FALSE, all.y=TRUE)

st_write(mergedparcel,"mergedparcel.shp")

I also used the shapefile function thus:

shapefile(mergedparcel , "D:/Documents/mergedparcel.shp")
  This also gives me:
Error in shapefile(mergedparcel, "D:/Documents/
Documents/mergedparcel.shp") :
   could not find function "shapefile"

Am I doing this right?
Any suggestion to resolve this issue would be appreciated.

-
Lom

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] ggplot grids polygons, points and lines with sperate legens

2020-05-28 Thread Bede-Fazekas Ákos

Hello Harry,
it is not the final solution, just a part of it:

fill_levels <- unique(c(polys$Fill, pts$Value))
polys$Fill <- factor(x = polys$Fill, levels = fill_levels)
pts$Value <- factor(x = pts$Value, levels = fill_levels)
ponts$Value <- as.factor(ponts$Value)
ggplot()+
    geom_tile(data=pts, aes(x=X,y=Y,fill=Value), show.legend=F) +
    geom_sf(data=polys, aes(fill=Fill), alpha=0.5, 
size=2,show.legend="polygon")+

    geom_sf(data=lins, aes(color=Line),show.legend="line", size=5) +
    geom_sf(data=ponts, aes(shape=Value), size=10, show.legend="point") +
    scale_fill_manual("Grid",values=c(ptscols,polycols)) +
    scale_color_manual("Lines",values=lncols) +
scale_shape_manual("Points",values=as.numeric(levels(ponts$Value))) +
    guides(fill = guide_legend(override.aes = list(linetype = 0, shape 
= 0)), color = guide_legend(override.aes = list(shape = 0, linetype = 
"solid", size=5)), shape = guide_legend(override.aes = list(linetype = 
0, size=10)))


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2020.05.28. 17:10 keltezéssel, Herr, Alexander (L, Black Mountain) írta:

Hi,

I am unsuccessfully trying to adjust the legend for a multiple feature plot of 
geom_tile, and geom_sf (with polygons, lines and points).  I would like to have 
1) the geom_tile legend separate from the geom_sf polygon legend and also 2) 
have the legend descriptors ordered descending and 3) each legend for polygon, 
line and point should have the respective pattern/color, that is rectangle for 
polygons, lines for lines and shapes for points.

Reproducible code below. Any suggestions?
Thanks
Herry


require(sf)
require(sp)

pols<-st_as_sf(spPolygons(rbind(c(1,1),c(1,10),c(10,10),c(10,1),c(1,1
p1 =st_sfc((st_polygon(list(rbind(c(0,0), c(0,10), c(10,10), c(10,0), 
c(0,0))
pts<-as.data.frame(st_coordinates(st_make_grid(p1,cellsize=1,what='centers')) 
)%>%mutate(Value=unlist(lapply(1:10,function(x) sample.int(10,n=3, replace=T))) )
lins<-st_as_sf(rbind(spLines(rbind(c(2,3),c(9,9))),spLines(rbind(c(2,7),c(8,1)%>%
   transmute(Line=c("6","7"))
p2<-st_as_sf(spPolygons(rbind(c(1,1),c(9,9),c(9,4),c(1,1
p3<-st_as_sf(spPolygons(rbind(c(1,1),c(1,7),c(7,7),c(7,5),c(1,1
polys<-st_as_sf(rbind(p2,p3))%>%transmute(Fill=c("4","5"))
ponts<-st_as_sf(pts,coords=c("X","Y") )

ptscols<-c("salmon","light blue","orange")
polycols<-c("dark red","aquamarine")
lncols<-c("deepskyblue","gold")

ggplot()+geom_tile(data=pts, aes(x=X,y=Y,fill=as.character(Value)), 
show.legend=T )  +
  geom_sf(data=polys, aes(fill=as.character(Fill)), alpha=0.5, 
size=2,show.legend=T)+
  geom_sf(data=lins, aes(color=Line),legend=T, size=5) +
  geom_sf(data=ponts, aes(shape=as.character(Value)), size=10, 
show.legend=T)+
  scale_fill_manual("Grid",values=c(ptscols,polycols),
   breaks = c(as.character(unique(pts$Value)),polys$Fill),
 guide = guide_legend(overrid.aes=list(linetype = NULL, 
shape=NULL))) +
   scale_color_manual("Lines",values=lncols, breaks = lins$Line,
 guide = guide_legend(override.aes = list(linetype = 
"solid", size=5)))   +
   
scale_shape_manual("Points",values=unique(as.character(ponts$Value)), breaks = 
ponts$Value,
 guide = guide_legend(override.aes = list(size=10)))


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Comparing 2 rasterbricks using functions from TSdist package

2020-05-28 Thread Bede-Fazekas Ákos

Dear Jackson,

I think this is what you are searching for:
new_s3 <- setValues(r, sapply(1:ncell(r), function(i) 
fun(as.vector(s[i]), as.vector(s2[i]))))

plot(new_s3, colNA = "red")

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences



2020.05.28. 0:18 keltezéssel, Jackson Rodrigues írta:

Dear all,

Few days ago I got a important help from this list and I am really thankful
for that.
Today I have a similar challenge with 2 rasterbrick files.

I would like to compare 2 rasterbricks using any similarity function (let's
take CorDistance) from package TSdist.
In my real data, both rasterbricks came from the same region but using
different measurement techniques.

I am trying to adapt the suggestion I got from here rather than looping on
each cell, but no matter what I try I always get an error related to NA.
It seems to be very simples but I am stuck since last Monday.

I have played with several conditional ("ifelse", "if", "if else" etc) and
logical tests ("|", "||", "&" etc) but I got nothing so far.

Could someone help me and show me a path to go through?
Below there is the cleanest code I have tried.

Best regards

Jackson

##
library(raster)
library(TSdist)

set.seed(12)
r <- raster(nrow=10, ncol=10)
s <- lapply(1:200, function(i) setValues(r, rnorm(ncell(r),
   sample.int(5,1), 0.5)))
s <- stack(s)
s[s < 0] <- NA
s2<-s^2

# Visualize that some pixels have NAs and other don't
hasna <- stackApply(s, indices = 1, fun = function(x, na.rm){anyNA(x)})
hasna2 <- stackApply(s2, indices = 1, fun = function(x, na.rm){anyNA(x)})

par(mfrow=c(1,2))
plot(hasna);plot(hasna2)

# CorDistance function does not handle NAs
CorDistance(as.vector(s[1]),as.vector(s2[1]))
[1] 0.2192382

CorDistance(as.vector(s[2]),as.vector(s2[2]))

   [1] NA

fun <- function(x,y){
   out <- ifelse(anyNA(x) | anyNA(y),
 yes = NA,
 no = unname(CorDistance(x,y)))
   return(out)
}

# Check that it works
fun(as.vector(s[1]),as.vector(s2[1]))
[1] 0.2192382

# Check that it does not work
fun(as.vector(s[2]),as.vector(s2[2]))
[1] NA

new_s1 <- fun(s,s2)


# Check that it does not work with calc
new_s2 <- calc(s,s2, fun = fun)
Error in .calcTest(x[1:5], fun, na.rm, forcefun, forceapply) :
   cannot use this function

plot(s_out)
#



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Field names abbreviated for ESRI Shapefile driver

2020-04-19 Thread Bede-Fazekas Ákos

Dear Milu,
The ESRI shapefile format cannot contain field names longer than 10 
characters, so you should do one of the following (see the example below):

- choose another file format when saving the data,
- rename the columns of the sf object before saving.
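
For example (untested sketch; the renaming pattern is only an illustration):
# option 1: GeoPackage keeps the long field names
st_write(weight.sf, "weight.gpkg")
# option 2: shorten the names yourself before writing the shapefile
names(weight.sf) <- sub("^weight_", "w", names(weight.sf))
st_write(weight.sf, "weight.shp", driver = "ESRI Shapefile")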

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2020.04.19. 17:37 keltezéssel, Miluji Sb írta:

Greetings,

i am trying to export a shapefile using the following command;

st_write(weight.sf, "weight.shp, driver="ESRI Shapefile")

But getting the following warning;

1: In abbreviate_shapefile_names(obj) :
   Field names abbreviated for ESRI Shapefile driver

The variable names have the following pattern; "weight_2010" but
become like this "w__201". How can I preserve field/variable names? Thank
you.

Best regards,

Milu

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] indexing multi-layer rasters

2020-01-17 Thread Bede-Fazekas Ákos



Dear Ben,
Although I cannot answer your question on why logical subsetting was not 
implemented in package raster, there is a very easy workaround:

logo[[(1:nlayers(logo))[c(TRUE, TRUE, TRUE)]]]
logo[[(1:nlayers(logo))[c(TRUE, FALSE, TRUE)]]]

Also note that in case of lists '[[' does recursive indexing, and this 
type of logical indexing you are asking about works only with '['.

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2020.01.16. 17:50 keltezéssel, Ben Tupper írta:

Hi,

I usually avoid using logical indexes with multilayer rasters (stacks
and bricks), but I have never understood why indexing ala `[[` with
logicals isn't supported.  Below is an example that shows how it
doesn't work like the typical indexing with logicals.  I'm not asking
to have logical indices considered (although it would be handy), but
instead I really just want to understand why it is the way it is.  I
scanned over the introductory vignette (https://rspatial.org/raster)
and issues (https://github.com/rspatial/raster/issues) but found
nothing there.

Thanks and cheers,
Ben

### START
library(raster)

logo <- raster::brick(system.file("external/rlogo.grd", package="raster"))

# red-red-red
logo[[c(TRUE, TRUE, TRUE)]]
# class  : RasterStack
# dimensions : 77, 101, , 3  (nrow, ncol, ncell, nlayers)
# resolution : 1, 1  (x, y)
# extent : 0, 101, 0, 77  (xmin, xmax, ymin, ymax)
# crs: +proj=merc +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
# names  : red.1, red.2, red.3
# min values : 0, 0, 0
# max values :   255,   255,   255

# red-red
logo[[c(TRUE, TRUE)]]
# class  : RasterStack
# dimensions : 77, 101, , 2  (nrow, ncol, ncell, nlayers)
# resolution : 1, 1  (x, y)
# extent : 0, 101, 0, 77  (xmin, xmax, ymin, ymax)
# crs: +proj=merc +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
# names  : red.1, red.2
# min values : 0, 0
# max values :   255,   255

# red
logo[[c(TRUE)]]
# class  : RasterLayer
# band   : 1  (of  3  bands)
# dimensions : 77, 101,   (nrow, ncol, ncell)
# resolution : 1, 1  (x, y)
# extent : 0, 101, 0, 77  (xmin, xmax, ymin, ymax)
# crs: +proj=merc +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
# source : /usr/lib64/R/library/raster/external/rlogo.grd
# names  : red
# values : 0, 255  (min, max)

logo[[c(TRUE, FALSE, TRUE)]]
#Error in .local(x, ...) : not a valid subset


#sessionInfo()
# R version 3.5.1 (2018-07-02)
# Platform: x86_64-redhat-linux-gnu (64-bit)
# Running under: CentOS Linux 7 (Core)
#
# Matrix products: default
# BLAS/LAPACK: /usr/lib64/R/lib/libRblas.so
#
# locale:
#   [1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8
# [5] LC_MONETARY=en_US.UTF-8LC_MESSAGES=en_US.UTF-8
LC_PAPER=en_US.UTF-8   LC_NAME=C
# [9] LC_ADDRESS=C   LC_TELEPHONE=C
LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
#
# attached base packages:
#   [1] stats graphics  grDevices utils datasets  methods   base
#
# other attached packages:
#   [1] raster_3.0-7 sp_1.3-2
#
# loaded via a namespace (and not attached):
#   [1] compiler_3.5.1   rgdal_1.4-8  tools_3.5.1  yaml_2.2.0
  Rcpp_1.0.3   codetools_0.2-15
# [7] grid_3.5.1   lattice_0.20-35

### END






___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] how to get the value of a pixel and its 8 surrounding pixels from points?

2019-11-05 Thread Bede-Fazekas Ákos

Dear Cristabel,
function focal() of package raster is what you are searching for.
focal(x, w = matrix(1/9, ncol = 3, nrow = 3), fun = sum)
or
focal(x, w = matrix(1, ncol = 3, nrow = 3), fun = mean)
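
Combined with your NetCDF file and coordinates, it could look like this
(untested sketch; "yourvariable" is a placeholder for the variable name
stored in the file):
r <- raster(file, varname = "yourvariable")
r_mean3x3 <- focal(r, w = matrix(1, ncol = 3, nrow = 3), fun = mean, na.rm = TRUE)
extract(r_mean3x3, cbind(18.4467, 52.5935))  # note the lon, lat order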

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.11.05. 17:10 keltezéssel, "Cristabel Durán Rangel" írta:


I need to get the value of a pixel (I have coordinates for this 
pixels) and its 8 surrounding pixel from the pixel/point. I am working 
with NetCDF files. I am working with R.


This is my code till now:

nc <- nc_open(file, readunlim=FALSE)

mylon <- ncvar_get(nc,"lon")
mylat <- ncvar_get(nc,"lat")

my coordinates in real-world: lat 52.5935 lon 18.4467

lat.coor <-mylat[which.min(abs(mylat-52.5935))]
lon.coor <- mylon[which.min(abs(mylon-18.4467))]

var <- nc[lon.coor, lat.coor, ]

In var are the values for my point. But I also need the values of the 
8 surrounding pixels to get an average.


Thanks.


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Plotting median values globally

2019-10-29 Thread Bede-Fazekas Ákos

Hello,
It is because of the misplaced closing bracket. This might work, since 
margin, at, cuts, etc. are parameters of levelplot(), not raster():
levelplot(raster(RCP1pctCO2Median,61), 
margin=F,at=cutpoints,cuts=11,pretty=TRUE,par.settings=mapTheme, 
main="Median values")

HTH,
Ákos

2019.10.29. 17:38 keltezéssel, rain1290--- via R-sig-Geo írta:

Hi Vijay and others,
Thank you so, so much for this reply!! Yes, this worked, just as you suggested. 
Just one little detail that I wanted to know - I was trying to change the color 
range of my plot, like this:
mapTheme<-rasterTheme(region=rev(brewer.pal(10,"Spectral")))
cutpoints<-c(200,175,150,125,100,75,50,25,0)

levelplot(raster(RCP1pctCO2Median,61, 
margin=F,at=cutpoints,cuts=11,pretty=TRUE,par.settings=mapTheme, main="Median 
values"))




But I receive this error:
Error in .local(x, ...) :
   unused arguments (margin = F, at = cutpoints, cuts = 11, pretty = TRUE, par.settings = 
mapTheme, main = "Median values")
What could be the reason behind this error?
Thank you, once again!!
~Trav.~
-Original Message-
From: Vijay Lulla 
To: rain1290 
Cc: R-sig-geo Mailing List 
Sent: Tue, Oct 29, 2019 9:02 am
Subject: Re: [R-sig-Geo] Plotting median values globally

Did you try `levelplot(raster(RCP1pctCO2median, 61))`?

On Tue, Oct 29, 2019 at 12:17 AM rain1290--- via R-sig-Geo 
 wrote:

Hi there,
I am trying to create a global plot with median precipitation values. What I would like to do is 
isolate the "median" values and plot those values on a global raster map. I already first 
created a raster stack, called "RCP1pctCO2median", which contains the median values to be 
plotted. However, what I would like to do is simply plot only those values at Year 61 for each grid 
cell. There are 138 layers, with each layer representing one year. The idea would be to plot only 
those values for Year/layer 61 globally (i,e, for each grid cell), but I am unsure how to isolate 
that specific year/layer?
"RCP1pctCO2median"  has the following attributes:

class       : RasterStack
dimensions  : 64, 128, 8192, 138  (nrow, ncol, ncell, nlayers)
resolution  : 2.8125, 2.789327  (x, y)
extent      : -181.4062, 178.5938, -89.25846, 89.25846  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
names       :    layer.1,    layer.2,    layer.3,    layer.4,    layer.5,    
layer.6,    layer.7,    layer.8,    layer.9,   layer.10,   layer.11,   
layer.12,   layer.13,   layer.14,   layer.15, ...
min values  : 0.42964514, 0.43375653, 0.51749371, 0.50838983, 0.45366730, 
0.53099146, 0.49757186, 0.45697752, 0.41382199, 0.46082401, 0.45516687, 
0.51857087, 0.41005131, 0.45956529, 0.47497867, ...
max values  :   96.30350,  104.08584,   88.92751,   97.49373,   89.57201,   
90.58570,   86.67651,   88.33519,   96.94720,  101.58247,   96.07792,   
93.21948,   99.59785,   94.26218,   90.62138, ...


And this is how I would like to create my global raster map. However, I am not 
sure how to plot only one value (i.e. the median value, which occurs at Year 
61) for each grid cell. This is what I have so far, but all that this does is 
plot 138 mini plots, and I just want one plot showing the median values (i.e. 
Year 61):


mapTheme<-rasterTheme(region=rev(brewer.pal(10,"Spectral")))
cutpoints<-c(200,175,150,125,100,75,50,25,0)
levelplot(RCP1pctCO2median,margin=F,at=cutpoints,cuts=11,pretty=TRUE,par.settings=mapTheme,
 main="Median values")



Any help would be greatly appreciated!
I look forward to your response!
         [[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo




[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Variable importance bar plot

2019-10-29 Thread Bede-Fazekas Ákos

Dear Enoch,

randomForest::varImpPlot() has a parameter called 'class' that is NULL by 
default but can be changed to get the class-specific measure of importance.
'geom = "bar"': do you want to plot the result using ggplot2 instead of 
the base plot produced by varImpPlot()? In that case you should first 
assign the invisibly returned result of varImpPlot() (or that of 
importance()) to a variable, and then plot it with ggplot().
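
A minimal sketch of that second option (untested; 'my_model' stands for
your randomForest object, and the available importance columns depend on
how the forest was grown):
library(ggplot2)
imp <- as.data.frame(importance(my_model))
imp$band <- rownames(imp)
ggplot(imp, aes(x = reorder(band, MeanDecreaseGini), y = MeanDecreaseGini)) +
    geom_col() +
    coord_flip()
# class-specific columns appear in importance() only if the forest was
# grown with importance = TRUE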


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2019.10.29. 15:51 keltezéssel, Sarah Goslee írta:

It's not entirely clear to me how you expect the values to be
calculated, but you might look at partial dependence plots, for
example as implemented in the pdp package.

Sarah

On Sun, Oct 27, 2019 at 6:00 AM Enoch Gyamfi Ampadu  wrote:

Dear List,

I hope this finds you well. Please I have done a Land cover classification
using randomForest. I used selected satellite image bands as input
variables. I have four thematic classes. I obtained the general variable
importance using *varImpPlot (my model) . *Now I want to have bar chart
plot showing each thematic class and the variable importance for each
class. That is, how each band is ranked in predicting a thematic class. I
tried using this function:

*importance = varImp(my model, scale= TRUE)*

*plot (importance. *

However it did not work. I tried to put in *geom= "bar*". And I had error.

Please I will be glad to get some assistance.

Thank you.

Best regards,

Enoch





___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Splitting polygon dataset

2019-08-23 Thread Bede-Fazekas Ákos

Dear Enoch,

There is a typo in your code that causes the error: the function c() is 
not equivalent to C().

Solution:
id <- sample(2, nrow(dfTrainshape), prob = c(0.7, 0.3), replace = TRUE)

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.08.23. 12:17 keltezéssel, Enoch Gyamfi Ampadu írta:

Dear List,

Please I have been implementing Random Forest in R for the to classify
forest cover. I am doing it for 4 main classes. I Have extracted the pixel
values of the bands with that of the training polygons. In all I had 226
observations and the 8 bands as the response variables.

I tried to split it into 70% for training set and 30% for as testing set
sing the codes below;

#setting training and testing samples
set.seed(999)

id <- sample(2, nrow(dfTrainshape), prob = C(0.7, 0.3), replace = TRUE)
dfTrainshape_train <- dfTrainshape[id==1,]
dfTrainshape_test <- dfTrainshape[id==2,]

I had the error below;

set.seed(999)

id <- sample(2, nrow(dfTrainshape), prob = C(0.7, 0.3), replace = TRUE)

Error in C(0.7, 0.3) : object not interpretable as a factor

dfTrainshape_train <- dfTrainshape[id==1,]

Error in `[.data.frame`(dfTrainshape, id == 1, ) : object 'id' not found

I will be glad to have some advice and probable some code to assist me.

Secondly,

Please I also want to create a separate testing polygons for the validation
in ArcMap. I want to know how I will be able to use the 226 observations of
the earlier set of polygons for the training and the new polygons for
validation. I will be glad to have some codes which I can change to suite
what I want to do.

Hope to hear from you.

Best regards,

Enoch




___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Rasterize point process using specific rule

2019-08-20 Thread Bede-Fazekas Ákos

Dear Alexandre,
I'm not familiar with point processes, but as far as I understand the 
task, it can be easily solved using sp+raster or sf+raster.


library(sf)
library(raster)
points_sf <- st_as_sf(data.frame(xp, yp, area), coords = c("xp", "yp"), 
crs = 32629)
circles_sf <- st_buffer(x = points_sf, dist = sqrt(points_sf$area / pi)) # 
if it is really the area, or dist = points_sf$area if it is the radius

empty_raster <- raster(points_sf, resolution = 1)
filled_raster <- rasterize(x = circles_sf, y = empty_raster, field = 1, 
fun = function(x, na.rm) 1, background = 0)

plot(filled_raster)

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2019.08.20. 15:12 keltezéssel, ASANTOS via R-sig-Geo írta:

Dear members,

I've like to create a raster using my event coordinates and information
about the size in this coordinates as a rule for my raster creation.
First, I make:

library(raster)
library(spatstat)

#Points coordinates in UTM
xp<-c(371278.588,371250.722,371272.618,371328.421,371349.974,
371311.95,371296.265,371406.46,371411.551,371329.041,371338.081,
371334.182,371333.756,371299.818,371254.374,371193.673,371172.836,
371173.803,371153.73,371165.051,371140.417,371168.279,371166.367,
371180.575,371132.664,371129.791,371232.919,371208.502,371289.462,
371207.595,371219.008,371139.921,371133.215,371061.467,371053.69,
371099.897,371108.782,371112.52,371114.241,371176.236,371159.185,
371159.291,371158.552,370978.252,371120.03,371116.993)

yp<-c(8246507.94,8246493.176,8246465.974,8246464.413,8246403.465,
8246386.098,8246432.144,8246394.827,8246366.201,8246337.626,
8246311.125,8246300.039,8246299.594,8246298.072,8246379.351,
8246431.998,8246423.913,8246423.476,8246431.658,8246418.226,
8246400.161,8246396.891,8246394.225,8246400.391,8246370.244,
8246367.019,8246311.075,8246255.174,8246255.085,8246226.514,
8246215.847,8246337.316,8246330.197,8246311.197,8246304.183,
8246239.282,8246239.887,8246241.678,8246240.361,8246167.364,
8246171.581,8246171.803,8246169.807,8246293.57,8246183.194,8246189.926)

#Now I have the size of each nest in meters (marked process)

area<-c(117,30,4,341,15,160,35,280,108,168,63,143,2,48,182,42,
88,56,27,156,288,45,49,234,72,270,91,40,304,56,35,4,56.7,9,4.6,
105,133,135,23.92,190,12.9,15.2,192.78,104,255,24)

# Make a contour for the window creation
W <- convexhull.xy(xp,yp)

#Create ppm object
syn.ppp <- ppp(x=xp,y=yp,window=W, marks=area)

# Plot the point process
plot(syn.ppp)

#Definition of raster resolution - 1 meter square
ext <- as(extent(c(W$xrange,W$yrange)), "SpatialPolygons")
syn.ras.res<-rasterToPoints(raster(ext, resolution = 1), spatial = TRUE)

#Now Rasterize
syn.ras<- rasterize(syn.ras.res@coords, raster(syn.ras.res), *?*)

I' ve like to create some kind of rule in *?*, where pixel values in
my final raster was 1 when inside de area (marks=area) and zero for
outside de circular area given by points coordinates
(syn.ras.res@coords) using 1 meter as resolution?

Any ideas, please

Thanks,



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] slow computation progress for calc function

2019-06-25 Thread Bede-Fazekas Ákos
Dear Sara,
Hm.. so sorry, I have no more idea...
Ákos

2019.06.25. 9:55 keltezéssel, Sara Shaeri írta:
> Dear Ákos,
> I tried it on a small partition of my data. However, the processing 
> time is the same :(. Any other suggestion?
>
> -
> *Sara Shaeri Karimi*
> PhD Candidate, Department of Environmental Sciences
> Faculty of Science and Engineering
> Macquarie University, NSW 2109 Australia
> /Researchgate <https://www.researchgate.net/profile/Sara_Shaeri>
> /
> /Linkedin <https://www.linkedin.com/in/sara-shaeri-karimi-94351433>/
>
>
>
>
>
>
>
>
> On Tuesday, June 25, 2019, 4:33:34 PM GMT+10, Bede-Fazekas Ákos 
>  wrote:
>
>
> Dear Sara,
>
> it is faster if you first convert the integer vector to logical, and
> then run rle().
>
> set.seed(12345)
> random_sample <- sample.int(n = 10, size = 1e5, replace = TRUE) - 1
>
> original <- function(x){
>     y <- rle(x)
>     return(max(y$lengths[y$values == 0]))
> }
>
> faster <- function(x){
>     y <- rle(x == 0)
>     return(max(y$lengths[y$values]))
> }
>
> original(random_sample) == faster(random_sample)
>
> library(microbenchmark)
> microbenchmark(
>     original(random_sample),
>     faster(random_sample)
> )
>
> Hence, this may be faster:
> interflood <- clusterR(all_predictions, calc, args=list(function(x){y <-
> rle(as.numeric(x) == 0); return(max(y$lengths[y$values]))}))
>
> HTH,
> Ákos Bede-Fazekas
> Hungarian Academy of Sciences
>
> 2019.06.25. 3:32 keltezéssel, Sara Shaeri via R-sig-Geo írta:
> > Dear community,
> > I’m trying to use the Calc function in a raster stack (8000 binary 
> format rasters). Each raster covers 2,421,090 cells, as such, I’m 
> using parallel coding to make the most of the available cores for this 
> computation, however, this process is extremely slow on a 36 core and 
> 76 Gb RAM. How could I speed up the calculations? This is the code I’m 
> using:
> >
> > beginCluster(30)
> >
> > interflood <- clusterR(all_predictions, calc, 
> args=list(function(x){y <- rle(as.numeric(x));return(max( 
> y$lengths[y$values == 0]))}))
> >
> > endCluster()
> >
> >
> > RegardsSara
> >
> >
> > -
> > Sara Shaeri KarimiPhD Candidate, Department of Environmental 
> SciencesFaculty of Science and EngineeringMacquarie University, NSW 
> 2109 Australia
> > Researchgate
> > Linkedin
>
> >
> >
> >
> >
> >
> >
> >     [[alternative HTML version deleted]]
> >
> > ___
> > R-sig-Geo mailing list
> > R-sig-Geo@r-project.org <mailto:R-sig-Geo@r-project.org>
> > https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> >
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org <mailto:R-sig-Geo@r-project.org>
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] slow computation progress for calc function

2019-06-25 Thread Bede-Fazekas Ákos

Dear Sara,

it is faster if you first convert the integer vector to logical, and 
then run rle().


set.seed(12345)
random_sample <- sample.int(n = 10, size = 1e5, replace = TRUE) - 1

original <- function(x){
    y <- rle(x)
    return(max(y$lengths[y$values == 0]))
}

faster <- function(x){
    y <- rle(x == 0)
    return(max(y$lengths[y$values]))
}

original(random_sample) == faster(random_sample)

library(microbenchmark)
microbenchmark(
    original(random_sample),
    faster(random_sample)
)

Hence, this may be faster:
interflood <- clusterR(all_predictions, calc, args=list(function(x){y <- 
rle(as.numeric(x) == 0); return(max(y$lengths[y$values]))}))


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.06.25. 3:32 keltezéssel, Sara Shaeri via R-sig-Geo írta:

Dear community,
I’m trying to use the Calc function in a raster stack (8000 binary format 
rasters). Each raster covers 2,421,090 cells, as such, I’m using parallel 
coding to make the most of the available cores for this computation, however, 
this process is extremely slow on a 36 core and 76 Gb RAM. How could I speed up 
the calculations? This is the code I’m using:

beginCluster(30)

interflood <- clusterR(all_predictions, calc, args=list(function(x){y <- 
rle(as.numeric(x));return(max( y$lengths[y$values == 0]))}))

endCluster()


RegardsSara


-
Sara Shaeri KarimiPhD Candidate, Department of Environmental SciencesFaculty of 
Science and EngineeringMacquarie University, NSW 2109 Australia
Researchgate
Linkedin






[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Modelling: opinion of new methods

2019-06-07 Thread Bede-Fazekas Ákos

Dear Lara,

Several correlative species distribution models exist (and are currently 
used), including machine learning approaches such as artificial neural 
networks (ANN), boosted regression trees (BRT / GBM), genetic algorithms 
(GA), and random forests (RF).

You may find the papers comparing or grouping them interesting:
- Dormann CF, Elith J, Bacher S, Buchmann C, Carl G, Carré G, García 
Marquéz JR, Gruber B, Lafourcade B, Leitão PJ, Münkemüller T, McClean C, 
Osborne PE, Reineking B, Schröder B, Skidmore AK, Zurell D, Lautenbach S 
(2013): Collinearity: a review of methods to deal with it and a 
simulation study evaluating their performance. Ecography 36(1): 27–46.
- Elith J, Graham CH (2009): Do they? How do they? Why do they differ? 
On finding reasons for differing performances of species distribution 
models. Ecography 32(1): 66–77.
- Heikkinen RK, Luoto M, Araújo MB, Virkkala R, Thuiller W, Sykes MT 
(2006): Methods and uncertainties in bioclimatic envelope modelling 
under climate change. Progress in Physical Geography 30(6): 751–777.
- Hernandez PA, Graham CH, Master LL, Albert DL (2006): The effect of 
sample size and species characteristics on performance of different 
species distribution modeling methods. Ecography 29(5): 773–785.
- Pearson RG, Thuiller W, Araújo MB, Martinez-Meyer E, Brotons L, 
McClean C, Miles L, Segurado P, Dawson TE, Lees DC (2006): Model-based 
uncertainty in species’ range prediction. Journal of Biogeography 
33(10): 1704–1711.
- Thuiller W (2004): Patterns and uncertainties of species’ range shifts 
under climate change. Global Change Biology 10(12): 2020–2027.


Also see package 'sdm' by Naimi and Araújo.
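
For example, several of these methods can be fitted and compared at once
with 'sdm' (untested sketch; 'occurrence_points' and 'climate_stack' are
placeholders for your own data):
library(sdm)
d <- sdmData(species ~ ., train = occurrence_points, predictors = climate_stack)
m <- sdm(species ~ ., data = d, methods = c("glm", "gam", "brt", "rf"))
p <- predict(m, newdata = climate_stack)
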
HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.06.07. 15:45 keltezéssel, Lara Silva írta:

Hello everyone!

I am new in modelling species (plants). However, I used some approaches
like ENFA (Biomapper), Maxent, generalized linear and additive models (in
R) and Biomod2.

I would like to improve my knowledge in this area.Currently, what methods
have been used?
Any suggestions?

Regards,

Lara



[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] [R] Plotting more than one regression line in ggplot

2019-06-05 Thread Bede-Fazekas Ákos

Hello,
this list is 'for discussing the development and use of R functions and 
packages for handling and analysis of spatial, and particularly 
geographical, data'. Please do not post questions to r-sig-geo that have 
nothing to do with GIS.
Anyway, the ggplot() + geom_jitter() + geom_smooth() combination works 
perfectly on the data you sent (after changing the parameters of aes() 
to x and y) on my computer.
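
For the record, something like this works for me (sketch, with the data
frame called onepctCO2MEDIAN as in your message):
library(ggplot2)
ggplot(onepctCO2MEDIAN, aes(x = x, y = y)) +
    geom_jitter() +
    geom_smooth(method = "lm")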

Best wishes,
Ákos

2019.06.05. 17:04 keltezéssel, rain1290--- via R-sig-Geo írta:

Hi Jeff (and everyone),

Thank you for your response and feedback. Yes, I know what you mean - it was a blind and 
quick choice to use "lm" as my object name. Unfortunately, changing the object 
name to something else does not eliminate that error/warning message. As a result, the 
same error/warning appears when running it. Oddly enough, the scatter plot is just fine - 
it's the regression line that struggles to appear. Could there be another reason for that?
Thanks, again,


-Original Message-
From: Jeff Newmiller 
To: rain1290 ; rain1290--- via R-help ; r-help 
; r-sig-geo 
Sent: Wed, Jun 5, 2019 10:49 am
Subject: Re: [R] Plotting more than one regression line in ggplot

Please read the Posting Guide... posting HTML on a plain text mailing list 
really interferes with clear communication.

If you had spent even a small amount of time working with R tutorials then you would know that "lm" is the name of a very basic, 
very important R function. However, you are defining your own object called "lm" that is very different indeed than the usual 
"lm" function. I would guess that in a clean new R workspace where you had not already run your ggplot function and assigned the 
result to your own "lm" object then this code might run. However, once you have run it once and try to run it again, your 
"method" argument gives the wrong version of "lm" to geom_smooth and you confuse it.

As the doctor said to the man pounding his own head against the wall, "If it hurts, 
don't do that." Avoid re-using important object names in R... some common names I 
see abused this way are df, data, c, t, T, and F. Your choice was unusual, but quite 
effective at illustrating the problem.

On June 5, 2019 7:21:57 AM PDT, rain1290--- via R-help  
wrote:

I am trying to plot, using ggplot, a series of scatter plots with
regression lines for several datasets. I started with the following
dataset, "onepectCO2MEDIAN". The data for this dataset is as follows:
     onepctCO2MEDIAN
     x  y
     layer.1   0.0  0.000
     layer.2   0.006794447  4.9002490
     layer.3   0.014288058  0.1608000
     layer.4   0.022087920  6.6349133
     layer.5   0.030797357 -1.2429506
     layer.6   0.038451072  1.5643374
     layer.7   0.048087904 -2.2659035
     layer.8   0.058677729  2.2070045
     layer.9   0.069261406 -2.3677001
     layer.10  0.080524530 -1.0913506
     layer.11  0.092760246  0.4099940
     layer.12  0.103789609 -0.1259727
     layer.13  0.116953168 -2.4138253
     layer.14  0.129253298  7.0890257
     layer.15  0.141710050 -0.7593539
     layer.16  0.156002052  0.0454416
     layer.17  0.170648172 -1.5349683
     layer.18  0.185318425  6.5524201
     layer.19  0.199463055 -0.8312563
     layer.20  0.213513337 -2.5099183
     layer.21  0.228839271  0.1365968
     layer.22  0.246981293 -1.3719845
     layer.23  0.263012767 -0.8712988
     layer.24  0.278505564  0.6632584
     layer.25  0.293658361  0.7938036
     layer.26  0.310747266  3.4880637
     layer.27  0.325990349 -4.4612208
     layer.28  0.342517540  0.0871734
     layer.29  0.362751633 -1.4171578
     layer.30  0.380199537 -0.9956508
     layer.31  0.394992948  0.3215526
     layer.32  0.414373398  3.1403866
     layer.33  0.430690214 -0.7376099
     layer.34  0.449738145 -2.4860541
     layer.35  0.470167458 -3.4235858
     layer.36  0.489019871  0.4824748
     layer.37  0.507242471 -0.9785386
     layer.38  0.524314284  8.5359684
     layer.39  0.543750525  5.4844742
     layer.40  0.564234197  3.2149367
     layer.41  0.583679616  3.9168916
     layer.42  0.601459444  4.4907020
     layer.43  0.619924664  6.5410410
     layer.44  0.639932007  4.8068650
     layer.45  0.661347181  8.1510170
     layer.46  0.684117317  0.2697413
     layer.47  0.704829752 -0.1807501
     layer.48  0.725045770  9.7181249
     layer.49  0.745165825  1.5406466
     layer.50  0.765016139 -1.6476041
     layer.51  0.783461511  4.8024603
     layer.52  0.806382924  4.0421516
     layer.53  0.829241335  9.3756512
     layer.54  0.849924415  5.3305050
     layer.55  0.871352434  7.5445803
     layer.56  0.893632233  6.4679547
     layer.57  0.916052133  2.8096065
     layer.58  0.938579470  5.3921661
     layer.59  0.959907651  7.2043689
     layer.60  0.981643587  3.3350806
     layer.61  1.004116774  8.8690707
     layer.62  1.028363466  1.7861299
     layer.63  1.054009140  6.2555038
     layer.64  1.072440803  7.6079236
     layer.65  1.094457805  7.6871483
     

Re: [R-sig-Geo] Creating a median curve among multiple curvesin ggplot

2019-06-01 Thread Bede-Fazekas Ákos
Hi,
median() automatically sorts the data, so the order of the rasters is 
not important, I think.
Have a nice weekend,
Ákos

2019.06.01. 19:23 keltezéssel, rain1...@aim.com írta:
> Hi Akos,
>
> Actually, I believe both of these worked just fine after testing them 
> and returned the 140 values in each case! For the second variable, I 
> first received an error after running it, but this was because the 
> number of years/layers differed slightly for one of the objects 
> (Model42), being 138 years/layers. So, I just modified the code to 
> account for this (i.e. 1:138), and it worked fine. :)
>
> Also, I need not arrange the raster objects in ascending order, 
> correct? My sense is that it automatically sorts the values for each 
> object from lowest to highest for each year for each grid cell. At 
> least that is what it did for the 1-dimensional objects for the first 
> variable, as the median values shown there are correctly ascending (as 
> they should be with each year).
>
> Many, many thanks, once again!
>
> -Original Message-
> From: Bede-Fazekas Ákos 
> To: rain1290 ; r-sig-geo 
> Sent: Sat, Jun 1, 2019 12:47 pm
> Subject: Re: [R-sig-Geo] Creating a median curve among multiple 
> curvesin ggplot
>
> Hello,
> Sorry for this, it was wrong. This might work:
> RCP1pctCO2Median <- apply(X = cbind(get, IPSL, IPSLMR, IPSL5, MIROC, 
> HadGEM, MPI, MPI5, GFDL, GFDL5), MARGIN = 1, FUN = median)
>
> OK, I understand now the structure of your rasters.
> Something like this may do the task:
> stack(lapply(X = 1:140, FUN = function(year) {calc(x = stack(lapply(X 
> = list(Model2, Model10, Model18, Model26, subset14, Model42, subset20, 
> subset24, Model60, Model68), FUN = "[[", i = year)), fun = median)}))
> (Again, this is not tested. If you send us reproducible examples then 
> we can try the code what we suggest you.)
> HTH,
> Ákos
>
> 2019.06.01. 18:06 keltezéssel, rain1...@aim.com 
> <mailto:rain1...@aim.com> írta:
> Hi Akos (and everyone),
>
> Thank you kindly for this response!
>
> I tried your suggestions, but for the first variable:
>
> RCP1pctCO2Median <- median(apply(X = cbind(get, IPSL, IPSLMR, IPSL5,
> MIROC, HadGEM, MPI, MPI5, GFDL, GFDL5)), MARGIN = 1, FUN = median)
>
> I receive this error:
>
> Error in match.fun(FUN) : argument "FUN" is missing, with no default
>
> Why would that appear?
>
> Also, for the raster objects for the second variable, yes, what you 
> suggested worked, but this returns only the single median value per 
> grid cell for the 140 years, as opposed to 140 medians (i.e. a median 
> for each year/each layer) for each grid cell. Ideally, I would like to 
> derive 140 medians corresponding to each of the 140 years/layers for 
> each grid cell. Is there a way to do this?
>
> Here is what some of the values/structure look like for the object Model2:
>
> > head(Model2) X1 X2 X3 X4 X5 X6 X7 X8 X9 X10 X11 [1,] 7.395703 6.493830 
> 7.432156 6.767403 11.66004 7.158040 13.968703 8.139962 7.927880 
> 10.367045 8.560315 [2,] 7.277671 6.318810 8.406366 5.974478 12.51484 
> 7.405914 15.350679 7.977879 8.039850 9.696597 9.374163 [3,] 6.919258 
> 6.119063 9.485675 6.256432 13.44547 7.542754 10.846225 8.441847 
> 8.155234 10.129576 10.074353 [4,] 6.633444 6.300559 10.349174 6.733875 
> 14.35567 7.589797 12.452223 8.384633 8.697707 10.025741 10.589373 [5,] 
> 6.986749 7.057785 11.202586 8.037964 15.08123 7.668468 11.838186 
> 8.506311 8.978721 9.66 10.256382 [6,] 7.287383 7.561100 12.091483 
> 8.384658 15.56648 7.662876 8.925544 9.431210 8.979276 10.047477 
> 10.898319 [7,] 7.957773 7.732338 13.998432 8.982379 15.50928 7.599275 
> 8.548879 8.287948 8.745722 9.496011 10.861937 X12 X13 X14 X15 X16 X17 
> X18 X19 X20 X21 X22 [1,] 5.533977 10.534669 12.136978 8.897428 
> 10.07571 8.485538 7.731084 10.557160 15.36603 12.320665 11.274374 [2,] 
> 6.010256 10.299062 12.703854 9.522287 11.08078 8.996936 6.919732 
> 10.900610 14.44405 11.949420 11.202058 [3,] 6.590125 9.553597 
> 12.781633 9.015289 11.35786 9.215069 6.888467 9.571572 13.47964 
> 10.308230 9.772387 [4,] 7.042177 8.800974 12.292124 11.457335 11.96268 
> 9.329095 7.078347 9.072567 13.90133 9.780438 9.901325 [5,] 7.613796 
> 9.617242 11.511468 11.527631 12.94864 9.510409 7.504535 8.380163 
> 14.01753 9.804405 10.389509 [6,] 8.146129 9.211740 11.250921 11.483143 
> 13.93948 8.881543 7.888670 8.917588 13.32526 8.609215 9.584101 [7,] 
> 8.551682 8.858268 10.417808 11.292067 15.19191 9.224810 8.231571 
> 7.949825 12.58279 8.042569 9.816563
>
> Thanks, once again, and I look forward to your response!
>
> -Original Message-
> From: Bede-Fazekas Ákos  
> <mailto:bfalevl...@gmail.com>
> To: r-sig-g

Re: [R-sig-Geo] Parallel processing with extract()/randomForest() in VM

2019-05-27 Thread Bede-Fazekas Ákos

Dear Asantos,

Instead of
exr<-raster::extract(r, polys)
you can use:

beginCluster(8)
exr<-raster::clusterR(x = r, fun = function (raster) {raster::extract(x 
= raster, y = polys)}, export = "polys")

endCluster()

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.05.28. 3:47 keltezéssel, ASANTOS via R-sig-Geo írta:

Dear R-Sig-Geo Members,

  ?? I create a virtual machine (VM) in Google Cloud with Ubuntu 18.04
with 8 CPU and 30 RAM memory and R 3.6.0 version, but I try to improve
my spatial analysis without success or same a more faster process. If I
use packages snow and doMC with all the 8 CPU's in an operation, but it
use in only 12,54% of our capacity, when the objective is user
extraction() in raster and RF with randomForest(). The gain of 18
seconds, I think that is not so good, then my question is there are any
way for improve that? In my test, I make:

# Take in the ubuntu terminal the number of processors
foresteyebrazil@superforettech1:~$cat/proc/cpuinfo|grepprocess|wc-l
8
#Packages
library(raster)
library(snow)
library(doMC)
library(randomForest)
registerDoMC()
#Take a raster for worldclim
r<-getData('worldclim', var='alt', res=5)
# 1) Use extract()/ randomForest() in Virtual Machine

start_time<-Sys.time()
# SpatialPolygons
cds1<-rbind(c(-180,-20), c(-160,5), c(-60, 0), c(-160,-60), c(-180,-20))
cds2<-rbind(c(80,0), c(100,60), c(120,0), c(120,-55), c(80,0))
polys<-spPolygons(cds1, cds2)
# Extract
exr<-raster::extract(r, polys)
tr<-ifelse(exr[[2]]<10,c("A"),c("B"))
df<-cbind(tr,exr[[2]], sqrt(exr[[2]]))
df2<-data.frame(as.factor(df[,1]),as.numeric(as.character(df[,2])),as.numeric(as.character(df[,3])))
df2<-df2[complete.cases(df2),]
colnames(df2)<-c("res1","var1","var2")
res<-NULL
for(win1:9){
mod_RF<-randomForest(x=cbind(df2$var1,df2$var2), y=df2$res1, ntree=100,
mtry=2)
res=rbind(res,cbind(w,mean(mod_RF$err.rate[,1])*100))
}
end_time<-Sys.time()
end_time-start_time
#
#Time difference of 38.72528 secs
# 2) Use extract() with snow and doMC packages in Virtual Machine

start_time<-Sys.time()
# SpatialPolygons
cds1<-rbind(c(-180,-20), c(-160,5), c(-60, 0), c(-160,-60), c(-180,-20))
cds2<-rbind(c(80,0), c(100,60), c(120,0), c(120,-55), c(80,0))
polys<-spPolygons(cds1, cds2)
# Extract
beginCluster(n=8)
exr<-raster::extract(r, polys)
tr<-ifelse(exr[[2]]<10,c("A"),c("B"))
df<-cbind(tr,exr[[2]], sqrt(exr[[2]]))
df2<-data.frame(as.factor(df[,1]),as.numeric(as.character(df[,2])),as.numeric(as.character(df[,3])))
df2<-df2[complete.cases(df2),]
colnames(df2)<-c("res1","var1","var2")
endCluster()
res<-NULL
mod_RF2<-foreach(1:9) %dopar%{
randomForest(x=cbind(df2$var1,df2$var2), y=df2$res1, ntree=100, mtry=2)
}
res=rbind(res,cbind(mean(mod_RF2$err.rate[,1])*100))
}
end_time<-Sys.time()
end_time-start_time
#
#Time difference of 20.57027 secs

Thanks in advanced,



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Reproducible results with overlay and clusterR

2019-05-27 Thread Bede-Fazekas Ákos

Dear Hugo,
please find below a solution. It might be too complicated and shorter 
solutions might exist, but it works.


library(raster)
r1 <- r2 <- raster()
r1[]<-runif(ncell(r1), 1, 100)
r2[]<-1
r2[1:3]<-2
r2[30001:35000]<-NA
plot(stack(r1,r2))


## Function
prob<-data.frame(value=1:100, r1=log(1:100), r2=max(log(1:100))-log(1:100))
f <- function(x,y,seed,...){
  ifelse(is.na(y),NA,rbinom(n=1, size=1, prob=prob[[y]][x]))
}
f <- Vectorize(f)

## Wrapper function
wrapper <- function(raster,seed,...){
    set.seed(seed)
    overlay(raster,fun=f)
}

## Overlay (reproducible)
result1<-wrapper(stack(r1,r2),seed=0)
result2<-wrapper(stack(r1,r2),seed=0)
compareRaster(result1,result2,values=TRUE)

## Cluster overlay (Not reproducible)
beginCluster(2)
result3<-clusterR(stack(r1,r2),wrapper,args=list(seed=0), 
export=c("prob", "f"))

endCluster()

beginCluster(2)
result4<-clusterR(stack(r1,r2),wrapper,args=list(seed=0), 
export=c("prob", "f"))

endCluster()

compareRaster(result3,result4,values=TRUE)

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.05.27. 22:59 keltezéssel, Hugo Costa írta:

Dear all
I need to implement a raster::overlay operation with clusterR (which
works), but don't figure out how to do it reproducible. How to set seeds in
the cluster? Please find a example below.
Thank you
Hugo

## Rasters
library(raster)
r1 <- r2 <- raster()
r1[]<-runif(ncell(r1), 1, 100)
r2[]<-1
r2[1:3]<-2
r2[30001:35000]<-NA
plot(stack(r1,r2))


## Function
prob<-data.frame(value=1:100, r1=log(1:100), r2=max(log(1:100))-log(1:100))
f <- function(x,y,...){
   ifelse(is.na(y),NA,rbinom(n=1, size=1, prob=prob[[y]][x]))
}
f <- Vectorize(f)

## Overlay (reproducible)
set.seed(0)
result1<-overlay(stack(r1,r2), fun=f)
set.seed(0)
result2<-overlay(stack(r1,r2), fun=f)
compareRaster(result1,result2,values=TRUE)

## Cluster overlay (Not reproducible)
set.seed((0))
beginCluster(2)
result3<-clusterR(stack(r1,r2),overlay,args=list(fun=f), export="prob")
endCluster()

set.seed((0))
beginCluster(2)
result4<-clusterR(stack(r1,r2),overlay,args=list(fun=f), export="prob")
endCluster()

compareRaster(result3,result4,values=TRUE)

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] focal analysis on raster

2019-05-23 Thread Bede-Fazekas Ákos

Dear Jeff,
1. raster::extract(Raster*, SpatialPoints*)
2. raster::extract(Raster*, rgeos::gBuffer(SpatialPoints*, width, byid = 
TRUE))
If your points are Simple Features, use sf::st_buffer(x, dist) instead 
of rgeos::gBuffer().
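
A small sketch of both steps (untested; 'landsat' and 'pts' are placeholders
for your raster and sample points, which should share a projected CRS so
that the radii are in metres):
vals_at_points <- raster::extract(landsat, pts)
vals_200m <- raster::extract(landsat, pts, buffer = 200)   # list, one element per point
vals_1000m <- raster::extract(landsat, pts, buffer = 1000)
lapply(vals_1000m, table)   # pixel counts per land-use class around each point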


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.05.23. 3:56 keltezéssel, Stratford, Jeff via R-sig-Geo írta:

Hi everyone,

I am looking to estimate the number of pixels to quantify different
landuses around sampling points for different projects (bird diversity,
nests, predation rates on clay caterpillars, etc). Our base map is a
Landsat 7 map.

I can plot our sample points and plot the Landsat map (using raster) but
I'm not sure

1. How to overlay the sample points onto the Landsat map
2. How to do the focal analysis using the sample points as centers in
circular buffers of 200 and 1000 m radii

Any help would be greatly appreciated.

Many thanks,

Jeff


Jeffrey A. Stratford, PhD
Department of Biology and Health Sciences &
Director of Study Abroad
Office: Cohen Science Center 210
Address: 84 W South Street
Wilkes University, PA 18766 USA
https://sites.google.com/a/wilkes.edu/stratford/home
Blog https://wordpress.com/posts/concreteornithology.blog


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] add a field to sf object and point shape in kml

2019-05-07 Thread Bede-Fazekas Ákos

Dear Marta,

I have an answer only to your first question.
The way you created the new column (a$b = 2) is correct.
The order of the columns and the place of the geometry column are 
usually not so important, but if you want to rearrange the columns, just 
use one of the following lines:
a <- a[, c(colnames(a)[!(colnames(a) %in% attr(a, "sf_column"))], 
attr(a, "sf_column"))]

a <- st_sf(cbind(st_set_geometry(a, NULL), st_geometry(a)))

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2019.05.07. 13:54 keltezéssel, Marta Rufino írta:

Hi,

Two very simple question:

1)
What is the best way to add a variable (field) to an sf object?

# For example, if I do:
(a = st_sf(a=1, geom = st_sfc(st_point(0:1
# I would expect this would work, and it does, but then it makes some kind
of a nested structure which I cannot work with properly.
# Is this the proper way to add fields or there s another one?
a$b = 2
a
# you can see that the second field is after the geometry, which shows it
is nested.

2)
Can we change the polygon col/fill and point shape/col when exporting sf
obejcts to kml, using the function:
st_write(sf.object, " sf.object  .kml", driver='kml')


Thank you very much in advance,
Best wishes,
M.

--
My first steps in sf: https://rpubs.com/MRufino/488297

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Match coordinates with NUTS3 regions

2019-04-03 Thread Bede-Fazekas Ákos

Dear Michele,
make sure that the order of the coordinates is the same for the two objects.
proj4string(EU_NUTS29) starts with "+proj=longlat" (longitude first, then 
latitude), while coordinates(awards_with_coordinates_first2500) was defined 
as "~g_lat + g_lon" (latitude first, then longitude).
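
So swapping the order when defining the coordinates should be enough:
coordinates(awards_with_coordinates_first2500) <- ~g_lon + g_lat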

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.04.04. 1:47 keltezéssel, Michele Molteni írta:

Hi everyone,

I have a .dta file with coordinates (WGS 84) and I would like to match them
to NUTS3 regions. I followed indications from a previous post but I get no
match even though the coordinates fall within NUTS3 bboxes. Could you
please be so kind to help me understand what I am doing wrong? This is what
I have so far:


library(rgdal)

## Download Administrative Level data from EuroStat

temp <- tempfile(fileext = ".zip")
download.file("

https://ec.europa.eu/eurostat/cache/GISCO/distribution/v2/nuts/download/ref-nuts-2013-60m.shp.zip;,
temp)
trying URL '
https://ec.europa.eu/eurostat/cache/GISCO/distribution/v2/nuts/download/ref-nuts-2013-60m.shp.zip
'
Content type 'application/zip' length 2330060 bytes (2.2 MB)
downloaded 2.2 MB

unzip(temp)

##Read data after having unzipped it
EU_NUTS <- readOGR(dsn = "C:/Users/User/Documents/TED_project/Replicating

paper/Shapefile", layer="NUTS_RG_60M_2013_4326_LEVL_3")
OGR data source with driver: ESRI Shapefile
Source: "C:\Users\User\Documents\TED_project\Replicating paper\Shapefile",
layer: "NUTS_RG_60M_2013_4326_LEVL_3"
with 1480 features
It has 5 fields

##Consider only EU29 countries (1361 NUTS3)
EU_NUTS29 <- EU_NUTS[which(EU_NUTS@data$CNTR_CODE %in% c('BE', 'BG',

'CZ', 'DK', 'DE', 'EE', 'IE', 'EL', 'ES', 'FR', 'HR', 'IT', 'CY', 'LV',
'LT', 'LU', 'HU', 'MT', 'NL', 'AT', 'PL', 'PT', 'RO', 'SI', 'SK', 'FI',
'SE', 'UK', 'NO')), ]

##Import data contaning lat and lon and consider first 2500 observations
awards_with_coordinates_first2500 <- read_dta("~/TED_project/Replicating

paper/awards_with_coordinates_first2500.dta")

awards_with_coordinates_first2500 <- awards_with_coordinates_first2

500[(1:2500),]

## Here is the projection for map_nuts data
proj4string(EU_NUTS29)

[1] "+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"

##Transform g_lat and g_lon in numeric
awards_with_coordinates_first2500$g_lat <- as.numeric(as.character(awards

_with_coordinates_first2500$g_lat))

awards_with_coordinates_first2500$g_lon <- as.numeric(as.character(awards

_with_coordinates_first2500$g_lon))

## Make a spatial points dataframe from point_data
## Assuming the projection to be longlat on the WGS84 datum
## First define the coordinates
coordinates(awards_with_coordinates_first2500) <- ~g_lat + g_lon

## Then the projection
proj4string(awards_with_coordinates_first2500) <- CRS("+proj=longlat

+datum=WGS84")

## Now transform the projection of awards_with_coordinates_first2500 to

that of EU_NUTS29

awards_with_coordinates_first2500 <- 
spTransform(awards_with_coordinates_first2500,

proj4string(EU_NUTS29))

## Use the over() function to get the regions for the points
over(awards_with_coordinates_first2500,EU_NUTS29,by.id=)

 NUTS_ID LEVL_CODE CNTR_CODE NUTS_NAME  FID
1  NA 
2  NA 

200NA 
  [ reached 'max' / getOption("max.print") -- omitted 2300 rows ]

bbox(awards_with_coordinates_first2500)

 min  max
g_lat 34.794891 51.46780
g_lon -3.018564 33.63785

bbox(EU_NUTS29)

   minmax
x -61.806 55.826
y -21.350 71.127

##The points seem to be all in Africa
plot(EU_NUTS29)
plot(awards_with_coordinates_first2500, add = TRUE, col = "red", pch = 19)

Thanks in advance.

Cheers,
Michele.

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Simple choropleth using ggplot2

2019-01-17 Thread Bede-Fazekas Ákos

Dear Richard,
although I have a problem with your mmr.zip (R cannot unzip it) and 
could not try your code, I suggest migrating from sp to sf. ggplot2 can 
easily plot Simple Features data.
Use sf::st_as_sf() to convert your Spatial* data to Simple 
Features.
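
A minimal sketch with the objects from your script (untested, since I 
could not unzip the data):

library(sf)
library(ggplot2)
world.sf <- st_as_sf(world.map)                        # Spatial* -> Simple Features
world.sf <- merge(world.sf, mmr.data.df, by = "ISO")
world.sf$mmr <- as.numeric(as.character(world.sf$mmr)) # mmr was built as a factor above
ggplot(world.sf) + geom_sf(aes(fill = mmr))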

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.01.17. 3:19 keltezéssel, Richard Sherman írta:

Hello all,

I am trying to plot a simple choropleth, something I’ve done a while ago using 
rworldmap and also (if I recall correctly) ggplot2, but I am failing to draw 
the map at all and failing (I think) to merge my data properly with the 
shapefile. Thank you for help with a basic question. I’d like to know what is 
wrong with what I’m doing here.

My R script is

library(ggplot2)
library(rgdal)
library(plyr)

# get shapefile for world map
download.file("https://opendata.arcgis.com/datasets/252471276c9941729543be8789e06e12_0.zip;,
 destfile = "countries.zip”)

# get world bank maternal mortality data
download.file("http://api.worldbank.org/v2/en/indicator/SH.STA.MMRT?downloadformat=csv;,
 destfile = "mmr.zip”)

# get csv file with concordance between ISO-2-alpha and ISO-3-alpha country 
codes
download.file("https://raw.githubusercontent.com/rsspdx/mmr/master/iso_2_iso_3.csv;, 
destfile = "iso_2_iso_3.csv”)

# unzip the zipped files
mmr.files <- unzip("mmr.zip")
unzip("countries.zip”)

# read in maternal mortality data and fix it up
mmr.data <- read.csv(mmr.files[2], skip = 3, stringsAsFactors = FALSE)
mmr.data.name <- mmr.data$Country.Name
mmr.data.code <- mmr.data$Country.Code
mmr.data.mmr <- mmr.data$X2015
mmr.data.df <- as.data.frame(cbind(mmr.data.name, mmr.data.code, mmr.data.mmr))
names(mmr.data.df) <- c("Country.Name", "Country.Code", "mmr")

# read in the shapefile
world.map <- readOGR(dsn=".", layer = "UIA_World_Countries_Boundaries")

# - possibly I should be doing this 
#
# world.map@data$id <- rownames(world.map@data)
# world.map.df <- fortify(world.map)
#
# ---

#--or perhaps I need to merge the data into a data slot of the shapefile
#--but I can’t recall (or never knew?) how to do that

# get ISO2 country codes
iso_2_iso_3 <- read.csv("iso_2_iso_3.csv")

# ISO2 in this file is called ISO in the shapefile, create ISO variable
# then merge into mmr.data
iso_2_iso_3$ISO <- iso_2_iso_3$ISO2
mmr.data.df <- merge(iso_2_iso_3, mmr.data.df, by.x="ISO3", by.y="Country.Code")

# merge maternal mortality data into shapefile
mmr <- merge(world.map, mmr.data.df, by = "ISO")
mmr <- fortify(mmr)
str(mmr)

# -create a map, not working
map <- ggplot(data = mmr, aes(x = long, y = lat, group = group))

# -look at the map, obviously not working
map + geom_polygon(fill = mmr$mmr)


---
Richard Sherman
rss@gmail.com

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] clip a raster layer to a mask layer using raster algebra in R?

2018-12-06 Thread Bede-Fazekas Ákos

Dear Christian,
raster::mask() and raster::crop() may solve your problem.
https://www.rdocumentation.org/packages/raster/versions/2.8-4/topics/mask
https://www.rdocumentation.org/packages/raster/versions/2.8-4/topics/crop
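
For example, a minimal sketch following your GRASS expression 
if(RSLmask, gebco, null()):

library(raster)
RSLmask <- gebco >= 0                                # mask layer: 1 inside, 0 outside
gebco_clipped <- mask(gebco, RSLmask, maskvalue = 0) # cells where the mask is 0 become NA
## crop() can additionally cut the result to a smaller extent, if needed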

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2018.12.06. 9:40 keltezéssel, Christian Willmes írta:

Sorry, I had an error in the main command.

It should be:

gebco_clipped <- ifelse(RSLmask, gebco, NA)

But it still gives the same error.

Best,
Christian

Am 06.12.18 um 09:37 schrieb Christian Willmes:

Hello,

this should normally be a very basic, simple one-liner, but I can't 
get my head around it. I only find solutions for clipping a raster 
against a vector, but none for clipping a raster against a raster mask.


I want to clip a layer to a mask. In GRASS GIS r.mapcalc the syntax is:

result = if(RSLmask, gebco, null())

This clips the GEBCO dataset to the extent of the given raster mask 
'rslmask'. The result raster has height values within the raster mask 
extent, the rest of the raster layer has NULL or NA values.


In R I did the following, but it does not work as expected:

gebco <- raster(gebco2014) #load raster

RSLmask <- gebco >= 0  #create mask layer

RSL <- ifelse(RSLmask, 1, NA)

The last command gives an error about vector type 'S4'?!

>In is.na(test) :
  >is.na() auf nicht-(Liste oder Vektor) des Typs 'S4' angewendet

So, in short: what is the R way to simply clip a raster to a raster 
mask?


Thanks and best regards,
Christian

PS: If found many solutions for clipping a raster to a vector 
polygon, but this is not what I want.




___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Extract coordinates from rasterbrick

2018-08-14 Thread Bede-Fazekas Ákos

Dear Milu,

I think that you are looking for as.data.frame(x, xy = TRUE).
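
For example (a minimal sketch with your RasterBrick, here called x):

df <- as.data.frame(x, xy = TRUE) # columns x and y hold the cell coordinates,
                                  # followed by one column per layer/date
head(df[, 1:5])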

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2018.08.14. 17:03 keltezéssel, Miluji Sb írta:

Dear all,

I have the following rasterbrick (x);

class   : RasterBrick
dimensions  : 112, 272, 30464, 7305  (nrow, ncol, ncell, nlayers)
resolution  : 0.25, 0.25  (x, y)
extent  : -132, -64, 24, 52  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
data source : /projectnb/climpct/TEX_1986_2005.nc
names   : X1986.01.01, X1986.01.02, X1986.01.03, X1986.01.04,
X1986.01.05, X1986.01.06, X1986.01.07, X1986.01.08, X1986.01.09,
X1986.01.10, X1986.01.11, X1986.01.12, X1986.01.13, X1986.01.14,
X1986.01.15, ...
Date: 1986-01-01, 2005-12-31 (min, max)
varname : tex


I can obtain the values by;

as.data.frame(getValues(x))

However, I would like to extract the values by all coordinates in the file
in the form of a dataframe. Is it possible to do so? Any help will be
greatly appreciated. Thank you.

Sincerely,

Milu

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] raster::clump not working?

2018-05-25 Thread Bede-Fazekas Ákos

Dear Joao,
I think function clump() works as it should. A toy example:
x <- raster(matrix(sample(x = c(NA, NA, NA, 1), size = 36*18, replace = 
TRUE), ncol=36, nrow=18))

plot(x)
plot(clump(x))

Maybe you are searching for a function that aggregates those cells that 
have the same value?
In that case, the following would do what you want (I'm sure there is a more 
straightforward way...):

r <- raster(ncols=36, nrows=18)
x <- init(r, fun='cell')
y <- layerize(x)
ids <- calc(x = y, fun = function(layers) 
{return(which(as.logical(layers))[1])})

plot(ids)

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2018.05.25. 12:00 keltezéssel, João Carreiras írta:

Hi!

I've been trying to run the clump command but the output is consistently a
raster with values == 1. Please find below an example.

I'm sure I'm doing something stupid. However, the command is really
straightforward and I can't seem to identify what the problem might be.

Any help really appreciated.
Thanks
Joao

r <- raster(ncols=36, nrows=18)
x <- init(r, fun='cell')
x
class   : RasterLayer
dimensions  : 18, 36, 648  (nrow, ncol, ncell)
resolution  : 10, 10  (x, y)
extent  : -180, 180, -90, 90  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
data source : in memory
names   : layer
values  : 1, 648  (min, max)
a <- clump(x)
a
class   : RasterLayer
dimensions  : 18, 36, 648  (nrow, ncol, ncell)
resolution  : 10, 10  (x, y)
extent  : -180, 180, -90, 90  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
data source : in memory
names   : clumps
values  : 1, 1  (min, max)

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] how to plot different rows of a SpatialPolygonsDataFrame in trellis panels

2018-05-23 Thread Bede-Fazekas Ákos

Dear Scott,

use
spplot(tmp, zcol="z")
instead of
spplot(tmp, zcol=z)
to display the column "z".

Use
spplot(tmp[tmp$id == 1, ], zcol = "z")
to display only those grids that has id 1.

And to display two panels:
tmp$z1 <- tmp$z2 <- tmp$z
tmp$z1[tmp$id != 1] <- NA # panel "z1" shows only the polygons with id 1
tmp$z2[tmp$id != 2] <- NA # panel "z2" shows only the polygons with id 2
spplot(tmp, zcol=c("z1", "z2"))

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2018.05.23. 21:50 keltezéssel, Waichler, Scott R írta:

Roger,


Without having tried your code, how similar is this to these packages:

https://cran.r-project.org/package=micromap
https://cran.r-project.org/package=micromapST

Thanks for responding.  I looked at the micromap vignettes but it seems their 
focus is being able to provide small maps in one column in figures where the 
main emphasis is on statistics presented in other columns.  In my problem, all 
I want to show are maps.  I haven't been able to do this with a higher-level 
function like spplot or stplot.  I can imagine doing it with a custom panel 
function, where lpolygons() is used with input that depends on the panel.

In case it helps, I should clarify that the "different polygons" I referred to are 
polygons with different spatial coordinates; the attribute values are a small number of discrete 
values.  My polygons are representations of contaminant plumes in groundwater.  For each year, I 
have read in a shapefile containing polygons of the contaminant concentration at 2 to 4 levels.  I 
have combined all of these data into a SpatialPolygonsDataFrame, where each row represents a given 
year and concentration level, and contains one or more polygons.  The coordinates are different in 
each row, but years and concentration level are common across rows.  This seems to match the nature 
of a "long table" format in spacetime, but I haven't been able to work it out in that 
package, hence I'm stepping back and trying to get the basics working with spplot.

Thanks,
Scott


Hello,

I have a SpatialPolygonsDataFrame.  I would like to do a trellis plot on one

of the attributes, so that in the panel for a given attribute value, only those
polygons with that value are plotted.  So, each panel has different polygons
plotted in it.  I can't figure out how to do this.  In the toy example below, I
would like to create a trellis plot with one panel showing the polygons with id
= 1, and another panel showing the polygons with id = 2.

My goal beyond this toy problem is to do the same thing with stplot, where

panels correspond to times and each time has a different set of polygons
plotted.  Will that be possible?  In all the examples I can find of using stplot
for a space-time grid with the spatial objects being polygons, the polygons
are the same across time.

# based on example in help("SpatialPolygonsDataFrame-class")
Sr1 = Polygon(cbind(c(2,4,4,1,2),c(2,3,5,4,2)))
Sr2 = Polygon(cbind(c(5,4,2,5),c(2,3,2,2)))
Sr3 = Polygon(cbind(c(4,4,5,10,4),c(5,3,2,5,5)))
Sr4 = Polygon(cbind(c(5,6,6,5,5),c(4,4,3,3,4)), hole = TRUE)
Srs1 = Polygons(list(Sr1), "s1")
Srs2 = Polygons(list(Sr2), "s2")
Srs3 = Polygons(list(Sr3, Sr4), "s3/4")
SpP = SpatialPolygons(list(Srs1,Srs2,Srs3), 1:3)
grd <- GridTopology(c(1,1), c(1,1), c(10,10))
polys <- as(grd, "SpatialPolygons")
centroids <- coordinates(polys)
x <- centroids[,1]
y <- centroids[,2]
z <- 1.4 + 0.1*x + 0.2*y + 0.002*x*x
id = factor(sample(c(1,2), size=length(polys), replace=T))
tmp <- SpatialPolygonsDataFrame(polys,
  data=data.frame(x=x, y=y, z=z, id=id, row.names=row.names(polys)))
plot(tmp)    # plots all the square polygons (n = 10*10)
spplot(tmp)  # plots values of x, y, z, id in separate panels, each with 100 polys
spplot(tmp, zcol=z)  # error message about duplication of factor level
spplot(tmp ~ id, zcol=z, data=tmp)  # won't take formula

Thank you,
ScottWaichler
Pacific Northwest National Laboratory
scott.waichler _at_ pnnl.gov

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


--
Roger Bivand
Department of Economics, Norwegian School of Economics, Helleveien 30, N-
5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: roger.biv...@nhh.no
http://orcid.org/-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0J=en

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Help

2018-05-14 Thread Bede-Fazekas Ákos

Dear Soufianou,
this is just a framework. Let's say that you have a character vector 
('variables') containing the names of the environmental variables.


library(raster)
library(dismo)
library(corrplot)
library(usdm) # assumed source of vif() used below

for (variable in variables) {
    assign(variable, raster(paste0(variable, ".asc")))
}
environment <- brick(stack(mget(variables))) # collect the layers created above
environment_standardized <- data.frame(scale(x = as.data.frame(environment), center = TRUE, scale = TRUE))

correlation_matrix <- cor(environment_standardized, use = "na.or.complete")
corrplot(corr = correlation_matrix)
VIF <- vif(environment_standardized)
CN <- kappa(na.omit(environment_standardized), exact = TRUE)
# You can select variables that fulfill your criteria about correlation structure

selected_variables <- variables[c()] # subsetting: put the indices of the retained variables here

maxent(x = environment[[selected_variables]], p = presence_points)

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2018.05.14. 12:28 keltezéssel, Soufianou Abou via R-sig-Geo írta:

Dear Rolf Turner,

  I have points of presence of cowpea in Niger in CSV format; in addition to 
other variables (soil texture, soil pH, altitude), I downloaded the 19 
environmental variables from the WorldClim archives, clipped them all to the 
extent of Niger and converted them to ASCII format. The idea for me is to 
choose the best variables to include in the model.
  NB. I'm using the Maxent model, but I'm not good with R.

Merci

SADDA Abou-Soufianou

--

Doctorant

Université Dan Dicko Dankoulodo de Maradi - Niger

BP 465 120, avenue Maman Koraou - ADS

    &

Institut d’Ecologie et des Sciences de l’Environnement de Paris (iEES-Paris)

Centre IRD France Nord-(iEES Paris)-32, av. Henri Varangnat 93143 BONDY cedex.

Lien: https://ieesparis.ufr918.upmc.fr/index.php?page=fiche=378=1

  abousoufia...@gmail.com

GSM : Niger : (+227) 96-26-99-87/91-56-35-19 ; France (+ 33) 07-55-79-39-93

 Le lundi 14 mai 2018 à 12:05:40 UTC+2, Rolf Turner 
 a écrit :



Please keep your posts "on-list".  You are much more likely to get a
useful answer that way.  There are many others on the list whose
knowledge and insight are far greater than mine.

I have therefore cc-ed the list in this reply.

On 14/05/18 21:48, Soufianou Abou wrote:


Thank you for advice, Rolf Turner

My question is as follows:

I'd use maxent to model the potential distribution of cowpea on the
basis of presence-only data. Indeed, I have acquired a number of
environmental and bioclimatic variables for my study area. But to
choose the most contributive variables for the model, I would like to
run a correlation analysis of these. On this, could you explain to me
the step-by-step procedure to follow in R? I would like scripts
for: compiling and loading all environmental variables, and running the
correlation test to select the least correlated ones.

As I said before, I don't think this is the right approach, but I can't
be sure without knowing more about your data.  I find your description
to be vague.

How are your data stored?  What information do you have about the
"distribution of cowpea".  Do you have *points* where cowpea is present
or more extensive *regions* where it is present?  (And could these
regions be "considered to be points" on the scale of interest?) How are
your predictors stored?  Are the values of these predictors known at
every point of your study area?  Can you show us a bit of your data (use
the function dput() to include *a small sample* of your data in the body
of your email).

If you insist on mucking about with correlation and testing, perhaps the
function cor.test() will give you what you want.  I reiterate however
that this seems to me to be a wrong approach.

cheers,

Rolf Turner



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Spatial Downscaling in R

2018-03-15 Thread Bede-Fazekas Ákos

Dear Milu,

You can use any spatial interpolation method as a statistical downscaling 
approach. See package gstat for IDW (inverse distance weighted 
interpolation) and several types of kriging. In the case of temperature, 
you might use elevation data from a DEM or radiation data as auxiliary 
variables in those interpolation methods (e.g. regression kriging) that 
can handle auxiliary data.
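
A minimal sketch of IDW with gstat (the object names are assumptions):

library(gstat)
library(sp)
## coarse   - SpatialPointsDataFrame of the coarse-grid temperatures (column 'temp'),
##            e.g. as(coarse_raster, "SpatialPointsDataFrame")
## finegrid - SpatialPixels/SpatialGrid object defining the target (fine) resolution
downscaled <- idw(temp ~ 1, locations = coarse, newdata = finegrid, idp = 2)
spplot(downscaled["var1.pred"])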


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2018.03.14. 22:15 keltezéssel, Miluji Sb írta:

Thanks again! Looking into this right now.

Sincerely,

Milu

On Wed, Mar 14, 2018 at 9:54 PM, Michael Sumner  wrote:


Try ClimDown package, otherwise more generally raster function
disaggregate.

Cheers

On Thu, 15 Mar 2018, 06:53 Miluji Sb,  wrote:


Dear all,

Please forgive my inexperience with spatial downscaling. I am interested
in
spatial downscaling of global temperature to grid cell. Is there a package
in R that can perform this function?

Any help/guidance will be highly appreciated.

Sincerely,

Milu

 [[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


--
Dr. Michael Sumner
Software and Database Engineer
Australian Antarctic Division
203 Channel Highway

Kingston Tasmania 7050 Australia




[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Autologistic regression in R

2017-11-14 Thread Bede-Fazekas Ákos
Hi Erin,

Although I'm not familiar with autologistic regression, the task you 
outlined seems to be a Species Distribution Modeling (SDM) problem, for 
which several methods are available in R. So I'm not sure if you need 
autologistic regression exclusively. The SDM methods solve the equation 
pres = f(env1, env2, ...), where pres is the presence/absence of the 
species and the envN are environmental predictors. 
See packages "dismo" and "biomod2", and also vignette("sdm"). Some 
of the methods need a data.frame or SpatialPointsDataFrame, others 
need Raster* objects as input.
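
For example, a minimal sketch with a plain GLM as one possible SDM-type 
model (the column names are assumptions; the same structure applies to 
the methods of dismo and biomod2):

## 'pts' is assumed to be a data.frame with a 0/1 column 'pres' and
## environmental predictor columns 'env1', 'env2', ...
m <- glm(pres ~ env1 + env2, data = pts, family = binomial)
pts$fitted <- predict(m, type = "response") # predicted probability of presence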

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2017.11.15. 7:04 keltezéssel, Elias T Krainski írta:
> Hi,
>
> Have you tried http://leg.ufpr.br/Rcitrus ?
>
> Elias
>
> On 15/11/17 04:46, Mingke Li wrote:
>> Hi,
>>
>> I am new to autologistic regression and R. I do have questions when 
>> starting a project in which I believe autologistic regression (spdep 
>> package) is needed.
>>
>> I have a point layer whose attribute table stores the values of the 
>> dependent variable (population of a kind of insect), all the 
>> independent variables (environmental factors), and the associated 
>> latitude and longitude. I hope to to fit an autologistic model to 
>> analyze which factors or combinations of factors have effects on the 
>> presence/absence of the insect (1 or 0).
>>
>> I found other papers which applied autologistic regression in their 
>> study almost used a grid system and defined their window sizes. So, 
>> my question is do I have to convert my point layer into a grid system 
>> if I want to do this analysis with R?
>>
>> Also, what should I consider when I generate the grid system? How to 
>> determine a proper cell size? How about the searching window 
>> (neighbourhood) size?
>>
>> Many Thanks.
>>
>> Erin
>>
>>
>> [[alternative HTML version deleted]]
>>
>> ___
>> R-sig-Geo mailing list
>> R-sig-Geo@r-project.org
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] Error in spCbind(GA0, data) : row names not identical

2017-10-23 Thread Bede-Fazekas Ákos
Hi Marvin,

It would be better to post more information on the error, which would 
help us to help you. And while you gather this additional information, 
you might find the solution to your problem yourself. I think running
row.names(GA0)
row.names(data)
all.equal(row.names(GA0), row.names(data))
row.names(GA0)[row.names(GA0) != row.names(data)]
row.names(data)[row.names(GA0) != row.names(data)]
may help you to start searching for the cause of the error. (Which is 
simply connected to the spelling of "De Kalb", and has nothing to do with 
spatial data or R...)
 > row.names(GA0)[row.names(GA0) != row.names(data)]
[1] "de kalb" "decatur"
 > row.names(data)[row.names(GA0) != row.names(data)]
[1] "decatur" "dekalb"

Have a nice day,
Ákos


2017.10.22. 14:48 keltezéssel, Marvin Sharma írta:
> HI,
> Could some one help  this newbie in fixing the spCbind problem 
> encountered below?
> >names(data)
>  "county"    "y"        "x"         "Latitude"  "Longitude"
>
> GA <- map("county", "georgia", plot=FALSE, fill=TRUE)
> cntys <- tolower(as.character(data$county))
> GA$names <- substring(GA$names, 9, nchar(GA$names))
> match(cntys, GA$names)##It does  match!
> ###
> GA <- map2SpatialPolygons(GA, GA$names)
> proj4string(GA) <- CRS("+proj=longlat")
> row.names(data) <- cntys
> GA0 <- SpatialPolygonsDataFrame(GA, 
> data=data.frame(ID=1:159,row.names=row.names(GA)))
> GA1 <- spCbind(GA0, data)
> > GA1 <- spCbind(GA0, data)
> Error in spCbind(GA0, data) : row names not identical
> >
> Thanks n advance,
>
> Marvin


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

[R-sig-Geo] Fwd: Re: spplot help, please

2017-10-22 Thread Bede-Fazekas Ákos
Marvin Sharma's e-mail, accidentally addressed only to me, is forwarded 
below.
Ákos


 Forwarded message 
Subject:  Re: [R-sig-Geo] spplot help, please
Date:  Sat, 21 Oct 2017 13:57:04 + (UTC)
From: Marvin Sharma <marvinsha...@yahoo.com>
Reply-To:  Marvin Sharma <marvinsha...@yahoo.com>
To:    Bede-Fazekas Ákos <bfalevl...@gmail.com>



Dear Bede-Fazekas, Edzer, and Florian, and the list,

The following code produces the plot without the state map of Georgia. I 
got some help from an old thread answered a few years ago, but could not 
match the state names and county names; see the code below.

library(spgwr)
library(spdep)
library(maps)
library(maptools)
data<-read.csv("C:/Users/md_sh/Desktop/try.csv",header=TRUE)
map = SpatialPointsDataFrame(data=data, 
coords=cbind(data$Longitude,data$Latitude))
GA.adapt.gauss <- gwr.sel(y~x,data=map,adapt=TRUE)
res.adpt <- gwr(y~x,data=map,adapt=GA.adapt.gauss)
res.adpt$SDF$ols.e <- residuals(lm(y~x,data=map))
spplot(res.adpt$SDF, c("ols.e", "gwr.e"),main="Residuals from OLS and GWR")
#This provides plot without the state map! sorry!
GA <- map("county", "georgia", plot=FALSE, fill=TRUE)
cntys <- tolower(as.character(data$county))
GA$names <- substring(GA$names, 14, nchar(GA$names))
match(cntys, GA$names)##It does not match!

Thanks much,
Marvin



On Saturday, October 21, 2017 9:39 AM, Marvin Sharma 
<marvinsha...@yahoo.com> wrote:


Hello Bede-Fazekas,
This code provides the residuals and the residual plot, but without the 
state map. I want the residuals within the Georgia state map.

Could you see where I need to make change?

Marvin


On Saturday, October 21, 2017 1:53 AM, Bede-Fazekas Ákos 
<bfalevl...@gmail.com> wrote:


Hi Marvin,

this code works fine, just as it should. Which "state map" do you need?
A basemap with terrain/roads/cities/borders? In this case there are
several possibilities to do that. Have a look at these links, they may
solve your problem:
https://pakillo.github.io/R-GIS-tutorial/#gmap
http://www.nickeubank.com/wp-content/uploads/2015/10/RGIS3_MakingMaps_part1_mappingVectorData.html
 

(Section "4.2 Basemaps with spplot")

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2017.10.21. 3:29 keltezéssel, Marvin Sharma via R-sig-Geo írta:
 > Hi,
 > I wanted to plot the spplot of my model residuals. The plot works
 > fine, but the state map does not appear. Code is given below and the
 > data are attached. I would greatly appreciate any help in this.
 >
 > Marvin
 >
 >
 > library(spgwr)
 > library(spdep)
 > library(maps)
 > library(maptools)
 > data<-read.csv("try.csv",header=TRUE)
 > coords=cbind(data$Longitude,data$Latitude)
 > g.adapt.gauss <- gwr.sel(y~x, data,coords,adapt=TRUE)
 > res.adpt <- gwr(y~x, data,coords, adapt=g.adapt.gauss)
 > brks <- c(-0.25, 0, 0.01, 0.025, 0.075)
 > cols <- grey(5:2/6)
 > res.adpt$SDF$ols.e <- residuals(lm(y~x, data))
 > spplot(res.adpt$SDF, c("ols.e","gwr.e"),main="Residuals")
 >
 >
 >
 > ___
 > R-sig-Geo mailing list
 > R-sig-Geo@r-project.org <mailto:R-sig-Geo@r-project.org>
 > https://stat.ethz.ch/mailman/listinfo/r-sig-geo


     [[alternative HTML version deleted]]


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org <mailto:R-sig-Geo@r-project.org>
https://stat.ethz.ch/mailman/listinfo/r-sig-geo





[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] spplot help, please

2017-10-20 Thread Bede-Fazekas Ákos
Hi Marvin,

this code works fine, just as it should. Which "state map" do you need? 
A basemap with terrain/roads/cities/borders? In this case there are 
several possibilities to do that. Have a look at these links, they may 
solve your problem:
https://pakillo.github.io/R-GIS-tutorial/#gmap
http://www.nickeubank.com/wp-content/uploads/2015/10/RGIS3_MakingMaps_part1_mappingVectorData.html
 
(Section "4.2 Basemaps with spplot")

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2017.10.21. 3:29 keltezéssel, Marvin Sharma via R-sig-Geo írta:
> Hi,
> I wanted to plot the spplot of my model residuals. The plot works 
> fine, but the state map does not appear. Code is given below and the 
> data are attached. I would greatly appreciate any help in this.
>
> Marvin
>
>
> library(spgwr)
> library(spdep)
> library(maps)
> library(maptools)
> data<-read.csv("try.csv",header=TRUE)
> coords=cbind(data$Longitude,data$Latitude)
> g.adapt.gauss <- gwr.sel(y~x, data,coords,adapt=TRUE)
> res.adpt <- gwr(y~x, data,coords, adapt=g.adapt.gauss)
> brks <- c(-0.25, 0, 0.01, 0.025, 0.075)
> cols <- grey(5:2/6)
> res.adpt$SDF$ols.e <- residuals(lm(y~x, data))
> spplot(res.adpt$SDF, c("ols.e","gwr.e"),main="Residuals")
>
>
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] How to draw the same legend (one legend) for the multiple spatial figures?

2017-08-06 Thread Bede-Fazekas Ákos

Dear Brandon and Wei,
I don't know the answer to your question, but a somewhat similar solution 
is to plot a separate legend using lattice::draw.key() and to disable 
sp::spplot()'s built-in legend with the argument "auto.key = FALSE".

An example:

# loading libraries
library(lattice)
library(sp)
library(gridExtra)
library(grid)

# drawing legend
no_of_categories <- 10
cutpoints <- seq(from = 0 - 0.0001, to = 1 + 0.0001, length.out = 
no_of_categories + 1)
legend_text <- apply(X = cbind(round(cutpoints [1:no_of_categories], 1), 
round(cutpoints [2:(no_of_categories + 1)], 1)), MARGIN = 1, FUN = 
function(number) {return(paste(format(x = number, digits = 2), collapse 
= " - "))})
legend_colors <- colorRampPalette(c("red", "orange", "yellow", 
"lightgreen", "darkgreen"))(no_of_categories)
legend <- draw.key(key = list(reverse.rows = TRUE, space = "right", 
rectangles = list(col = legend_colors, border = FALSE), text = 
list(legend_text)), draw = FALSE)

dev.off()

# drawing maps
maps <- list()
for (column_name in colnames(sp_object@data)) {
map <- spplot(obj = sp_object, zcol = column_name, auto.key = 
FALSE, col.regions = legend_colors, cuts = cutpoints)

maps <- append(maps , list(map))
}
maps <- append(maps , list(legend))

# plotting the map-legend composite to a png file
layout <- rbind(c(1,2), c(3,4), c(5,6)) # let's say we have 5 maps and a 
legend
composite <- do.call(arrangeGrob, c(maps, list(layout_matrix = layout), 
list(widths = c(1, 1))))

png(width = 1000, height = 1000, filename = "map.png")
grid.newpage()
grid.draw(composite)
dev.off()

Hope this helps,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2017.08.06. 11:53 keltezéssel, Brandon Payne írta:

How to draw the same legend (one legend) for the
   multiple spatial figures?

I would put the legend next to the upper-most plot and
wrap the whole thing in a single figure,
so that it would be more obvious that the same legend applies to all.

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] modélisation de la distribution spatiale du niébé

2017-04-28 Thread Bede-Fazekas Ákos
Dear Sadda,

you wrote that "I wanted to use the Maxent software to do it, but I 
understood from the literature that R is nowadays widely used"; however, 
you can use MaxEnt from within R:
install.packages("dismo")
library(dismo)
m <- maxent(x = tableau_des_prédicteurs_environnementaux, p = 
points_de_présence)

You first have to download the MaxEnt program from here:
http://biodiversityinformatics.amnh.org/open_source/maxent/

Hope this helps,
Ákos Bede-Fazekas
Hungarian Academy of Sciences


2017.04.28. 15:38 keltezéssel, Soufianou Abou via R-sig-Geo írta:
> Good evening all, I would like to model the spatial distribution of the 
> ecological niche of cowpea. I have a data set (environmental and 
> bioclimatic variables) and presence data for the cowpea varieties. I 
> wanted to use the Maxent software to do it, but I understood from the 
> literature that R is nowadays widely used. It is with this in mind that 
> I would like to work on it. In fact, I need help understanding the 
> procedures to carry out this modelling with R.
> 1. I would like to run a Shapiro-Wilk normality test, whose null 
> hypothesis is that the data follow a normal distribution and whose 
> alternative hypothesis implies an asymmetric or non-normal distribution, 
> to conclude whether the observations are strongly skewed.
> 2. I would like to apply a logarithmic transformation of the response 
> data in order to reduce the skewness of the data and thus improve the 
> power of the statistical tests.
> 3. I would like to run a canonical redundancy analysis on the whole set 
> of data tables.
> 4. I would like to run a bidirectional stepwise selection in order to 
> choose the relevant variables according to the Akaike information 
> criterion (AIC) and the p-value.
> 5. I would like to run a forward stepwise selection, based on the 
> adjusted R-squared, in order to compare the result of the selection.
> 6. Finally, the procedure to follow to model the distribution of the 
> varieties.
> Thank you, best regards.
> SADDA Abou-Soufianou -- 
> Doctorant, Université Dan Dicko Dankoulodo de Maradi, 120 avenue Maman 
> Koraou, ADS, Maradi, BP 465 Maradi/Niger. E-mail: a.soufianou@yahoo.fr GSM : 
> (+227)96269987
>   [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] Problem with operations on large files ("cannot allocate vector of size xx")

2017-02-01 Thread Bede-Fazekas Ákos

Dear Kamil,

Have you tried this?:
memory.limit(10000) # increase the memory limit of R from the default 
4006 MB to 10000 MB

See details:
https://stat.ethz.ch/R-manual/R-devel/library/base/html/Memory-limits.html

I have 64-bit Windows with 64-bit R, but without memory.limit() I get a 
similar error when working with large data.


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2017.02.01. 13:19 keltezéssel, Kamil Konowalik írta:

Hi All,
I have a problem with operations on large datasets. I have a several raster 
files (43) and I would like to make a PCA on them.
I'm using those simple commands:

files <- list.files("G:/GIS/raster_TIFF",pattern='tif',full.names=TRUE)
s <- raster::stack(files)
pca<-princomp(na.omit(values(s)), cor=TRUE)

But after the last one I'm getting an error: "cannot allocate vector of size 6.6 
Gb".
I searched the internet and there are some advices but I couldn't found anything helpful. 
I'm using Windows 7 64 bit with 8 GB of RAM so it is not an issue with 32 bit. Also I 
would prefer to find a solution which can use my current resources rather than buying 
additional RAM. I thought that maybe "bigmemory" package will be a solution but 
I couldn't make it work. I tried also different file types (ASCII, RASTER, GTiff) but 
that doesn't help neither.
It should be a simple and a common problem but I can't figure answer on my own 
so if you have any working solution I would appreciate it.
Thanks in advance!
Best regards,
Kamil

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Add layers to a lattice plot via a for loop

2016-11-01 Thread Bede-Fazekas Ákos

Dear David,

since you have not provided a reproducible script and dataset, I can't 
check whether my suggestion is connected to your problem. Maybe not... 
Anyway, lattice plotting functions - such as spplot() and its relatives 
- behave differently inside for loops, functions, and tryCatch() blocks 
than when called line by line. Using print(spplot()) instead of spplot() 
can solve the problem.
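
A minimal sketch of how the loop could look (the data argument of 
latticeExtra::layer() is used here to store the current value of k with 
each layer, since layer() evaluates its expression lazily; this is an 
assumption in addition to the print() advice, with the object names 
taken from your post):

library(latticeExtra)
p <- spplot(coast, xlim = xlim, ylim = ylim, col.regions = "darkgray", colorkey = FALSE)
for (k in seq_along(tracks)) {
    p <- p + layer(sp.points(tracks[[k]], pch = 20, col = k), data = list(k = k))
}
print(p) # print() is needed when plotting from a loop or function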


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2016.11.01. 21:54 keltezéssel, David Wang írta:

Hello,


I need to overlay a set of tracks (SpatialPoints objects) on a map and thought 
I would add them one by one to a lattice plot of coastline. However, when I did


p <- spplot(coast, xlim = xlim, ylim = ylim, col.regions = "darkgray", colorkey 
= FALSE)
for (k in seq_along(tracks))
   p <- p + layer(sp.points(tracks[[k]], pch = 20, col = k))
p

(where coast is the coastline as a SpatialLines object, and tracks is a list of 
SpatialPoints objects), p only shows the coastline and the last track. What's 
perplexing to me is that when I explicitly added layers without using a for 
loop, the result trellis object p does contain all four tracks in their 
specific colors:




p <- spplot(coast, xlim = xlim, ylim = ylim, col.regions = "darkgray", colorkey 
= FALSE)
p <- p + layer(sp.points(tracks[[1]], pch = 20, col = 1))
p <- p + layer(sp.points(tracks[[2]], pch = 20, col = 2))
p <- p + layer(sp.points(tracks[[3]], pch = 20, col = 3))
p <- p + layer(sp.points(tracks[[4]], pch = 20, col = 4))
p

Does anyone happen to have a clue why the for loop failed to overlay layers?

Thanks,
David

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] Kriging with uncertain data

2016-10-07 Thread Bede-Fazekas Ákos

Dear Santiago,

in this case I would interpolate/krige the uncertainty as well. Since 
the uncertainty might have a different distribution, different covariates, 
and different spatial autocorrelation than the measured value, I would 
build a new kriging model (fit a new semivariogram, etc.) and interpret 
the predicted values of krige() as the predicted measurement uncertainty. 
You then have two uncertainty maps: one is the interpolation uncertainty 
(the variance from the result of krige(measurement)) and the other is the 
interpolated measurement uncertainty (the predicted values from the 
result of krige(uncertainty)). 
Afterwards, these two can be combined or used separately as well.
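
A minimal sketch with gstat (assuming 'obs' is a SpatialPointsDataFrame 
with columns 'value' and 'unc', and 'grid' is the prediction grid):

library(gstat)
v_val <- fit.variogram(variogram(value ~ 1, obs), vgm("Sph"))
k_val <- krige(value ~ 1, obs, grid, model = v_val) # var1.pred + var1.var (interpolation uncertainty)
v_unc <- fit.variogram(variogram(unc ~ 1, obs), vgm("Sph"))
k_unc <- krige(unc ~ 1, obs, grid, model = v_unc)   # var1.pred = interpolated measurement uncertainty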


Have a nice day,
Ákos

2016.10.07. 10:27 keltezéssel, Santiago Beguería írta:

Dear Ákos,

I was referring to the former: I have data with two values at each location: 
measured value and uncertainty of the measurement. So, each observation is in 
fact a statistical variate, which we can assume is Gaussian distributed. Hence, 
my two values are the expected (mean) and the variance of the distribution.

Cheers,

Stg



El 7 oct 2016, a las 8:39, Bede-Fazekas Ákos <bfalevl...@gmail.com> escribió:

Dear Santiago,

you mean you have two values at each location (observed value and uncertainty)? Or 
do you have an observed value that is the sum of the real value and the observation 
error (uncertainty)? If the latter, then I think using the gstat::krige() function is 
straightforward, since the result of the function contains the variance of the 
prediction ("Attributes columns contain prediction and
prediction variance"; https://cran.r-project.org/web/packages/gstat/gstat.pdf).

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences



2016.10.06. 11:52 keltezéssel, Santiago Beguería írta:

Dear R-sig-geo list members,

I am curious about what are sensible approaches to spatial interpolation, most 
especially by using kriging, in the context of uncertain data.

Suppose one has a dataset of values observed at different locations, and each 
value consists on the expected value and its variance. Variance here represents 
the uncertainty related to the observation, and shows spatial variation due to 
external factors, for instance the geological setting affecting the quality of 
the measurement.

How would you proceed to model the spatial distribution of this variable, 
including propagation of the (spatially varying)?

I suppose one approach could be by simulation, but are there other ways of 
propagating the uncertainty that do not involve potentially expensive (in 
computation time) simulation approaches?

Cheers,

Santiago Beguería
CSIC
Spain

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] Kriging with uncertain data

2016-10-07 Thread Bede-Fazekas Ákos

Dear Santiago,

you mean you have two values at each location (observed value and 
uncertainty)? Or do you have an observed value that is the sum of the real 
value and the observation error (uncertainty)? If the latter, then I think 
using the gstat::krige() function is straightforward, since the result 
of the function contains the variance of the prediction ("Attributes 
columns contain prediction and
prediction variance"; 
https://cran.r-project.org/web/packages/gstat/gstat.pdf).


HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences



2016.10.06. 11:52 keltezéssel, Santiago Beguería írta:

Dear R-sig-geo list members,

I am curious about what are sensible approaches to spatial interpolation, most 
especially by using kriging, in the context of uncertain data.

Suppose one has a dataset of values observed at different locations, and each 
value consists on the expected value and its variance. Variance here represents 
the uncertainty related to the observation, and shows spatial variation due to 
external factors, for instance the geological setting affecting the quality of 
the measurement.

How would you proceed to model the spatial distribution of this variable, 
including propagation of the (spatially varying)?

I suppose one approach could be by simulation, but are there other ways of 
propagating the uncertainty that do not involve potentially expensive (in 
computation time) simulation approaches?

Cheers,

Santiago Beguería
CSIC
Spain

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] Create circular polygon of certain radius in R

2016-06-08 Thread Bede-Fazekas Ákos

Hi Giacomo,

you should use a projected (planar) coordinate reference system. Select 
a suitable projected CRS, transform your data to it with spTransform() 
from the sp package, and then everything will work fine (see the sketch 
after the links below).

Some links:
https://en.wikipedia.org/wiki/Map_projection#Azimuthal_.28projections_onto_a_plane.29
http://spatialreference.org/ref/epsg/
http://www.inside-r.org/packages/cran/rgdal/docs/spTransform
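
A minimal sketch with the objects from your mail (the azimuthal 
equidistant projection centred on the selected cell is only one possible 
choice; any projected CRS with metre units will do):

library(sp)
library(rgdal)
library(rgeos)
coords    <- xyFromCell(altdata, 12703)
coords_sp <- SpatialPoints(coords, CRS(proj4string(altdata)))
aeqd      <- CRS(sprintf("+proj=aeqd +lat_0=%f +lon_0=%f +datum=WGS84 +units=m",
                         coords[2], coords[1]))
circle    <- gBuffer(spTransform(coords_sp, aeqd), width = 300)  # 300 m radius
circle_ll <- spTransform(circle, CRS(proj4string(altdata)))      # back to long/lat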

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2016.06.08. 13:26 keltezéssel, Giacomo May írta:

Hi,
I have a raster, which is projected into "+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0". Now I would 
like to create a circular Polygon with a radius of 300 meters around a cell of my choice. I know about the function 
"spCircle()" in the package "sampSurf" and I heard that this should somehow be possibleusing the 
"gBuffer"-function from  the "rgeos"-package, but I don't know exactly how to use it.
I have tried achieving this with the following piece of code (altdata is the 
raster I am using):

coords <- xyFromCell(altdata,12703)
proj <- altdata@crs@projargs
coords_sp <- SpatialPoints(coords,CRS(proj))
polygon <- gBuffer(coords_sp,width=300)

But this returns the following warning message:

Warning message:

In gBuffer(coords_sp, width = 300) :
   Spatial object is not projected; GEOS expects planar coordinates

If anyone could help me with this I would be deeply grateful.
Best regards,
Giacomo May

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] R on Mac building up memory usage

2016-05-27 Thread Bede-Fazekas Ákos
Hello Didier,
only restarting R can really free up memory. This is a typical issue on 
some operating systems.
"Why is R apparently not releasing memory? [...] This is an artifact of 
the way the operating system (OS) allocates memory. In general it is 
common that the OS is not capable of releasing all unused memory. In 
extreme cases it is possible that even if R frees almost all its memory, 
the OS can not release any of it due to its design and thus tools such 
as ps or top will report substantial amount of resident RAM used by the 
R process even though R has released all that memory. [...] The short 
answer is that this is a limitation of the memory allocator in the 
operating system and there is nothing R can do about it."
See details here:
http://www.hep.by/gnu/r-patched/r-faq/R-FAQ_93.html
Best wishes,
Ákos

2016.05.27. 20:20 keltezéssel, Roger Bivand írta:
> On Fri, 27 May 2016, Dr Didier G. Leibovici wrote:
>
>> Hi,
>>
>> I guess this may be not a specificity of r-sig-geo but as I am using
>> library(rgdal)
>> library(rgeos)
>>
>> in this script so it may be the reason? (perhaps I should try running
>> something else to check).
>>
>
> Hi Didier,
>
> Trying a bare bones script may be sensible. There shouldn't be 
> anything in those packages that creates these effects as such. How 
> many cores are running R simultaneously? There shouldn't be anything 
> OSX-specific either, though memory management varies across platforms 
> (there was a recent discussion on R-devel about this).
>
> So if you could share such a bare-bones script and simulated data 
> setup, others might be able to contribute.
>
> Best wishes,
>
> Roger
>
>>
>> So basically I am running some code reading  61288 features and other
>> things ... if I run it once I got in gc():
>> > gc()
>>   used (Mb) gc trigger  (Mb) max used  (Mb)
>> Ncells 1833926 98.0 5103933 272.6  9968622 532.4
>> Vcells 2437534 18.6 7056348  53.9 11036325  84.3
>>
>> and on the monitor it says R is using 3.3Go.
>>
>> Then I remove everything rm(list=ls()) and run it again trying different
>> sets of parameters for example.
>> Second run similar gc() but R is using 6.4Go
>> > gc()
>>   used (Mb) gc trigger  (Mb) max used  (Mb)
>> Ncells 1834325 98.0 6323353 337.8  9968622 532.4
>> Vcells 2439267 18.7 7572947  57.8 11832730  90.3
>>
>>
>> After a while and few other computation I get R is using 10Go
>> and gc() gives
>> > gc()
>>   used (Mb) gc trigger  (Mb)  max used  (Mb)
>> Ncells 1863272 99.6 5944937 317.5   9968622 532.4
>> Vcells 2503503 19.2 8462995  64.6 100858653 769.5
>> rm(list=ls())
>> > ls()
>> character(0)
>> >gc()
>>  used (Mb) gc trigger  (Mb)  max used  (Mb)
>> Ncells 608451 32.5 4755949 254.0   9968622 532.4
>> Vcells 760517  5.9 6770396  51.7 100858653 769.5
>>
>> but  still 10Go for R in the monitor.
>> I had experienced a building up to 50Go then my system tells me to close
>> some apps, all that doing the same running one set, then rm(list=ls())
>> ... So at the moment I just have to close and re-start R?
>>
>> Didier
>>
>>
>


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: [R-sig-Geo] extracting centroids from map data

2016-05-19 Thread Bede-Fazekas Ákos

Hello Andy,
you'll need these:

library(rgeos)
centroids <- gCentroid(states, byid=TRUE, id = NULL)

after you convert your data.frame to SpatialPolygons or 
SpatialPolygonsDataFrame (package sp).
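
A minimal sketch (building the polygons with maps/maptools instead of 
map_data(), and using the region list and the ggplot object p from your 
code):

library(maps)
library(maptools)
library(rgeos)
m <- map("county", regions = c("iowa", "illinois", "indiana", "michigan",
                               "minnesota", "ohio", "wisconsin"),
         fill = TRUE, plot = FALSE)
polys <- map2SpatialPolygons(m, IDs = m$names,
                             proj4string = CRS("+proj=longlat +datum=WGS84"))
centroids <- as.data.frame(coordinates(gCentroid(polys, byid = TRUE)))
p + geom_point(data = centroids, aes(x = x, y = y), size = 0.5)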

HTH,
Ákos

2016.05.20. 5:41 keltezéssel, Andy Bunn írta:

Hello all, I'm working on a map. I'd like to extract centroids of the counties 
below in order to plot them as points. How can I do this?

Many thanks, Andy



library(ggplot2)
library(maps)
all_states <- map_data("county")
states <- subset(all_states, region %in% c("iowa",  "illinois", "indiana",
"michigan", "minnesota","ohio",
"wisconsin"))
p <- ggplot()
p <- p + geom_polygon( data=states, aes(x=long, y=lat, group = group),
color="grey10", fill="white" )
p <- p + theme(aspect.ratio=1)
p


[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] rgdal 1.1-3 compilation: error of ogrsource.cpp

2016-01-31 Thread Bede-Fazekas Ákos

Thank you, Roger!
It is now solved. Below I just provide information about this issue and 
how it was solved, for those who are interested.

Best wishes,
Ákos Bede-Fazekas

Only two ogr_api.h files were on my hard drive, and they were 
exactly the same. One was in the directory that GDAL was compiled 
from, the other was in

/usr/local/include
Only one line mentioned GIntBig:
GIntBig CPL_DLL OGR_L_GetFeaturesRead( OGRLayerH );
Header of ogr_api.h was:
"$Id: ogr_api.h 28008 2014-11-26 12:41:43Z rouault $"

Deleting (backing up) ogr_api.h, downloading the newest GDAL source, 
compiling it, running ldconfig (because of "libgdal.so.20: cannot open 
shared object file: No such file or directory" during installation of the 
rgdal package) and installing rgdal from R solved the problem:


sudo mv /usr/local/include/ogr_api.h /usr/local/include/ogr_api.h.bak
wget -A.gz http://download.osgeo.org/gdal/2.0.2/gdal-2.0.2.tar.gz
tar xzf gdal-2.0.2.tar.gz
cd gdal-2.0.2
./configure
sudo make
sudo make install
sudo ldconfig
sudo R
install.packages("rgdal")

2016.01.31. 13:59 keltezéssel, Roger Bivand írta:

On Sun, 31 Jan 2016, Bede-Fazekas Ákos wrote:


Dear Roger,
it is quite probable that some GDAL versions were mixed. It was a long 
time ago that I installed GDAL, but as far as I remember it was 
compiled from source. I would be very happy if I could uninstall it 
and start again from scratch, but I don't know how to do it. "make 
uninstall" fails, reinstalling also fails, and I'm not aware of any other way.


Looking more closely at your report, I see:


>  ogr2ogr --version
>  GDAL 2.0.0dev, released 2014/04/16


2.0.0 was released in June 2015 (and 2.0.2 has just been released). 
You very likely have installed headers that date from before the 
revisions introducing handling of 64-bit integers, but where the 
version of GDAL declares itself to be 2.0.0. I suggest you use locate 
from the shell to find any ogr_api.h files on your system. You could use


gdal-config --cflags

but this could be misleading if you have multiple gdal-config 
installed. Find out whether any ogr_api.h contains references to 
GIntBig, and delete all prior to that. It may be safest to locate all 
the *gdal*, *cpl* and *ogr* components and remove them, running make 
install again from a fresh source build of GDAL 2.0.2.


Hope this helps,

Roger



Thanks for any help,
Ákos

2016.01.30. 17:33 keltezéssel, Roger Bivand írta:

 On Sat, 30 Jan 2016, Bede-Fazekas Ákos wrote:

>  Dear List,
>  please let me know what I'm doing wrong. Installing package rgdal 
1.1-3 >  with install.packages("rgdal") fails on Ubuntu 14 (gdal 
version: 2.0.0) >  with this error message during the compilation:
>  *ogrsource.cpp:413:10: error: ‘OFTInteger64’ was not declared in 
this >  scope.*
> >  The gdal-version info, verbose output of package installation 
and >  sessionInfo() is provided after my signature.


 Thanks for a clear report. The GDAL version is declared as 2.0.0. The
 released 2.0.* implement RFC 31:

 (https://trac.osgeo.org/gdal/wiki/rfc31_ogr_64)

 using long integers for FIDs and allowing them as field values. 
Since the

 GDAL you are using declares itself as 2.*, the code forks to take
 advantage of the new capability.

 How did you install GDAL? Is it possible that the version you have is
 pre-release? The development trunk was named 2.0.0 for quite a long 
time,
 but implementations of changes happened at different times along 
the way?
 Is it possible that you have mixed (multiple) GDAL installations, 
with 1.*

 headers and 2.0 gdal-config?

 Roger

>  However the R version is the latest (3.2.3), installation of the 
package >  failed with similar error message on my previously used R 
version >  (3.0.2), and therefore I think that the problem is 
nothing to do with R.
>  Removing and reinstalling libgdal1-dev, libproj-dev, gdal-bin 
does not >  solve the problem. Running R as root user produce 
similar error.

> >  Any help is gratefully appreciated,
>  Ákos Bede-Fazekas
>  Hungarian Academy of Sciences
> > >  
>  gdal-config --version
>  2.0.0
> >  ogr2ogr --version
>  GDAL 2.0.0dev, released 2014/04/16
> >  
> configure:   CC: gcc -std=gnu99
> configure:   CXX: g++
> configure:   rgdal: 1.1-3
>  checking for /usr/bin/svnversion... yes
>  configure:  svn revision: 594
>  checking for gdal-config... /usr/local/bin/gdal-config
>  checking gdal-config usability... yes
>  configure:  GDAL: 2.0.0
>  checking GDAL version >= 1.6.3... yes
>  configure:  experimental conditional use of GDAL2
>  checking for gcc... gcc -std=gnu99
>  checking whether the C compiler works... yes
>  checking for C compiler default output file name... a.out
>  checking for suffix of executables...
>  checking whether we are cross compiling... n

Re: [R-sig-Geo] rgdal 1.1-3 compilation: error of ogrsource.cpp

2016-01-31 Thread Bede-Fazekas Ákos

Dear Roger,
it is quite probable that some GDAL versions were mixed. It was a long 
time ago that I installed GDAL, but as far as I remember it was compiled 
from source. I would be very happy if I could uninstall it and start 
again from scratch, but I don't know how to do it. "make uninstall" 
fails, reinstalling also fails, and I'm not aware of any other way.

Thanks for any help,
Ákos

2016.01.30. 17:33 keltezéssel, Roger Bivand írta:

On Sat, 30 Jan 2016, Bede-Fazekas Ákos wrote:


Dear List,
please let me know what I'm doing wrong. Installing package rgdal 
1.1-3 with install.packages("rgdal") fails on Ubuntu 14 (gdal 
version: 2.0.0) with this error message during the compilation:
*ogrsource.cpp:413:10: error: ‘OFTInteger64’ was not declared in this 
scope.*


The gdal-version info, verbose output of package installation and 
sessionInfo() is provided after my signature.


Thanks for a clear report. The GDAL version is declared as 2.0.0. The 
released 2.0.* implement RFC 31:


(https://trac.osgeo.org/gdal/wiki/rfc31_ogr_64)

using long integers for FIDs and allowing them as field values. Since 
the GDAL you are using declares itself as 2.*, the code forks to take 
advantage of the new capability.


How did you install GDAL? Is it possible that the version you have is 
pre-release? The development trunk was named 2.0.0 for quite a long 
time, but implementations of changes happened at different times along 
the way? Is it possible that you have mixed (multiple) GDAL 
installations, with 1.* headers and 2.0 gdal-config?


Roger

However the R version is the latest (3.2.3), installation of the 
package failed with similar error message on my previously used R 
version (3.0.2), and therefore I think that the problem is nothing to 
do with R.
Removing and reinstalling libgdal1-dev, libproj-dev, gdal-bin does 
not solve the problem. Running R as root user produce similar error.


Any help is gratefully appreciated,
Ákos Bede-Fazekas
Hungarian Academy of Sciences



gdal-config --version
2.0.0

ogr2ogr --version
GDAL 2.0.0dev, released 2014/04/16


configure:  CC: gcc -std=gnu99
configure:  CXX: g++
configure:  rgdal: 1.1-3
checking for /usr/bin/svnversion... yes
configure:  svn revision: 594
checking for gdal-config... /usr/local/bin/gdal-config
checking gdal-config usability... yes
configure:  GDAL: 2.0.0
checking GDAL version >= 1.6.3... yes
configure:  experimental conditional use of GDAL2
checking for gcc... gcc -std=gnu99
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc -std=gnu99 accepts -g... yes
checking for gcc -std=gnu99 option to accept ISO C89... none needed
checking how to run the C preprocessor... gcc -std=gnu99 -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking gdal.h usability... yes
checking gdal.h presence... yes
checking for gdal.h... yes
checking gdal: linking with --libs only... yes
checking GDAL: /usr/local/share/gdal/pcs.csv readable... yes
checking proj_api.h usability... yes
checking proj_api.h presence... yes
checking for proj_api.h... yes
checking for pj_init_plus in -lproj... yes
configure:  PROJ.4 version: 4.8.0
checking PROJ.4: epsg found and readable... yes
checking PROJ.4: conus found and readable... yes
configure:  Package CPP flags:  -I/usr/local/include
configure:  Package LIBS:  -L/usr/local/lib -lgdal -lproj
configure:  creating ./config.status
config.status: creating src/Makevars
** libs
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c OGR_write.cpp -o 
OGR_write.o
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c gdal-bindings.cpp 
-o gdal-bindings.o
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g  -c init.c -o init.o
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"

[R-sig-Geo] rgdal 1.1-3 compilation: error of ogrsource.cpp

2016-01-30 Thread Bede-Fazekas Ákos

Dear List,
please let me know what I'm doing wrong. Installing package rgdal 1.1-3 
with install.packages("rgdal") fails on Ubuntu 14 (GDAL version: 2.0.0) 
with this error message during compilation:
*ogrsource.cpp:413:10: error: ‘OFTInteger64’ was not declared in this 
scope.*


The GDAL version info, the verbose output of the package installation and 
sessionInfo() are provided after my signature.
Although my R version is the latest (3.2.3), installation of the package 
also failed with a similar error message on my previously used R version 
(3.0.2), so I think the problem has nothing to do with R.
Removing and reinstalling libgdal1-dev, libproj-dev and gdal-bin does not 
solve the problem. Running R as root produces a similar error.


Any help is gratefully appreciated,
Ákos Bede-Fazekas
Hungarian Academy of Sciences



gdal-config --version
2.0.0

ogr2ogr --version
GDAL 2.0.0dev, released 2014/04/16


configure: CC: gcc -std=gnu99
configure: CXX: g++
configure: rgdal: 1.1-3
checking for /usr/bin/svnversion... yes
configure: svn revision: 594
checking for gdal-config... /usr/local/bin/gdal-config
checking gdal-config usability... yes
configure: GDAL: 2.0.0
checking GDAL version >= 1.6.3... yes
configure: experimental conditional use of GDAL2
checking for gcc... gcc -std=gnu99
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc -std=gnu99 accepts -g... yes
checking for gcc -std=gnu99 option to accept ISO C89... none needed
checking how to run the C preprocessor... gcc -std=gnu99 -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking gdal.h usability... yes
checking gdal.h presence... yes
checking for gdal.h... yes
checking gdal: linking with --libs only... yes
checking GDAL: /usr/local/share/gdal/pcs.csv readable... yes
checking proj_api.h usability... yes
checking proj_api.h presence... yes
checking for proj_api.h... yes
checking for pj_init_plus in -lproj... yes
configure: PROJ.4 version: 4.8.0
checking PROJ.4: epsg found and readable... yes
checking PROJ.4: conus found and readable... yes
configure: Package CPP flags:  -I/usr/local/include
configure: Package LIBS:  -L/usr/local/lib -lgdal -lproj
configure: creating ./config.status
config.status: creating src/Makevars
** libs
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c OGR_write.cpp -o 
OGR_write.o
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c gdal-bindings.cpp -o 
gdal-bindings.o
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g  -c init.c -o init.o
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g  -c local_stubs.c -o 
local_stubs.o
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c ogr_geom.cpp -o ogr_geom.o
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g  -c ogr_polygons.c -o 
ogr_polygons.o
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c ogr_proj.cpp -o ogr_proj.o
g++ -std=c++11 -I/usr/share/R/include -DNDEBUG -I/usr/local/include 
-I"/usr/local/lib/R/site-library/sp/include"   -fpic  -g -O2 
-fstack-protector --param=ssp-buffer-size=4 -Wformat 
-Werror=format-security -D_FORTIFY_SOURCE=2 -g -c ogrdrivers.cpp -o 
ogrdrivers.o
g++ 

Re: [R-sig-Geo] Regarding installation of R package rgdal

2016-01-13 Thread Bede-Fazekas Ákos

Dear Subhash,

Is GDAL installed on your system?
Before installing the rgdal package, you should download and install GDAL first, e.g. from this link:
software.opensuse.org/ymp/openSUSE:Leap:42.1/standard/gdal.ymp?base=openSUSE%3ALeap%3A42.1=gdal

Or, if this is not the best link for your operating system, you can find 
the GDAL download page here:

https://trac.osgeo.org/gdal/wiki/DownloadingGdalBinaries
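
To check quickly from within R whether gdal-config is visible at all before 
trying to install rgdal (a minimal sketch of my own, not part of the 
original advice):

if (Sys.which("gdal-config") == "") {
	## an empty string means the shell used by R cannot find gdal-config,
	## so GDAL (or at least its development files) is probably missing
	stop("gdal-config not found; install GDAL and its development headers first")
}
system("gdal-config --version") # prints the GDAL version that rgdal would build against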

Hope this helps,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

On 2016.01.13. 10:48, Subhash Karemore wrote:

Hi,

I am trying to install the R package 'rgdal' on SUSE Linux Enterprise Server. 
However, it fails with a compilation error.

Errors:
--
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c OGR_write.cpp -o OGR_write.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [OGR_write.o] Error 127 (ignored)
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c gdal-bindings.cpp -o gdal-bindings.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [gdal-bindings.o] Error 127 (ignored)
gcc -std=gnu99 -I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"   -fpic  -fmessage-length=0 -O2 -Wall 
-D_FORTIFY_SOURCE=2 -fstack-protector -funwind-tables -fasynchronous-unwind-tables  -c 
init.c -o init.o
gcc -std=gnu99 -I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"   -fpic  -fmessage-length=0 -O2 -Wall 
-D_FORTIFY_SOURCE=2 -fstack-protector -funwind-tables -fasynchronous-unwind-tables  -c 
local_stubs.c -o local_stubs.o
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c ogr_geom.cpp -o ogr_geom.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [ogr_geom.o] Error 127 (ignored)
gcc -std=gnu99 -I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"   -fpic  -fmessage-length=0 -O2 -Wall 
-D_FORTIFY_SOURCE=2 -fstack-protector -funwind-tables -fasynchronous-unwind-tables  -c 
ogr_polygons.c -o ogr_polygons.o
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c ogr_proj.cpp -o ogr_proj.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [ogr_proj.o] Error 127 (ignored)
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c ogrdrivers.cpp -o ogrdrivers.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [ogrdrivers.o] Error 127 (ignored)
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c ogrsource.cpp -o ogrsource.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [ogrsource.o] Error 127 (ignored)
I/usr/lib64/R/include -DNDEBUG -I/usr/local/include -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"  -c projectit.cpp -o projectit.o
/bin/sh: I/usr/lib64/R/include: No such file or directory
make: [projectit.o] Error 127 (ignored)
-shared -L/usr/lib64/R/lib -L/usr/local/lib64 -o rgdal.so OGR_write.o 
gdal-bindings.o init.o local_stubs.o ogr_geom.o ogr_polygons.o ogr_proj.o 
ogrdrivers.o ogrsource.o projectit.o -L/usr/local/lib -lgdal -lproj 
-L/usr/lib64/R/lib -lR
/bin/sh: line 2: -shared: command not found
make: *** [rgdal.so] Error 127
ERROR: compilation failed for package 'rgdal'
* removing '/usr/lib64/R/library/rgdal'

The downloaded source packages are in
 '/tmp/Rtmpd0WifB/downloaded_packages'
Updating HTML index of packages in '.Library'
Making 'packages.html' ... done
Warning message:
In install.packages("rgdal", repos = "http://cran.r-project.org") :
   installation of package 'rgdal' had non-zero exit status

However it got installed successfully on openSUSE. Here is the output:
---
g++ -std=c++11 -I/usr/lib64/R/include -DNDEBUG -I/usr/include/gdal -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"   -fpic  -fmessage-length=0 
-grecord-gcc-switches -O2 -Wall -D_FORTIFY_SOURCE=2 -fstack-protector -funwind-tables 
-fasynchronous-unwind-tables -g -c OGR_write.cpp -o OGR_write.o
g++ -std=c++11 -I/usr/lib64/R/include -DNDEBUG -I/usr/include/gdal -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"   -fpic  -fmessage-length=0 
-grecord-gcc-switches -O2 -Wall -D_FORTIFY_SOURCE=2 -fstack-protector -funwind-tables 
-fasynchronous-unwind-tables -g -c gdal-bindings.cpp -o gdal-bindings.o
gcc -std=gnu99 -I/usr/lib64/R/include -DNDEBUG -I/usr/include/gdal -I/usr/local/include 
-I"/usr/lib64/R/library/sp/include"   -fpic  -fmessage-length=0 
-grecord-gcc-switches -O2 -Wall -D_FORTIFY_SOURCE=2 -fstack-protector -funwind-tables 
-fasynchronous-unwind-tables -g  -c init.c -o init.o
gcc -std=gnu99 

Re: [R-sig-Geo] dismo package- data.frame of fitted function and variable values

2016-01-13 Thread Bede-Fazekas Ákos

Dear Robert,
here is the code you need, based on the source code of the 
gbm.plot() function:


gbm.object <- angaus.tc5.lr01
gbm.call <- gbm.object$gbm.call
data <- gbm.call$dataframe
pred.names <- gbm.call$predictor.names
k <- match("SegSumT", pred.names) # index of the predictor of interest
pred.data <- data[, gbm.call$gbm.x[k]]
response.matrix <- gbm::plot.gbm(gbm.object, k, return.grid = TRUE)
predictors <- response.matrix[, 1]
if (is.factor(data[, gbm.call$gbm.x[k]])) {
	predictors <- factor(predictors, levels = levels(data[, gbm.call$gbm.x[k]]))
}
responses <- response.matrix[, 2] - mean(response.matrix[, 2]) # centred, as gbm.plot() does
data.frame(predictors, responses)
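
If you need the same data frame for every predictor, you could wrap the 
snippet above in a small helper and lapply() over the predictor names. This 
is a sketch of my own (the name get_partial_data is not part of dismo or gbm):

get_partial_data <- function(gbm.object, predictor) {
	gbm.call <- gbm.object$gbm.call
	data <- gbm.call$dataframe
	k <- match(predictor, gbm.call$predictor.names)
	response.matrix <- gbm::plot.gbm(gbm.object, k, return.grid = TRUE)
	predictors <- response.matrix[, 1]
	if (is.factor(data[, gbm.call$gbm.x[k]])) {
		predictors <- factor(predictors, levels = levels(data[, gbm.call$gbm.x[k]]))
	}
	responses <- response.matrix[, 2] - mean(response.matrix[, 2])
	data.frame(predictors, responses)
}
all_curves <- lapply(angaus.tc5.lr01$gbm.call$predictor.names,
	get_partial_data, gbm.object = angaus.tc5.lr01)
names(all_curves) <- angaus.tc5.lr01$gbm.call$predictor.names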

Kind regards,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

On 2016.01.13. 14:18, Rob Deg wrote:

Dear R-users,

I am using boosted regression trees to model the occurrence of
presence/absence data using environmental factors.
I am wondering how it is possible to print out the data.frame of fitted
functions and variable values that were used in producing the gbm.plot.
For example, values of the fitted functions (y-axis) and SegSumT (x-axis) from
the following plot provided in the dismo package example:

==
library(dismo)
data(Anguilla_train)
Anguilla_train = Anguilla_train[1:200,]
angaus.tc5.lr01 <- gbm.step(data=Anguilla_train, gbm.x = 3:14, gbm.y = 2,
family = "bernoulli", tree.complexity = 5, learning.rate = 0.01,
bag.fraction = 0.5)
gbm.plot(angaus.tc5.lr01)

=

Best,

Robert

[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] accuracy assessment of tematic maps in R

2015-12-31 Thread Bede-Fazekas Ákos

Dear Cristián,
there might be several packages that can compute a confusion matrix (a.k.a. 
contingency table):

??contingency

One that I can recommend is the 'ROCR' package, with its functions 
ROCR::prediction() and ROCR::performance().


library(ROCR) # needed for prediction() and performance()

## Creates a ROCR::prediction object:
ROCR_object <- prediction(predictions = predicted_values, labels = observed_values)

## Calculates performance measures:
AUC_value <- performance(prediction.obj = ROCR_object, measure = "auc")@y.values[[1]]
TPR_value <- performance(prediction.obj = ROCR_object, measure = "tpr")@y.values[[1]]

## Calculates performance graphs:
ROC_curve <- performance(prediction.obj = ROCR_object, measure = "tpr", x.measure = "fpr")
precision_recall <- performance(prediction.obj = ROCR_object, measure = "prec", x.measure = "rec")
sensitivity_specificity <- performance(prediction.obj = ROCR_object, measure = "sens", x.measure = "spec")

plot(ROC_curve)
plot(precision_recall)
plot(sensitivity_specificity)
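
For the confusion matrix itself (rather than the measures derived from it), 
base R's table() is often enough. A minimal sketch, assuming observed_classes 
and predicted_classes are your own vectors of class labels (the names are 
placeholders of mine, not from any package):

conf_mat <- table(observed = observed_classes, predicted = predicted_classes)
conf_mat
overall_accuracy <- sum(diag(conf_mat)) / sum(conf_mat) # proportion of correctly classified cases
overall_accuracy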

HTH,
Ákos Bede-Fazekas
Hungary

On 2015.12.31. 4:31, CRISTIAN ANDRES VERGARA FERNANDEZ wrote:

Dear all,

I have produced a number of land cover maps using support vector machines on
Landsat 8 imagery. I would like to automate an accuracy assessment procedure
based on a confusion matrix to evaluate the classification maps obtained. I
have not been able to easily find a package containing a confusion matrix
function. Could you recommend a way to do this using R?

Many thanks in advance



___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

[R-sig-Geo] gstat::krige() - regression kriging vs. kriging with external drift

2015-11-14 Thread Bede-Fazekas Ákos
Dear List, dear Edzer,
is it correct to use the term "regression kriging" when I run this
function call?
kriged_value <- gstat::krige(z ~ x + y, [...])@data$var1.pred

Or should I call it "kriging with external drift" (or "universal kriging" if
x and y are coordinates), and reserve the term "regression kriging" for the
case of running this instead?
linear_model <- lm(z ~ x + y, [...])
residuals <- linear_model$residuals
kriged_residuals <- gstat::krige(residuals ~ 1, [...])@data$var1.pred
kriged_value <- linear_model$fitted.values + kriged_residuals

Thank you in advance,
Ákos Bede-Fazekas
Budapest, Hungary

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: [R-sig-Geo] converting latlong in decimal degrees to utm

2015-07-18 Thread Bede-Fazekas Ákos
Dear Moses,
you first need to install the sp package, then load it, and after that 
convert your data.frame to SpatialPoints or SpatialPointsDataFrame by 
setting the coordinates.

if (!("sp" %in% row.names(installed.packages()))) { # if package sp is not yet installed
	suppressMessages(install.packages("sp", verbose = FALSE, quiet = TRUE,
		INSTALL_opts = c("--no-lock"), repos = "http://cran.us.r-project.org")) # installs it
} # if
library(sp) # loads it
coordinates(data) <- c("x", "y") # sets coordinates
if (class(data) == "SpatialPoints") {writeLines("Well done.")} # now you can set the projection
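
The remaining step, declaring the geographic CRS and reprojecting to UTM, is 
not in the code above, so treat the following as a hedged sketch of my own 
(it assumes the coordinate columns you assigned are ordered longitude, 
latitude, and that rgdal is installed, which spTransform() needs for sp 
objects):

library(rgdal) # provides the spTransform() method for sp objects
proj4string(data) <- CRS("+proj=longlat +datum=WGS84") # declare the current (geographic) CRS
data_utm <- spTransform(data, CRS("+proj=utm +zone=34 +south +ellps=WGS84 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"))
head(coordinates(data_utm)) # eastings/northings in metres (UTM zone 34S)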

Hope it helps,
Ákos

-Original Message-
From: R-sig-Geo [mailto:r-sig-geo-boun...@r-project.org] On Behalf Of moses 
selebatso
Sent: 2015. July 18. 17:27
To: R-sig-Geo@r-project.org
Subject: [R-sig-Geo] converting latlong in decimal degrees to utm

I am trying to convert my data (sample below) to UTM before I can use it for Kernel 
Density Estimates. Can someone help?
 data <- read.csv("clipboard", sep = "\t")
 head(data)
          x        y
1 -21.76580 23.20185
2 -21.76692 23.20126
3 -21.76690 23.20127
4 -21.76761 23.20338
5 -21.77223 23.20354
6 -21.77193 23.20327
 proj4string(data) <- CRS("+proj=utm +zone=34 +south +ellps=WGS84 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs")
Error in (function (classes, fdef, mtable)  : 
  unable to find an inherited method for function ‘proj4string<-’ for signature 
‘"data.frame", "CRS"’
Thank you, in advance. Moses SELEBATSO

(+267) 318 5219 (H) (+267) 716 393 70 (C)
  (+267) 738 393 70 (C
[[alternative HTML version deleted]]

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


[R-sig-Geo] regression kriging with singular variogram

2015-01-25 Thread Bede-Fazekas Ákos
Dear All,
I have a simple question about regression kriging (gstat:::krige()) and
variogram fitting (gstat:::fit.variogram()), for which I have not found an
exact answer (neither in the documentation of the gstat package, nor on the
internet). Does it cause any problem if I do regression kriging with a
fitted variogram (e.g. a spherical or exponential model) that is singular (i.e.
I got the message 'warning: singular model in variogram fit') but that shows
the expected curve in plot(experimental_variogram, fitted_variogram)
(i.e. it fits the points nicely and has a shape like the ones drawn in the books)?

A bonus question: is the message 'warning: singular model in variogram fit'
a warning or a simple message? I tried both
fitted_variogram <- suppressWarnings(fit.variogram(...))
and
fitted_variogram <- suppressMessages(fit.variogram(...))
but neither of them hides the message. Please let me know how to hide it.
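
(A hedged aside of my own, not part of the original question: rather than
hiding the text, you could test the fitted model directly; if I remember the
gstat documentation correctly, fit.variogram() flags a singular fit in a
'singular' attribute of the returned model, but please verify this on your
gstat version.)

fitted_variogram <- fit.variogram(experimental_variogram, vgm("Sph"))
## attribute name taken from gstat's documented behaviour for singular fits;
## check attributes(fitted_variogram) on your version
if (isTRUE(attr(fitted_variogram, "singular"))) {
	warning("variogram fit was singular; inspect it before kriging")
}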

Thanks in advance,
Ákos Bede-Fazekas
Corvinus University of Budapest
Hungary

___
R-sig-Geo mailing list
R-sig-Geo@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-geo