Re: [R-sig-Geo] Generate the local coefficients in spgwr

2018-05-04 Thread Dexter Locke
The R script at this DOI might be helpful:

https://doi.org/10.5061/dryad.3vh79

-Dexter 

> On May 4, 2018, at 8:09 PM, Danlin Yu  wrote:
> 
> Alysson:
> 
> I believe if you calibrate your GWR model using the gwr() routine, the 
> resulting gwr object should have all the elements you need in its SDF 
> component, as detailed in the package's manual:
> 
> The SDF is a SpatialPointsDataFrame (may be gridded) or 
> SpatialPolygonsDataFrame object (see package "sp") with fit.points, weights, 
> GWR coefficient estimates, R-squared, and coefficient standard errors in its 
> "data" slot.
> 
> Once you extract this information, you can write it out with generic 
> functions such as write.table to txt or csv files.
> 
> Hope this helps.
> 
> Cheers,
> 
> Dr. Danlin Yu
> 
> 
>> On 5/4/2018 5:17 PM, Alysson Luiz Stege wrote:
>> Hello, my name is Alysson.
>> 
>> I am estimating a model with geographically weighted regression, using the
>> spgwr package. I would like to know how to generate the local coefficients,
>> the t statistics (or significance levels) and the standard errors, and then
>> export them to a txt file, for example.
>> 
>> Thanks
>> 
>> Alysson
>> 
> 
> -- 
> ___
> Danlin Yu, Ph.D.
> Professor of GIS and Urban Geography
> Department of Earth & Environmental Studies
> Montclair State University
> Montclair, NJ, 07043
> Tel: 973-655-4313
> Fax: 973-655-4072
> Office: CELS 314
> Email: y...@mail.montclair.edu
> webpage: csam.montclair.edu/~yu

[R-sig-Geo] Spatial Panel Models Problem (Splm package)

2018-05-04 Thread felipe tavares
Good evening.

I am trying to estimate a spatial panel data model with the splm package.

I am facing the error:

Error in lag.listw(listw, u, zero.policy = zero.policy) :
  object lengths differ

However, my W matrix is NxN, my y vector is NTx1 and my X matrix is NTxK.

My code is:

library(rgdal)   # readOGR
library(spdep)   # poly2nb, nb2listw
library(splm)    # spml

poly <- readOGR(dsn = "C:/Users/fstavares/OneDrive/Resource Policy Paper",
                layer = "RJ")
RJ <- poly2nb(poly)
W <- nb2listw(RJ, zero.policy = TRUE, style = "W", glist = NULL)
SEM <- spml(gdp ~ oivrev, listw = W, model = "within", spatial.error = "b",
            lag = FALSE, data = panel)
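
For reference, a quick dimension check along these lines (a rough sketch; the
id/time column names "id" and "year" are placeholders, not the real column
names) shows whether the N in W matches the NT rows in the panel:

library(spdep)
# number of spatial units encoded in the weights list
n_regions <- length(W$neighbours)
# assumed panel identifiers: "id" for the spatial unit, "year" for the period
n_units   <- length(unique(panel$id))
n_periods <- length(unique(panel$year))
c(regions_in_W = n_regions, units_in_panel = n_units,
  rows_in_panel = nrow(panel), expected_rows = n_units * n_periods)
# for a balanced panel, every unit should appear in every period
table(table(panel$id))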


Has anyone faced this problem?

I can send the database and code if somebody can help me.



-- 
Best regards,

Felipe Tavares
B.Sc. in Economics - UFSCar
M.Sc. candidate in Applied Economics - ESALQ/USP
Pricing Analyst - ALLIED Technology

Phone:
(011) 97468-0833

e-mail:
ftavare...@gmail.com
f stava...@alliedbrasil.com.br


Re: [R-sig-Geo] Generate the local coefficients in spgwr

2018-05-04 Thread Danlin Yu

Alysson:

I believe if you calibrate your GWR model using the gwr() routine, the 
resulting gwr object should have all the elements you need in its SDF 
component, as detailed in the package's manual:


The SDF is a SpatialPointsDataFrame (may be gridded) or 
SpatialPolygonsDataFrame object (see package "sp") with fit.points, 
weights, GWR coefficient estimates, R-squared, and coefficient standard 
errors in its "data" slot.


Once you extract this information, you can write it out with generic 
functions such as write.table to txt or csv files.
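
A minimal sketch of that workflow (the object name gwr_fit, the covariate name 
x1 and the output file name are placeholders; the *_se columns are only 
returned when gwr() is called with se.fit = TRUE):

library(spgwr)
# gwr_fit is assumed to be the object returned by gwr()
res <- as.data.frame(gwr_fit$SDF)   # local coefficients, localR2, standard errors, ...
# a local t value is the local estimate divided by its local standard error
res$x1_t <- res$x1 / res$x1_se
write.table(res, file = "gwr_local_results.txt", sep = "\t", row.names = FALSE)
# or: write.csv(res, "gwr_local_results.csv", row.names = FALSE)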


Hope this helps.

Cheers,

Dr. Danlin Yu


On 5/4/2018 5:17 PM, Alysson Luiz Stege wrote:

Hello, my name is Alysson.

I am estimating a model with geographically weighted regression, using the
spgwr package. I would like to know how to generate the local coefficients,
the t statistics (or significance levels) and the standard errors, and then
export them to a txt file, for example.

Thanks

Alysson


--
___
Danlin Yu, Ph.D.
Professor of GIS and Urban Geography
Department of Earth & Environmental Studies
Montclair State University
Montclair, NJ, 07043
Tel: 973-655-4313
Fax: 973-655-4072
Office: CELS 314
Email: y...@mail.montclair.edu
webpage: csam.montclair.edu/~yu


[R-sig-Geo] Generate the local coefficients in spgwr

2018-05-04 Thread Alysson Luiz Stege
Hello, my name is Alysson.

I am estimating a model with geographically weighted regression, using the
spgwr package. I would like to know how to generate the local coefficients,
the t statistics (or significance levels) and the standard errors, and then
export them to a txt file, for example.

Thanks

Alysson


[R-sig-Geo] stConstruct returning error message "undefined columns selected"

2018-05-04 Thread Laura Cabral
Hello,

I have been following the R documentation for stConstruct {spacetime} to make 
my own spacetime object from a long table (Data_Sept5). The space ID is stored 
in the Location column (character) and the time is stored in Datetime as 
POSIXct. The variable of interest is Prop_Strava, and since only some 
space-time combinations are available, this should generate an STIDF object.

> head(Data_Sept5)
                        Location            Datetime Count_Mio Edge_ID Count_Strava Prop_Strava
1 100 Avenue West of 106 Street 2017-09-05 00:00:00          2   54789           NA          NA
2 100 Avenue West of 106 Street 2017-09-05 01:00:00          2   54789           NA          NA
3 100 Avenue West of 106 Street 2017-09-05 02:00:00          0   54789           NA          NA
4 100 Avenue West of 106 Street 2017-09-05 03:00:00          1   54789           NA          NA
5 100 Avenue West of 106 Street 2017-09-05 04:00:00          0   54789           NA          NA
6 100 Avenue West of 106 Street 2017-09-05 05:00:00          1   54789           NA          NA

The spatial object is a SpatialPointsDataFrame, called Counters, loaded from a 
shapefile with readOGR. Counters contains 13 observations of one variable 
(Name), which corresponds to the location names in Data_Sept5 shown above.

I used this line to generate the spacetime object; however, it produces the 
error shown below:

ST_Sept5 <- stConstruct(Data_Sept5, "Location", "Datetime", Counters,
                        interval = TRUE)

Error in `[.data.frame`(x@data, i, j, ..., drop = FALSE) : 
  undefined columns selected

Any idea what might be the cause? As far as I can tell, all my data are 
equivalent to the example given in the stConstruct R documentation. I also 
tried giving the column index instead of the name, but received the same error.
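
Two quick checks along these lines might narrow it down (a rough sketch using 
only the object names above; it assumes the link between the table and the 
points is meant to be Location against Counters$Name):

# any location IDs in the long table that have no matching point?
setdiff(unique(Data_Sept5$Location), Counters$Name)
# and is the variable of interest present under exactly that name?
"Prop_Strava" %in% names(Data_Sept5)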

Thank you for your help,

Laura Cabral

 

MSc Candidate in Transportation Engineering

Dept of Civil and Environmental Engineering

University of Alberta

lcab...@ualberta.ca 
819 993-1901


The University of Alberta is located in ᐊᒥᐢᑿᒌᐚᐢᑲᐦᐃᑲᐣ (Amiskwacîwâskahikan) on 
Treaty 6 territory, homeland of the Papaschase and the Métis Nation



Re: [R-sig-Geo] Issues with a GDB file in R?

2018-05-04 Thread Roger Bivand

On Thu, 3 May 2018, Roger Bivand wrote:


On Thu, 3 May 2018, Aguirre Perez, Roman wrote:


 Hi again,

 First of all, thanks a lot for your comments. It's becoming quite
 interesting to work out how to perform this task with such an amount of data.


 Here is an update...

 Cotton, I could read the geodatabase using the commands I shared.
 However, my R session crashed after overlaying it with a small set of
 polygons. I guess it was because of the size of the sp object (around 14.5
 GB). It's also worth mentioning that it was just a "multipolygon" (is that
 the correct word?) formed by 2370829 polygons. I haven't succeeded in
 installing the sf package, but I will keep trying.

 Melanie, I already downloaded and read the geoTiff version. At first
 sight, it seems that this object doesn't have enough features to
 perform the overlay. I might have a biased idea of how raster objects
 work - I'm too stuck on the representation of shapefiles, which sounds
 quite related to Roger's idea. So I will start exploring it in order to
 gain a bit more understanding of it.

 Roger, I certainly need to know which category is associated with each CLC
 polygon in order to compute the area covered by each of these classes
 within each NUTS3 region. Therefore, I still need to use a vector-vector
 overlay, right?


On the contrary. The geoTiffs are coded as described in the documentation. 
Your output is simply the count of raster cells by NUTS3 for each of the 
categories. The geoTiff is in ETRS_1989_LAEA, so is projected; it includes a 
lot of sea and no data because of the French overseas territories.


You could possibly use PostGIS to do the intersections, or use GRASS for 
raster-vector counts (say through rgrass7). In R, you would want to add a 
NUTS3 ID band to the land cover raster, then aggregate by NUTS3 ID.
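
A rough sketch of that pure-R route with the raster package (it assumes the 
geoTiff file name used in the code further down and a NUTS3 polygons object 
nuts_laea in the same LAEA projection with an FID column; memory is likely to 
be the limiting factor at the full resolution):

library(raster)
corine <- raster("g250_clc12_V18_5.tif")
nuts_r <- rasterize(nuts_laea, corine, field = "FID")  # the NUTS3 ID band
tab <- crosstab(nuts_r, corine)    # cell counts by NUTS3 x land cover class
areas <- tab * prod(res(corine))   # convert cell counts to areas (map units squared)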


I would suggest using GRASS as the most obvious route: reading the raster, 
reading the NUTS3 boundaries into a separate location, projecting to LAEA, 
then rasterising the NUTS3 regions (v.to.rast) and running r.cross. You get 
32K output categories, the 48 Corine categories times the count of NUTS3 
regions minus the nulls (there aren't many glaciers in most regions). You'll 
then need to match the Corine codes and the NUTS3 codes back - see the 
category labels shown by r.category.


I'll try to provide code tomorrow.


Combining R and GRASS seems to work, but no guarantees.

library(sp)
library(rgdal)
Gi <- GDALinfo("g250_clc12_V18_5.tif")
makeSG <- function(x) {
  stopifnot(class(x) == "GDALobj")
  p4 <- attr(x, "projection")
  gt <- GridTopology(c(x[4]+(x[6]/2), x[5]+(x[7]/2)), c(x[6], x[7]),
c(x[2], x[1]))
  SpatialGrid(gt, CRS(p4))
}
SG <- makeSG(Gi)
nuts_ll <- readOGR("NUTS_RG_01M_2013_4326_LEVL_3.shp")
nuts_laea <- spTransform(nuts_ll, CRS(attr(Gi, "projection")))
library(rgrass7)
td <- tempdir()
iG <- initGRASS("/home/rsb/topics/grass/g740/grass-7.4.0", td, SG)
# your GRASS will be where you installed it
writeVECT(nuts_laea, "nuts", v.in.ogr_flags="o")
execGRASS("r.in.gdal", input="g250_clc12_V18_5.tif", output="corine",
  flags="o")
execGRASS("v.to.rast", input="nuts", output="nuts3", use="cat",
  label_column="FID")
execGRASS("r.cross", input="nuts3,corine", output="cross_nuts3")
r_stats0 <- execGRASS("r.stats", input="cross_nuts3", flags="a",
  intern=TRUE)
r_stats1 <- gsub("\\*", "NA", r_stats0)
con_stats <- textConnection(r_stats1)
stats <- read.table(con_stats, header=FALSE, col.names=c("cross_cat",
  "area"), colClasses=c("integer", "numeric"))
close(con_stats)
r_cats0 <- execGRASS("r.category", map="cross_nuts3", intern=TRUE)
r_cats1 <- gsub(";", "", r_cats0)
r_cats2 <- gsub("\t", " ", r_cats1)
r_cats3 <- gsub("no data", "no_data", r_cats2)
r_cats4 <- gsub("category ", "", r_cats3)
r_cats4[1] <- paste0(r_cats4[1], "NA NA")
r_cats_split <- strsplit(r_cats4, " ")
cats <- data.frame(cross_cat=as.integer(sapply(r_cats_split, "[", 1)),
  nuts=sapply(r_cats_split, "[", 2),
  corine=as.integer(sapply(r_cats_split, "[", 3)))
catstats <- merge(cats, stats, by="cross_cat", all=TRUE)
agg_areas <- tapply(catstats$area, list(catstats$nuts, catstats$corine),
  sum)
library(foreign)
corine_labels <- read.dbf("g250_clc12_V18_5.tif.vat.dbf", as.is=TRUE)
o <- match(colnames(agg_areas), as.character(corine_labels$Value))
colnames(agg_areas) <- corine_labels$LABEL3[o]
agg_areas_df <- as.data.frame(agg_areas)
agg_areas_df1 <- agg_areas_df[-which(!(row.names(agg_areas_df) %in%
  as.character(nuts_ll$FID))),] # dropping "NA"  "no_data"

This should be ready to merge with the NUTS3 boundaries, if needed.

agg_areas_df1$FID <- row.names(agg_areas_df1)
nuts_corine <- merge(nuts_laea, agg_areas_df1, by="FID")

For the vector parts you could use sf and the provisional rgrass7sf on 
github, but that wouldn't yet let you construct a skeleton SpatialGrid to 
define the GRASS location. Using GRASS for the heavy lifting (the raster 
is 51000 by 35000), and avoiding vector for overlay, this doesn't need 
much 

[R-sig-Geo] Spatial correlation in large data sets

2018-05-04 Thread Surya Kr
Dear Fellow Members,

I have a data set consisting of 1000+ variables, each with spatio-temporal
data. I want to assess and filter the variables that show evidence of spatial
patterns, so that I can model the spatial behavior accordingly.

For univariate spatial data, one could plot an empirical variogram and
visually assess the presence of spatial pattern in the data. I don't have
that luxury with 1000+ variables. What would you suggest for doing this in an
automated fashion? I'm thinking of calculating a spatial correlation measure
such as Moran's I for each variable at each time point and assessing the
results by summarizing them appropriately. What are my alternatives?
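
A minimal sketch of that Moran's I idea with spdep (the objects coords, holding 
the point coordinates, and dat, a data frame with the variables at one time 
point, as well as the choice of k = 5 nearest neighbours, are assumptions):

library(spdep)
# k-nearest-neighbour weights; k = 5 is an arbitrary placeholder
nb <- knn2nb(knearneigh(coords, k = 5))
lw <- nb2listw(nb, style = "W")
# Moran's I and its p-value for every variable at this time point
moran_res <- sapply(dat, function(v) {
  mt <- moran.test(v, lw, na.action = na.omit)
  c(I = unname(mt$estimate["Moran I statistic"]), p = mt$p.value)
})
# variables showing evidence of spatial pattern at this time point
names(which(moran_res["p", ] < 0.05))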

Appreciate your time.
- Surya
