Abhishek Padhi | 7 Feb 19:07 2016

spdep not available

Hello Sir,

The spdep package is not available for R version 3.2.2.

Thanks

Regards
Abhishek Padhi
B.E. (Hons.) Civil Engineering
Birla Institute of Technology & Science, Pilani

Francesco Perugini | 5 Feb 10:03 2016

globalG.test - routine with distance based neighbours

Dear all,
I have a routine that plots the Global G statistic at different k-neighbours.
I tried to run it using the distance-based neighbours method, but I got this
message:

Error in dnb[[val]] : subscript out of bounds

Any suggestion on this?

Thanks a lot.

franc.per

_______________

this is the routine

library(spdep)
example(columbus)
coord <- coordinates(columbus)

#k-nearest neighbours
#z <- c(1:20)
#neighbors.knn <- list()

#Distance-based neighbours
k1 <- knn2nb(knearneigh(coord,k=1,longlat=F))
all.linkedT <- max(unlist(nbdists(k1, coord,longlat=F)))
all.linkedT

(Continue reading)
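A hedged guess at the cause, since the routine is truncated above: "subscript
out of bounds" usually means dnb[[val]] is being indexed by the distance value
itself rather than by a list position. A minimal sketch, assuming dnb is a
list of distance-based neighbour objects built with dnearneigh():

# build neighbour lists over increasing distance bands (assumed continuation)
dists <- seq(all.linkedT, 4 * all.linkedT, length.out = 10)
dnb <- lapply(dists, function(d) dnearneigh(coord, 0, d, longlat = FALSE))

# index by position, not by the distance value itself
for (val in seq_along(dnb)) {
  print(dnb[[val]])
}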

Antonio Silva | 4 Feb 17:49 2016

converting shapes to raster

Dear list members

In order to run a niche overlap analysis with nicheOverlap (dismo package)
I need to convert some shapefiles to raster. I dropped a shapefile as
example at https://www.dropbox.com/s/g6o0vfdx7lr34ma/Species1Shape.rar?dl=0

Marine fish abundance data are aggregated in quadrants of 10 x 10 nautical
miles. Because of the coastline, some quadrants are cut, and there is also a
blank (land) area.

I tried many things like:

library(maptools)  # readShapePoly()
library(raster)    # raster(), rasterize()

shp.frtcost <- readShapePoly("BL10_FRT_COSTEIRA.shp")

plot(shp.frtcost, axes=TRUE, xlim=c(-48,-45), border="grey")

text(coordinates(shp.frtcost), labels=as.character(shp.frtcost@data$DENS),
     cex=0.7)

r <- raster(extent(shp.frtcost), nrows=10, ncols=15)

# or raster(extent(shp.frtcost), nrows=10, ncols=15, vals=shp.frtcost@data$DENS)

r <- rasterize(shp.frtcost, r, fun="first")

plot(shp.frtcost, axes=TRUE, xlim=c(-48,-45), border="grey")

plot(r, add=TRUE)

(Continue reading)
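A hedged alternative sketch, assuming DENS is the abundance attribute: passing
field = "DENS" to rasterize() transfers the attribute itself to the cell
values, and cells whose centre falls on land (covered by no polygon) stay NA.

library(raster)

r <- raster(extent(shp.frtcost), nrows = 10, ncols = 15)
# cell values become the DENS attribute of the covering polygon
r.dens <- rasterize(shp.frtcost, r, field = "DENS")
plot(r.dens)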


Floods' Prediction Approach

Hello everybody. I'm not sure whether this post counts as off-topic; sorry if
so. I would like to exchange some ideas with you all regarding a problem and,
hopefully, someone will be able to help.

I have a series of 13 flood maps derived from remote sensing. Each map
represents the flooded area at a specific water level (binary values,
1 = flooded / 0 = not flooded). I would like to "predict" the flooded areas
at intermediate water levels for which we have no maps, thus simulating the
evolution of the flood inside the floodplain.

I used a pixel-by-pixel logistic regression approach, with the flood status
(0 or 1) in each flood map as the dependent variable and the corresponding
water stage height as the independent variable. It predicted quite well,
considering it's a very simple approach, giving the probability of a given
pixel being flooded at any water stage value from the drought to the maximum
seasonal water level. But I'm still getting significant errors relative to my
field observations (sensors monitoring the floods, scattered across my study
area). I'm calculating the number of days per year that some pixels remain
flooded (as I know the water level on each day of a given year), and I'm
getting up to 120 days of difference between the predicted and observed days
of flooding for a given pixel.

As I said before, it's quite a simple approach, and I realize that more
sophisticated techniques could perform better. I also have some other
variables to include, like a digital terrain model and rasters of daily
precipitation.

Does anyone have an idea, or a different approach (e.g. co-kriging, neural
networks)? As I'm dealing with a categorical response variable, I'm not sure
yet what is possible and what isn't. I'll continue my reading here and
(Continue reading)
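A minimal sketch of the pixel-wise logistic regression described above,
assuming a RasterStack 'floods' holding the 13 binary maps and a numeric
vector 'stage' of the matching water levels (both names are hypothetical):

library(raster)

# fit status ~ stage for one pixel; return P(flooded) at an intermediate stage
fit.pixel <- function(y, stage, new.stage) {
  if (anyNA(y) || length(unique(y)) < 2) return(NA_real_)
  m <- glm(y ~ stage, family = binomial)
  predict(m, newdata = data.frame(stage = new.stage), type = "response")
}

# probability surface at an intermediate water level, e.g. 3.5 m
pred <- calc(floods, fun = function(y) fit.pixel(y, stage, new.stage = 3.5))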


Ordinary kriging variance of the prediction with error structure

Dear r-sig-geo community,

I would like to bring a conceptual question on the implementation of
ordinary kriging in gstat.

I'm trying to account for measurement error in an OK scheme. I assume that my
sampled vector Y(x_i) is a noisy realisation of S(x_i) (the true variable),
thus:
Y(x_i) = S(x_i) + e_i, where e_i ~ N(0, tau^2).

If that error is assumed to satisfy a certain set of conditions (unbiased,
uncorrelated with itself and with the variable, and tau constant), this is
analogous to solving the kriging system with a nugget effect in the
covariance structure.

I coded the kriging system and its solution. To assess whether my
implementation is correct, I compared it with the krige function in gstat.
The predicted value at each point is the same, meaning that I got the weights
of the system right.

However, I'm really confused about the variance of the prediction, which
should have this form:

Var(S(x_o) | Y) = Var(S(x_o)) - w' * C_oi - mu

w'     vector of weights
C_oi   vector of covariances between the prediction and sampling locations
mu     Lagrange multiplier

If my objective were to predict the signal of the variable S(x_o), the
(Continue reading)
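A hedged sketch of how the two interpretations can be separated in gstat
(using the meuse example data, not the poster's own): vgm() accepts both a
nugget and an Err component; with Err, tau^2 is treated as measurement error
on Y rather than micro-scale variation of S, so the weights and predictions
stay the same while the reported variance should drop by tau^2 (assumed
behaviour).

library(sp)
library(gstat)

data(meuse);      coordinates(meuse) <- ~x+y
data(meuse.grid); gridded(meuse.grid) <- ~x+y

v.nug <- vgm(psill = 0.5, model = "Sph", range = 900, nugget = 0.1)
v.err <- vgm(psill = 0.5, model = "Sph", range = 900, Err = 0.1)

ok.nug <- krige(log(zinc) ~ 1, meuse, meuse.grid, model = v.nug)
ok.err <- krige(log(zinc) ~ 1, meuse, meuse.grid, model = v.err)

# same predictions; variances should differ by tau^2 = 0.1
summary(ok.nug$var1.var - ok.err$var1.var)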


raster pkg: overwriting the dimensions detected by netcdf4 (PERSIANN-CDR)

Dear spatial community,

While reading some netCDF files with the raster package, I got a "rotated"
file, where the columns were read as rows and vice versa:

------------- START ----------------
x <- raster("PERSIANN-CDR_v01r01_20090101_c20140523.nc")
Loading required namespace: ncdf4
plot(x)
x
class       : RasterLayer
dimensions  : 1440, 480, 691200  (nrow, ncol, ncell)
resolution  : 0.25, 0.25  (x, y)
extent      : -60, 60, 0, 360  (xmin, xmax, ymin, ymax)
coord. ref. : NA
data source : /home/hzambran/PERSIANN-CDR_v01r01_20090101_c20140523.nc
names       : NOAA.Climate.Data.Record.of.PERSIANN.CDR.daily.precipitation
z-value     : 2009-01-01
zvar        : precipitation
------------- END ----------------

The previous summary shows that the spatial extent of the file is
60W-60E and 0-360N, while the correct extent should be: 60S, 60N, 0E,
360E.

The file can be downloaded from:
ftp://data.ncdc.noaa.gov/cdr/persiann/files/2009/PERSIANN-CDR_v01r01_20090101_c20140523.nc

After contacting the providers of the file, they mentioned that when
reading the file I have to provide the dimension information, i.e.,
(Continue reading)
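Since the message is truncated, a hedged sketch of one workaround: read the
variable with ncdf4, transpose it, and set the extent by hand. Whether the
latitude axis also needs flipping is an assumption that depends on the
ordering in the file.

library(ncdf4)
library(raster)

nc <- nc_open("PERSIANN-CDR_v01r01_20090101_c20140523.nc")
p  <- ncvar_get(nc, "precipitation")   # 1440 x 480 (lon x lat)
nc_close(nc)

m <- t(p)                              # rows become latitudes
r <- raster(m[nrow(m):1, ],            # flip so row 1 is the northern edge
            xmn = 0, xmx = 360, ymn = -60, ymx = 60,
            crs = "+proj=longlat +datum=WGS84")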

Francesco Perugini | 3 Feb 15:08 2016

globalG.test - plotting from routine

Dear all,
I have a routine (see below) that computes the Global G statistic for
different k-neighbour spatial weight matrices and for different variables.
Now I want to plot, for each variable, the difference between the Global G
statistic computed at k neighbours and the one computed at k-1 on the y-axis,
against k on the x-axis.
I guess I need to extract the Global G statistics from the ZG lists and then
plot them. This is my attempt:

library(rlist)
list(ZG)
list.extract(ZG,'var$estimate[1]')

plot(c(var$estimate[1], border="grey60",xlab="k",ylab="Global G statistic",
     xlim=NULL, axes=TRUE)
title(main="Global G at different k")

Any suggestion on this?
Thanks a lot.
franc.per

_____________
this is the routine:

library(spdep)
example(columbus)
coord <- coordinates(columbus)

z <- c(1,2,3,4,5,6,7,8,9)
neighbors.knn <- list()
(Continue reading)
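A hedged sketch of one way to do the extraction, with names taken from the
routine above (rlist is not needed): pull the observed G out of each nested
list with sapply, then plot the successive differences against k.

# Global G for the first variable at each k (estimate[1] is the observed G)
G <- sapply(ZG, function(res) unname(res[[1]]$estimate[1]))
k <- as.numeric(names(ZG))

plot(k[-1], diff(G), type = "b", col = "grey60",
     xlab = "k", ylab = "difference in Global G")
title(main = "Global G at different k")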

Uzzal | 3 Feb 11:37 2016

Is there any way to work with non-normal data in an ordinary kriging analysis, without any data transformation, using the gstat package?

I have a dataset containing hourly PM10 concentrations for 90 days at 83 sites. My data are not normally distributed, but I don't want to log-transform (or apply any other transformation to) the data for variogram plotting or for the ordinary kriging analysis.

 

Is there any way to deal with non-normal data in a kriging analysis in gstat, instead of a log or other transformation? If I use 'cressie = TRUE' in the 'variogram' function in gstat when working with non-normal data, is that equivalent to working with normally distributed data?

 

code:

library(gstat)
seoul050506_var <- variogram(PM10 ~ 1, data = seoul050506, width = 2.6,
                             cutoff = 25, cressie = TRUE)
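A hedged note on the question above: cressie = TRUE switches to the robust
Cressie-Hawkins estimator of the empirical variogram, which downweights
outliers; it does not make the analysis equivalent to one on normally
distributed data. Ordinary kriging itself does not require normality for the
point predictions (it is the best linear unbiased predictor either way), only
for prediction intervals. A comparison sketch:

library(gstat)

v.classical <- variogram(PM10 ~ 1, data = seoul050506, width = 2.6, cutoff = 25)
v.robust    <- variogram(PM10 ~ 1, data = seoul050506, width = 2.6, cutoff = 25,
                         cressie = TRUE)
plot(v.classical)
plot(v.robust)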

Orpheus

Francesco Perugini | 2 Feb 16:15 2016

globalG.test - x vector with different length than the neighbours list

Dear all,
I have a routine that creates spatial weight matrices for different k and
then, for each of the created spatial weight matrices, runs the Global G test
for autocorrelation.
The problem is that when I try to run the test on multiple variables, the
lengths are different:

> length(dlwknn.B$neighbours)
[1] 49
> length(var)
[1] 1

Any help on how to code this?
Thanks a lot.
franc.per90

______________________________
This is the routine:

library(spdep)
example(columbus)
#data(columbus)
#attach(columbus)
coord <- coordinates(columbus)

z <- c(1,2,3,4,5,6,7,8,9)
neighbors.knn <- list()

for (val in z) {
  neighbors.knn <- c(neighbors.knn,
                     list(knn2nb(knearneigh(coord, val, longlat = F), sym = F)))
}

class(neighbors.knn)

class(neighbors.knn[[1]])
plot(neighbors.knn[[1]], coord)

class(neighbors.knn[[2]])
plot(neighbors.knn[[2]], coord)
....

ndists <- length(z)
ZG <- vector(mode="list", length=ndists)
names(ZG) <- as.character(z)

f <- c(CRIME, INC)
for (val in z) {
  dlwknn.B <- nb2listw(neighbors.knn[[val]], style="B", zero.policy=TRUE)
  temp <- list()

  for (var in f)
  {
    temp <- c(temp, list(globalG.test(var, dlwknn.B, zero.policy=F)))
  }
  ZG[[val]] <- temp
}

t(sapply(ZG, function(var) c(var$estimate[1], var$statistic,
p.value=unname(var$p.value))))
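A hedged diagnosis: c(CRIME, INC) concatenates the two numeric vectors into
one vector of length 98, so the loop 'for (var in f)' steps over single
numbers, which is why length(var) is 1 while the neighbours list has length
49. Keeping the variables in a list preserves them whole:

f <- list(CRIME = CRIME, INC = INC)   # keep each variable intact
for (val in z) {
  dlwknn.B <- nb2listw(neighbors.knn[[val]], style = "B", zero.policy = TRUE)
  temp <- list()
  for (var in f) {
    temp <- c(temp, list(globalG.test(var, dlwknn.B, zero.policy = FALSE)))
  }
  ZG[[val]] <- temp
}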

Clément Gorin | 1 Feb 10:36 2016

SAR Poisson GLM model

Hi,
I am estimating a gravity model of migration on cross-sectional data. The Moran I statistic indicates
positive and significant spatial autocorrelation in the residuals of the a-spatial model, and the
Lagrange Multiplier test points to the Spatial Autoregressive (SAR) model as the preferred
specification. While I have no issue fitting a linear SAR (Le Sage and Pace 2008) to my data, it does not
accommodate the very large number of zeroes (> 90%) in my dependent variable. This clearly points to a
Poisson process (Santos Silva and Tenreyro 2006).

In short, I am having trouble running a SAR Poisson GLM. I have two questions:

(1) Is there a method to run a SAR Poisson GLM in R? (I searched a lot before posting here)

(2) If the answer to (1) is no, I should at least use a spatially filtered Poisson GLM. Yet both
SpatialFiltering() and ME() crash even with a very simple connectivity structure (symmetric knn = 5).
They did not give any error message; RStudio simply "lost the connection with the R session". I suspect
this is due to the large number of observations (278,784). Do you have tips to increase computational
efficiency?

Best,
Clément Gorin
PhD student, GATE LSE
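
On question (2), a hedged sketch of Moran eigenvector filtering with a
Poisson GLM via spdep::ME, run on a subsample first to check the code path
before scaling up; all object and column names here (dat, flows, dist,
pop_o, pop_d, x, y) are hypothetical.

library(spdep)

sub <- dat[sample(nrow(dat), 5000), ]            # trial subsample
lw  <- nb2listw(knn2nb(knearneigh(cbind(sub$x, sub$y), k = 5), sym = TRUE),
                style = "W")
me  <- ME(flows ~ dist + pop_o + pop_d, data = sub, family = poisson,
          listw = lw, alpha = 0.05, nsim = 49)
fit <- glm(flows ~ dist + pop_o + pop_d + fitted(me), data = sub,
           family = poisson)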
Tyler Frazier | 31 Jan 20:01 2016

Tearing of polygons using ggmap & readOGR

I'm trying to incorporate a simple shapefile of US states with ggmap and keep getting "tearing" of my
polygons. I tried changing group=id to group=group, which seemed to partially solve the problem, but not
entirely.

library(rgdal)
library(ggplot2)  # fortify(), geom_polygon()
library(ggmap)    # get_map(), ggmap()

states <- readOGR(dsn="shapefiles", layer="states")

proj4string(states)

[1] "+proj=longlat +datum=NAD83 +no_defs +ellps=GRS80 +towgs84=0,0,0"

states <- spTransform(states, CRS("+proj=longlat +datum=WGS84"))

states <- fortify(states)

sstates <- get_map(location = c(-81, 35, -69, 45), zoom = 4, maptype = "watercolor")

sstates <- ggmap(sstates)

sstates <- sstates + geom_polygon(aes(x = long, y = lat, group=group), data = states, color ="white", fill
="orangered4", alpha = .4, size = .2)

Here is an image of the output.

http://pasteboard.co/1d7kiRfw.png

